Log Summary

XPK Start: Sat Apr 25 12:47:10 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
`rope_parameters`'s factor field must be a float >= 1, got 40
`rope_parameters`'s beta_fast field must be a float, got 32
`rope_parameters`'s beta_slow field must be a float, got 1
DeepseekV32Config got `key=rope_scaling` in kwargs but hasn't set it as attribute. For RoPE standardization you need to set `self.rope_parameters` in model's config. 
2026-04-25 12:47:35.472535: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0425 12:47:35.680823 137845388715840 max_utils.py:273] Attempting to initialize the jax distributed system...
I0425 12:47:44.722211 137845388715840 distributed.py:149] Starting JAX distributed service on [::]:8482
I0425 12:47:44.724813 137845388715840 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-k8tlu-slice-job-0-0.mt-13-scan-layers-false-k8tlu:8482
I0425 12:47:47.540630 137845388715840 max_utils.py:284] Jax distributed system initialized!
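The three lines above record JAX's multi-host bring-up: a coordinator service on port 8482, this worker connecting to it, and success. A minimal sketch of the underlying call, assuming the max_utils wrapper delegates to jax.distributed.initialize; the process count is inferred from the memstats section below, which lists four local devices for process 5 out of 32 total:

    # Sketch only: the coordinator address and counts are read off this log,
    # not taken from the run's actual launch config.
    import jax

    jax.distributed.initialize(
        coordinator_address="mt-13-scan-layers-false-k8tlu-slice-job-0-0.mt-13-scan-layers-false-k8tlu:8482",
        num_processes=8,  # assumption: 32 devices / 4 local devices per host
        process_id=5,     # this log is written by process=5
    )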
I0425 12:47:53.606991 137845388715840 max_utils.py:800] System Information: Jax Version: 0.9.2
I0425 12:47:53.607101 137845388715840 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0425 12:47:53.607141 137845388715840 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0425 12:47:53.607176 137845388715840 train_utils.py:391] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0425 12:47:54.305140 137845388715840 maxtext_utils.py:1771] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0425 12:47:54.305734 137845388715840 maxtext_utils.py:1771] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
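The repeated shape line describes a 13-axis device mesh with all 32 chips on the fourth axis; the later "fsdp: 32" line names that axis. A minimal sketch of building such a mesh; every axis name other than 'fsdp' is a placeholder, since the log does not print them:

    # Sketch: a 13-axis logical mesh matching shape (1, 1, 1, 32, 1, ...).
    import numpy as np
    import jax
    from jax.sharding import Mesh

    shape = (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)  # as logged
    axis_names = tuple("fsdp" if i == 3 else f"axis{i}" for i in range(13))
    mesh = Mesh(np.asarray(jax.devices()).reshape(shape), axis_names)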
I0425 12:47:54.305915 137845388715840 checkpointing.py:688] Setting up checkpoint logger...
I0425 12:47:54.305966 137845388715840 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0425 12:47:54.306014 137845388715840 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0425 12:47:54.306348 137845388715840 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7d5e10bea6c0>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0425 12:47:57.202300 137845388715840 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0425 12:47:57.202541 137845388715840 checkpoint_manager.py:708] [process=5][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d49f810eb40>}, handler_registry=None
I0425 12:47:57.202795 137845388715840 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d49f810eb40>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0425 12:47:57.202860 137845388715840 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d49f81120c0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0425 12:47:57.202907 137845388715840 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d49f810eb40>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d49f810eb40>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d49f81120c0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d49f81120c0>}).
I0425 12:47:57.203249 137845388715840 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0425 12:47:57.203323 137845388715840 async_checkpointer.py:192] [process=5][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7d48d8481e40> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0425 12:47:57.908338 137845388715840 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260425_121355/linen_xpk_feat_nnx_post_train_fixes_20260425_121355_13_scan_layers_false/checkpoints
I0425 12:47:57.961826 137845388715840 checkpoint_manager.py:929] [process=5][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260425_121355/linen_xpk_feat_nnx_post_train_fixes_20260425_121355_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7d49f810f560>
I0425 12:47:57.961976 137845388715840 checkpointing.py:302] Checkpoint manager created!
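The options dump above shows async checkpointing with a FixedIntervalPolicy(interval=10) writing into a GCS root directory. A rough Orbax equivalent, not MaxText's actual construction code (the directory is elided here):

    import orbax.checkpoint as ocp

    options = ocp.CheckpointManagerOptions(
        save_interval_steps=10,           # matches FixedIntervalPolicy(interval=10)
        enable_async_checkpointing=True,  # saves finish on a background thread
    )
    mngr = ocp.CheckpointManager(
        "gs://lance-maxtext/.../checkpoints",  # root_directory as logged, path elided
        options=options,
    )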
I0425 12:47:58.901810 137845388715840 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0425 12:47:58.901917 137845388715840 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0425 12:47:59.282283 137845388715840 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0425 12:47:59.282376 137845388715840 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0425 12:47:59.298517 137845388715840 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0425 12:47:59.298577 137845388715840 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0425 12:47:59.327843 137845388715840 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0425 12:47:59.327914 137845388715840 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0425 12:47:59.344430 137845388715840 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0425 12:47:59.344493 137845388715840 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0425 12:47:59.360749 137845388715840 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0425 12:47:59.360805 137845388715840 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0425 12:47:59.389159 137845388715840 attentions.py:1198] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0425 12:47:59.389230 137845388715840 attentions.py:1198] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0425 12:47:59.410909 137845388715840 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0425 12:47:59.410974 137845388715840 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
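Each Logical/Physical pair above shows a logical activation spec being resolved against axis rules so that only the batch dimension lands on 'fsdp'. A minimal Flax sketch of that mechanism; the rule set here is illustrative, not MaxText's full table:

    import flax.linen as nn

    rules = (("activation_batch", "fsdp"),
             ("activation_norm_length", None),
             ("activation_embed", None))

    def constrain(x):
        # Resolves to the physical spec ('fsdp', None, None), as logged.
        with nn.logical_axis_rules(rules):
            return nn.with_logical_constraint(
                x, ("activation_batch", "activation_norm_length", "activation_embed"))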
I0425 12:48:02.337477 137845388715840 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0425 12:48:02.337613 137845388715840 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0425 12:48:09.656449 137845388715840 maxtext_utils.py:1880]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 12:48:09.656587 137845388715840 maxtext_utils.py:1880]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 12:48:09.656641 137845388715840 maxtext_utils.py:1880]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 12:48:09.656714 137845388715840 maxtext_utils.py:1880]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 12:48:09.656755 137845388715840 maxtext_utils.py:1880]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 12:48:09.656790 137845388715840 maxtext_utils.py:1880]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 12:48:09.656840 137845388715840 maxtext_utils.py:1880]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 12:48:09.656888 137845388715840 maxtext_utils.py:1880]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 12:48:09.656931 137845388715840 maxtext_utils.py:1880]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 12:48:09.656966 137845388715840 maxtext_utils.py:1880]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
[layers_1 through layers_15 elided: each logs the same nine entries as layers_0 (mlp/wi_0, mlp/wi_1, mlp/wo, the pre/post_self_attention_layer_norm scales, and the self_attention query/key/value/out kernels) with identical shapes, logical specs, and physical shardings.]
I0425 12:48:09.661150 137845388715840 maxtext_utils.py:1880]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0425 12:48:09.661195 137845388715840 maxtext_utils.py:1880]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
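Each parameter entry above pairs a logical PartitionSpec with its resolution on the mesh. A sketch of attaching one such resolved sharding to an array; 'mesh' is assumed from the mesh sketch earlier, and the shape is the wi_0 kernel's:

    import jax
    import jax.numpy as jnp
    from jax.sharding import NamedSharding, PartitionSpec as P

    physical = NamedSharding(mesh, P("fsdp", None))  # 'embed' -> 'fsdp', 'mlp' -> unsharded
    kernel = jax.device_put(jnp.zeros((2048, 7168), jnp.float32), physical)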
I0425 12:48:14.235663 137845388715840 train.py:157] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0425 12:48:14.235756 137845388715840 train.py:157] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0425 12:48:14.251172 137845388715840 train.py:164] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0425 12:48:14.251234 137845388715840 train.py:164] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
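The train/xent and train/z_loss tensors are both per-token, shape [32, 2048]. A sketch of the usual definitions, assuming standard softmax cross-entropy plus the squared-log-partition z-loss stabilizer; the z_loss weight here is illustrative, MaxText reads it from config:

    import jax.numpy as jnp
    from jax.scipy.special import logsumexp
    from jax.nn import one_hot

    def per_token_losses(logits, targets, z_loss_weight=1e-4):
        # logits: [batch, length, vocab]; targets: [batch, length] int ids
        log_z = logsumexp(logits, axis=-1)                 # [batch, length]
        log_probs = logits - log_z[..., None]
        xent = -jnp.sum(one_hot(targets, logits.shape[-1]) * log_probs, axis=-1)
        z_loss = z_loss_weight * jnp.square(log_z)         # penalizes logit drift
        return xent, z_loss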
I0425 12:49:10.997792 137845388715840 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0425 12:49:10.998969 137845388715840 metric_logger.py:301] number parameters: 1.104 billion
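The 1.104 billion figure is reproducible from the shapes dumped above (16 decoder layers, as the layers_0 through layers_15 entries imply):

    per_layer = (
        3 * 2048 * 16 * 128   # query/key/value kernels [2048,16,128]
        + 16 * 128 * 2048     # out kernel [16,128,2048]
        + 2 * 2048 * 7168     # mlp wi_0 and wi_1 [2048,7168]
        + 7168 * 2048         # mlp wo [7168,2048]
        + 2 * 2048            # pre/post attention norm scales
    )
    total = (16 * per_layer   # layers_0 .. layers_15
             + 32000 * 2048   # token_embedder
             + 2048 * 32000   # logits_dense
             + 2048)          # decoder_norm scale
    print(total / 1e9)        # 1.104218112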
I0425 12:50:11.849077 137845388715840 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0425 12:50:12.101741 137845388715840 checkpointing.py:798] Waited 0.25264644622802734 seconds for step 0 to finish before starting checkpointing.
I0425 12:50:12.105108 137845388715840 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0425 12:50:12.107161 137845388715840 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 0
I0425 12:50:12.108947 137845388715840 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260425_121355/linen_xpk_feat_nnx_post_train_fixes_20260425_121355_13_scan_layers_false/checkpoints/0.
I0425 12:50:12.438355 137845388715840 signaling_client.py:364] Using JaxDistributedSignalingClient
I0425 12:50:12.439373 137845388715840 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0425 12:50:12.439500 137845388715840 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0425 12:50:12.734194 137845388715840 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.296341s
I0425 12:50:12.734360 137845388715840 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 4.658 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.331143856048584 s) (per-host)
I0425 12:50:12.734411 137845388715840 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.331204s (batch_requests_ready=0.018240s, total_serialization_initiated=0.312896s, others=0.000067s)
I0425 12:50:12.734668 137845388715840 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.335456s (all_items=0.000018s, per_item={'items': '0.00001812'}, temp_paths=0.335438)
I0425 12:50:12.735460 137845388715840 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.63 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260425_121355/linen_xpk_feat_nnx_post_train_fixes_20260425_121355_13_scan_layers_false/checkpoints/0.
I0425 12:50:12.735815 137717180708608 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-25 13:10:12.735777
I0425 12:50:12.801068 137845388715840 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0425 12:50:12.801446 137717151328000 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0425 12:50:12.801609 137845388715840 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260425_121355/linen_xpk_feat_nnx_post_train_fixes_20260425_121355_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777121412.1050901, 'wait_for_prev_duration_secs': 6.198883056640625e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1777121412.107251, 'checkpointer_blocking_duration_secs': 0.628730297088623, 'get_old_steps_start_time': 1777121412.7360072, 'get_old_steps_duration_secs': 2.9325485229492188e-05, 'checkpoint_manager_blocking_start_time': 1777121412.1033607, 'checkpoint_manager_blocking_duration_secs': 0.6982080936431885}
I0425 12:50:12.801830 137845388715840 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
I0425 12:50:12.801885 137845388715840 max_utils.py:750] 
Memstats: After params initialized:
I0425 12:50:12.801944 137845388715840 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_18(process=5,(2,4,0,0))
I0425 12:50:12.801978 137845388715840 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_19(process=5,(3,4,0,0))
I0425 12:50:12.802003 137845388715840 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_22(process=5,(2,5,0,0))
I0425 12:50:12.802028 137845388715840 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_23(process=5,(3,5,0,0))
I0425 12:50:13.152987 137845388715840 metric_logger.py:196] completed step: 0, seconds: 60.850, TFLOP/s/device: 0.223, Tokens/s/device: 33.657, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0425 12:50:13.309063 137845388715840 metric_logger.py:196] completed step: 1, seconds: 1.295, TFLOP/s/device: 10.489, Tokens/s/device: 1580.947, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0425 12:50:13.735559 137845388715840 metric_logger.py:196] completed step: 2, seconds: 0.028, TFLOP/s/device: 485.237, Tokens/s/device: 73140.245, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28798.279
I0425 12:50:13.874826 137845388715840 metric_logger.py:196] completed step: 3, seconds: 0.427, TFLOP/s/device: 31.831, Tokens/s/device: 4797.961, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 16999.252
I0425 12:50:14.155458 137845388715840 metric_logger.py:196] completed step: 4, seconds: 0.143, TFLOP/s/device: 94.835, Tokens/s/device: 14294.489, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10779.387
I0425 12:50:14.167708 137845388715840 metric_logger.py:196] completed step: 5, seconds: 0.139, TFLOP/s/device: 97.727, Tokens/s/device: 14730.422, total_weights: 65536, loss: 8.901, lm_loss: 8.901, perplexity: 7335.804
I0425 12:50:15.852085    2809 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0425 12:50:17.949644 137717159720704 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260425_121355/linen_xpk_feat_nnx_post_train_fixes_20260425_121355_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_5
I0425 12:50:38.194907 137845388715840 metric_logger.py:196] completed step: 6, seconds: 0.282, TFLOP/s/device: 48.116, Tokens/s/device: 7252.536, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.671
I0425 12:50:38.332424 137845388715840 metric_logger.py:196] completed step: 7, seconds: 23.896, TFLOP/s/device: 0.569, Tokens/s/device: 85.705, total_weights: 65536, loss: 8.394, lm_loss: 8.394, perplexity: 4418.666
I0425 12:50:38.468689 137845388715840 metric_logger.py:196] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.527, Tokens/s/device: 14398.808, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.663
I0425 12:50:38.604053 137845388715840 checkpointing.py:794] Waiting for step 9 to finish before checkpoint...
I0425 12:50:38.607620 137845388715840 checkpointing.py:798] Waited 0.003586292266845703 seconds for step 9 to finish before starting checkpointing.
I0425 12:50:38.611477 137845388715840 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0425 12:50:46.529753 137717180708608 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 46.287 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 34.1264762878418 s) (per-host)
I0425 12:50:46.529883 137717180708608 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 33.793950s.
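Note the two bandwidths in this save: the blocking device-to-host copy ran at roughly 4.7 GiB/s, while the background write to GCS ran at roughly 46 MiB/s. The latter is consistent with the logged totals (the 1.5 GiB figure is rounded):

    print(1.5 * 1024 / 34.1264762878418)  # ~45.0 MiB/s vs. logged 46.287 MiB/s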
I0425 12:50:54.683811 137717180708608 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 41.947862s.
I0425 12:50:54.684082 137717151328000 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0425 12:50:54.684197 137717151328000 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0425 12:50:54.684248 137717151328000 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0425 12:50:54.685931 137717151328000 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0425 12:50:54.686106 137845388715840 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0425 12:50:54.686243 137845388715840 checkpoint_manager.py:1452] Waiting for previous save to complete took 16.074763 seconds. If this number is high, consider checkpointing less frequently.
I0425 12:50:54.688746 137845388715840 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 9
I0425 12:50:54.690843 137845388715840 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260425_121355/linen_xpk_feat_nnx_post_train_fixes_20260425_121355_13_scan_layers_false/checkpoints/9.
I0425 12:50:55.004861 137845388715840 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0425 12:50:55.005478 137845388715840 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0425 12:50:55.130075 137845388715840 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.126739s
I0425 12:50:55.130236 137845388715840 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 9.647 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.15990591049194336 s) (per-host)
I0425 12:50:55.130287 137845388715840 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.159966s (batch_requests_ready=0.017452s, total_serialization_initiated=0.142446s, others=0.000067s)
I0425 12:50:55.130549 137845388715840 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.164221s (all_items=0.000015s, per_item={'items': '0.00001502'}, temp_paths=0.164206)
I0425 12:50:55.131301 137845388715840 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.44 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260425_121355/linen_xpk_feat_nnx_post_train_fixes_20260425_121355_13_scan_layers_false/checkpoints/9.
I0425 12:50:55.131777 137717151328000 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-25 13:10:55.131739
I0425 12:50:55.138142 137845388715840 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0425 12:50:55.138369 137716641556224 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0425 12:50:55.138496 137845388715840 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260425_121355/linen_xpk_feat_nnx_post_train_fixes_20260425_121355_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777121438.611447, 'wait_for_prev_duration_secs': 16.07476282119751, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1777121454.6887853, 'checkpointer_blocking_duration_secs': 0.44313549995422363, 'get_old_steps_start_time': 1777121455.1319454, 'get_old_steps_duration_secs': 3.528594970703125e-05, 'checkpoint_manager_blocking_start_time': 1777121438.6084914, 'checkpoint_manager_blocking_duration_secs': 16.52997064590454}
I0425 12:50:55.138713 137845388715840 checkpointing.py:409] Started an asynchronous checkpoint save for step 9
I0425 12:50:55.138764 137845388715840 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0425 12:51:00.256870 137717159720704 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260425_121355/linen_xpk_feat_nnx_post_train_fixes_20260425_121355_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_5
I0425 12:51:36.274757 137717151328000 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 38.243 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 41.304378271102905 s) (per-host)
I0425 12:51:36.274888 137717151328000 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 41.142996s.
I0425 12:51:45.754837 137717151328000 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 50.622928s.
I0425 12:51:45.755138 137716641556224 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0425 12:51:45.755268 137716641556224 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0425 12:51:45.755334 137716641556224 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0425 12:51:45.758969 137716641556224 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0425 12:51:45.759168 137845388715840 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0425 12:51:45.759325 137845388715840 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0425 12:51:45.760185 137845388715840 metric_logger.py:196] completed step: 9, seconds: 0.138, TFLOP/s/device: 98.770, Tokens/s/device: 14887.724, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3598.738
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
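These per-step totals line up with the steady-state step metrics above; checking against the step-9 line (which implies the 13.59 TFLOPs figure is per device):

    import math

    step_time = 0.138                # seconds, from the step-9 line
    print(13.59 / step_time)         # ~98.5 TFLOP/s/device (logged: 98.770)
    print(65536 / 32 / step_time)    # ~14,841 tokens/s/device (logged: 14,887.724)
    print(math.exp(8.188))           # ~3,598 perplexity = exp(loss) (logged: 3598.738)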
XPK End: Sat Apr 25 12:51:55 UTC 2026
EXIT_CODE=0