Log Summary

XPK Start: Sat Apr 25 20:47:59 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
`rope_parameters`'s factor field must be a float >= 1, got 40
`rope_parameters`'s beta_fast field must be a float, got 32
`rope_parameters`'s beta_slow field must be a float, got 1
DeepseekV32Config got `key=rope_scaling` in kwargs but hasn't set it as attribute. For RoPE standardization you need to set `self.rope_parameters` in model's config. 
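
The three `rope_parameters` warnings above are type complaints: the YaRN fields arrived as ints where transformers wants floats. A minimal sketch of a well-typed dict, using only the field names that appear in the warnings; the "rope_type" key and surrounding structure are assumptions, not this run's actual config:

    # Hedged sketch: factor / beta_fast / beta_slow come from the warnings
    # above; "rope_type" is an assumed key, shown only for illustration.
    rope_parameters = {
        "rope_type": "yarn",
        "factor": 40.0,     # was the int 40; must be a float >= 1
        "beta_fast": 32.0,  # was the int 32; must be a float
        "beta_slow": 1.0,   # was the int 1; must be a float
    }
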
2026-04-25 20:48:24.633515: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0425 20:48:24.846178 135106378921792 max_utils.py:273] Attempting to initialize the jax distributed system...
I0425 20:48:33.886819 135106378921792 distributed.py:149] Starting JAX distributed service on [::]:8482
I0425 20:48:33.889190 135106378921792 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-w471p-slice-job-0-0.mt-13-scan-layers-false-w471p:8482
I0425 20:48:35.232175 135106378921792 max_utils.py:284] Jax distributed system initialized!
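
What max_utils does in the four lines above, sketched with the public JAX API. The coordinator address is the one logged; num_processes=8 and process_id=3 are inferences from the 32-device mesh and the [process=3] tags later in the log, not values confirmed by it:

    # Minimal sketch of multi-host JAX init; values are illustrative.
    import jax

    jax.distributed.initialize(
        coordinator_address="mt-13-scan-layers-false-w471p-slice-job-0-0.mt-13-scan-layers-false-w471p:8482",
        num_processes=8,   # assumption: 32 chips at 4 per host
        process_id=3,      # this host, per the [process=3] tags below
    )
    print(jax.device_count(), jax.local_device_count())  # e.g. 32, 4
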
I0425 20:48:40.369496 135106378921792 max_utils.py:800] System Information: Jax Version: 0.9.2
I0425 20:48:40.369604 135106378921792 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0425 20:48:40.369645 135106378921792 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0425 20:48:40.369681 135106378921792 train_utils.py:361] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0425 20:48:41.063331 135106378921792 maxtext_utils.py:1565] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
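
The shape tuple above is a 13-axis MaxText mesh with every axis trivial except the one carrying all 32 devices. A simplified sketch that keeps only that nontrivial axis, named 'fsdp' per the "fsdp: 32" line further down (not MaxText's actual 13-axis construction):

    import jax
    from jax.experimental import mesh_utils
    from jax.sharding import Mesh

    # Collapse the logged (1, 1, 1, 32, 1, ...) shape to its one real axis.
    devices = mesh_utils.create_device_mesh((32,))
    mesh = Mesh(devices, axis_names=("fsdp",))
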
I0425 20:48:41.063616 135106378921792 checkpointing.py:677] Setting up checkpoint logger...
I0425 20:48:41.063668 135106378921792 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0425 20:48:41.063714 135106378921792 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0425 20:48:41.064077 135106378921792 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7ae060318440>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0425 20:48:43.980683 135106378921792 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0425 20:48:43.980943 135106378921792 checkpoint_manager.py:708] [process=3][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7acc906df860>}, handler_registry=None
I0425 20:48:43.981180 135106378921792 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7acc906df860>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0425 20:48:43.981228 135106378921792 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7adf2359fc80>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0425 20:48:43.981264 135106378921792 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7acc906df860>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7acc906df860>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7adf2359fc80>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7adf2359fc80>}).
I0425 20:48:43.981583 135106378921792 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0425 20:48:43.981654 135106378921792 async_checkpointer.py:192] [process=3][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7acc90503560> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0425 20:48:45.348479 135106378921792 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260425_201227/linen_xpk_test_pipeline_scan_nnx_20260425_201227_13_scan_layers_false/checkpoints
I0425 20:48:45.783745 135106378921792 checkpoint_manager.py:929] [process=3][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260425_201227/linen_xpk_test_pipeline_scan_nnx_20260425_201227_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7acc906dd190>
I0425 20:48:45.783964 135106378921792 checkpointing.py:301] Checkpoint manager created!
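
A hedged reconstruction of the manager configured in the preceding lines. The option names below appear verbatim in the CheckpointManagerOptions repr above; the directory is shortened to a placeholder for readability:

    import orbax.checkpoint as ocp

    options = ocp.CheckpointManagerOptions(
        save_interval_steps=10,           # log: save_decision_policy=FixedIntervalPolicy(interval=10)
        enable_async_checkpointing=True,  # log: enable_async_checkpointing=True
        create=True,                      # log: create=True
    )
    mngr = ocp.CheckpointManager(
        "gs://.../checkpoints",  # placeholder for the root_directory logged above
        options=options,
    )
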
I0425 20:48:46.733009 135106378921792 nnx_wrappers.py:453] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0425 20:48:46.733129 135106378921792 nnx_wrappers.py:453] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0425 20:48:47.115323 135106378921792 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0425 20:48:47.115420 135106378921792 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0425 20:48:47.131765 135106378921792 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0425 20:48:47.131826 135106378921792 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0425 20:48:47.162294 135106378921792 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0425 20:48:47.162366 135106378921792 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0425 20:48:47.178832 135106378921792 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0425 20:48:47.178896 135106378921792 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0425 20:48:47.195222 135106378921792 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0425 20:48:47.195282 135106378921792 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0425 20:48:47.225160 135106378921792 attentions.py:1198] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0425 20:48:47.225231 135106378921792 attentions.py:1198] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0425 20:48:47.247182 135106378921792 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0425 20:48:47.247250 135106378921792 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
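
A sketch of the logical-to-physical resolution these nnx_wrappers/attentions/linears lines report, assuming the mesh from the earlier sketch. Under rules like the ones below, only 'activation_batch' lands on a mesh axis, which is exactly the ('fsdp', None, None) physical spec logged:

    import jax.numpy as jnp
    import flax.linen as nn

    x = jnp.zeros((32, 2048, 2048), jnp.bfloat16)  # activation shape from the log

    rules = (("activation_batch", "fsdp"),
             ("activation_norm_length", None),
             ("activation_embed", None))

    # Only 'activation_batch' maps to a mesh axis; the rest replicate.
    with mesh, nn.logical_axis_rules(rules):
        x = nn.with_logical_constraint(
            x, ("activation_batch", "activation_norm_length", "activation_embed"))
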
I0425 20:48:50.186128 135106378921792 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0425 20:48:50.186260 135106378921792 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
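
The restore-or-initialize branch logged here, sketched against the manager from the earlier snippet; init_train_state is a hypothetical stand-in for MaxText's fresh initialization:

    latest = mngr.latest_step()        # None on this fresh run, per the log
    if latest is not None:
        state = mngr.restore(latest)   # resume this run's existing checkpoint
    else:
        state = init_train_state()     # hypothetical fresh-init helper
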
fsdp: 32
I0425 20:48:57.498394 135106378921792 maxtext_utils.py:1668]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.498538 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.498592 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.498648 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.498687 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.498724 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.498778 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.498827 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.498867 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.498902 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.498947 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.498981 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.499016 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.499047 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.499077 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.499109 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.499140 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.499174 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.499207 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.499240 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_10/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.499271 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_10/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.499302 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_10/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.499332 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_10/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.499361 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.499392 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_10/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.499423 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_10/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.499453 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_10/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.499487 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_10/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.499519 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_11/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.499549 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_11/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.499579 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_11/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.499608 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_11/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.499638 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.499667 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_11/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.499697 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_11/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.499726 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_11/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.499756 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_11/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.499784 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_12/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.499813 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_12/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.499842 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_12/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.499871 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_12/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.499900 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.499949 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_12/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.499985 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_12/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.500017 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_12/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.500050 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_12/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.500080 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_13/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.500108 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_13/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.500137 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_13/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.500165 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_13/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.500191 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.500221 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_13/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.500249 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_13/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.500279 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_13/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.500308 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_13/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.500337 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_14/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.500365 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_14/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.500394 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_14/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.500421 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_14/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.500448 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.500483 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_14/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.500513 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_14/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.500542 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_14/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.500571 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_14/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.500601 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_15/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.500630 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_15/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.500659 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_15/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.500686 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_15/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.500714 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.500743 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_15/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.500772 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_15/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.500801 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_15/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.500830 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_15/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.500859 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_2/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.500888 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_2/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.500918 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_2/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.500959 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_2/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.500988 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.501017 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_2/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501047 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_2/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.501078 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_2/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501107 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_2/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501136 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_3/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.501165 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_3/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.501194 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_3/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.501221 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_3/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.501248 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.501280 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_3/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501310 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_3/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.501339 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_3/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501367 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_3/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501396 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_4/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.501425 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_4/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.501454 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_4/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.501488 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_4/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.501516 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.501544 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501573 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.501602 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501631 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501660 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.501688 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.501717 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.501744 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.501770 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.501799 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501828 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.501857 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501888 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.501917 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.501959 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.501988 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.502016 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.502043 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.502072 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502100 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.502129 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502157 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502187 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.502215 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.502244 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.502271 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.502298 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.502326 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502354 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.502382 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502411 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502440 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.502468 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.502501 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.502528 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.502555 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.502585 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502614 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.502642 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502671 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502700 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.502728 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0425 20:48:57.502757 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0425 20:48:57.502783 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.502810 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0425 20:48:57.502839 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502868 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0425 20:48:57.502897 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502939 135106378921792 maxtext_utils.py:1668]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0425 20:48:57.502989 135106378921792 maxtext_utils.py:1668]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0425 20:48:57.503032 135106378921792 maxtext_utils.py:1668]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
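
How one entry from the dump above resolves, sketched with flax's spmd helper and the mesh from the earlier sketch: a logical P('embed', 'mlp') under a rule mapping 'embed' to 'fsdp' yields exactly the ('fsdp', None) physical spec logged for every wi_0/wi_1 kernel:

    import flax.linen as nn
    from jax.sharding import PartitionSpec as P

    rules = (("embed", "fsdp"), ("mlp", None))
    sharding = nn.logical_to_mesh_sharding(P("embed", "mlp"), mesh, rules)
    # -> NamedSharding(mesh, PartitionSpec('fsdp', None))
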
I0425 20:49:02.058582 135106378921792 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0425 20:49:02.058676 135106378921792 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0425 20:49:02.074103 135106378921792 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0425 20:49:02.074162 135106378921792 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0425 20:49:59.041019 135106378921792 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0425 20:49:59.042208 135106378921792 metric_logger.py:301] number parameters: 1.104 billion
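
The 1.104 billion figure checks out against the shapes dumped above (16 decoder layers, embed 2048, mlp 7168, 16 heads of 128, vocab 32000); a quick back-of-envelope:

    embed, mlp, heads, head_dim, vocab, layers = 2048, 7168, 16, 128, 32000, 16
    per_layer = (3 * embed * mlp                   # wi_0, wi_1, wo kernels
                 + 4 * embed * heads * head_dim    # query, key, value, out kernels
                 + 2 * embed)                      # pre/post attention norms
    total = layers * per_layer + 2 * vocab * embed + embed  # + embedder, logits_dense, decoder_norm
    print(total / 1e9)  # ~1.104
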
I0425 20:51:00.344890 135106378921792 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0425 20:51:00.594172 135106378921792 checkpointing.py:776] Waited 0.24926090240478516 seconds for step 0 to finish before starting checkpointing.
I0425 20:51:00.598872 135106378921792 checkpoint_manager.py:2009] [process=3][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0425 20:51:00.602123 135106378921792 checkpoint_manager.py:1512] [process=3] Saving checkpoint at step 0
I0425 20:51:00.603749 135106378921792 event_tracking.py:70] [process=3] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260425_201227/linen_xpk_test_pipeline_scan_nnx_20260425_201227_13_scan_layers_false/checkpoints/0.
I0425 20:51:00.974038 135106378921792 signaling_client.py:364] Using JaxDistributedSignalingClient
I0425 20:51:00.975060 135106378921792 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0425 20:51:00.975184 135106378921792 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0425 20:51:01.273464 135106378921792 base_pytree_checkpoint_handler.py:154] [process=3][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.299954s
I0425 20:51:01.273637 135106378921792 base_pytree_checkpoint_handler.py:130] [process=3] /jax/orbax/write/blocking_gbytes_per_sec: 4.609 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.3347017765045166 s) (per-host)
I0425 20:51:01.273695 135106378921792 base_pytree_checkpoint_handler.py:768] [process=3][thread=MainThread] Initiated Pytree async_save. Time taken: 0.334769s (batch_requests_ready=0.017664s, total_serialization_initiated=0.317030s, others=0.000075s)
I0425 20:51:01.273975 135106378921792 composite_checkpoint_handler.py:715] [process=3][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.339079s (all_items=0.000018s, per_item={'items': '0.00001812'}, temp_paths=0.339061)
I0425 20:51:01.274838 135106378921792 event_tracking.py:125] [process=3] [async] Finished blocking save in 0.67 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260425_201227/linen_xpk_test_pipeline_scan_nnx_20260425_201227_13_scan_layers_false/checkpoints/0.
I0425 20:51:01.275222 134980857935616 async_checkpointer.py:76] [process=3][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-25 21:11:01.275183
I0425 20:51:01.328358 135106378921792 checkpoint_manager.py:1560] [process=3][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0425 20:51:01.328718 134980831708928 async_checkpointer.py:280] [process=3][thread=save_finalize] Waiting for background save thread=async_save.
I0425 20:51:01.328883 135106378921792 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260425_201227/linen_xpk_test_pipeline_scan_nnx_20260425_201227_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777150260.5988536, 'wait_for_prev_duration_secs': 6.127357482910156e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1777150260.6021614, 'checkpointer_blocking_duration_secs': 0.6732203960418701, 'get_old_steps_start_time': 1777150261.2754078, 'get_old_steps_duration_secs': 2.86102294921875e-05, 'checkpoint_manager_blocking_start_time': 1777150260.5960424, 'checkpoint_manager_blocking_duration_secs': 0.7327990531921387}
I0425 20:51:01.329098 135106378921792 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
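
The async pattern visible in the lines above, sketched with orbax against the earlier manager (state is a hypothetical train-state pytree): save() blocks only for the device-to-host copy, roughly the 0.67 seconds logged, while a background thread streams to GCS as training proceeds:

    mngr.save(0, args=ocp.args.StandardSave(state))  # returns after the blocking phase
    # ... training steps overlap the background upload ...
    mngr.wait_until_finished()                       # join the background write
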
I0425 20:51:01.329154 135106378921792 max_utils.py:750] 
Memstats: After params initialized:
I0425 20:51:01.329209 135106378921792 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_10(process=3,(2,2,0,0))
I0425 20:51:01.329242 135106378921792 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_11(process=3,(3,2,0,0))
I0425 20:51:01.329269 135106378921792 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_14(process=3,(2,3,0,0))
I0425 20:51:01.329294 135106378921792 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_15(process=3,(3,3,0,0))
I0425 20:51:01.685687 135106378921792 metric_logger.py:196] completed step: 0, seconds: 61.303, TFLOP/s/device: 0.222, Tokens/s/device: 33.408, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0425 20:51:01.837285 135106378921792 metric_logger.py:196] completed step: 1, seconds: 1.332, TFLOP/s/device: 10.201, Tokens/s/device: 1537.631, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0425 20:51:02.291344 135106378921792 metric_logger.py:196] completed step: 2, seconds: 0.024, TFLOP/s/device: 558.612, Tokens/s/device: 84200.140, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28796.281
I0425 20:51:02.427743 135106378921792 metric_logger.py:196] completed step: 3, seconds: 0.455, TFLOP/s/device: 29.886, Tokens/s/device: 4504.762, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 16999.936
I0425 20:51:02.707996 135106378921792 metric_logger.py:196] completed step: 4, seconds: 0.143, TFLOP/s/device: 95.151, Tokens/s/device: 14342.138, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10778.760
I0425 20:51:02.720049 135106378921792 metric_logger.py:196] completed step: 5, seconds: 0.136, TFLOP/s/device: 99.639, Tokens/s/device: 15018.627, total_weights: 65536, loss: 8.900, lm_loss: 8.900, perplexity: 7335.285
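
Sanity check on the metrics above: the logged perplexity is just exp(loss), up to rounding of the printed loss:

    import math
    print(math.exp(10.877))  # ~52.9e3, step 0's perplexity
    print(math.exp(8.900))   # ~7332, vs. step 5's logged 7335
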
I0425 20:51:04.895342    2776 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0425 20:51:07.496094 134980840101632 array_metadata_store.py:203] [process=3][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260425_201227/linen_xpk_test_pipeline_scan_nnx_20260425_201227_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_3
I0425 20:51:27.925948 135106378921792 metric_logger.py:196] completed step: 6, seconds: 0.281, TFLOP/s/device: 48.333, Tokens/s/device: 7285.327, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5441.451
I0425 20:51:28.066003 135106378921792 metric_logger.py:196] completed step: 7, seconds: 25.075, TFLOP/s/device: 0.542, Tokens/s/device: 81.676, total_weights: 65536, loss: 8.393, lm_loss: 8.393, perplexity: 4418.143
I0425 20:51:28.202151 135106378921792 metric_logger.py:196] completed step: 8, seconds: 0.143, TFLOP/s/device: 95.273, Tokens/s/device: 14360.643, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.886
I0425 20:51:28.338140 135106378921792 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0425 20:51:28.341659 135106378921792 checkpointing.py:776] Waited 0.003537893295288086 seconds for step 9 to finish before starting checkpointing.
I0425 20:51:28.346298 135106378921792 checkpoint_manager.py:2020] [process=3][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0425 20:51:34.262700 134980857935616 base_pytree_checkpoint_handler.py:130] [process=3] /jax/orbax/write/gbytes_per_sec: 47.402 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 33.32370734214783 s) (per-host)
I0425 20:51:34.262832 134980857935616 async_checkpointer.py:90] [process=3][thread=async_save] 3 Handler Commit operations completed. Time taken: 32.987495s.
I0425 20:51:44.243271 134980857935616 async_checkpointer.py:160] [process=3][thread=async_save] Background save thread done. Time taken: 42.967917s.
I0425 20:51:44.243587 134980831708928 async_checkpointer.py:288] [process=3][thread=save_finalize] Done with waiting for background save thread=async_save.
I0425 20:51:44.243739 134980831708928 async_checkpointer.py:298] [process=3][thread=save_finalize] No errors found in background save thread=async_save.
I0425 20:51:44.243795 134980831708928 checkpoint_manager.py:2137] [process=3][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0425 20:51:44.245418 134980831708928 checkpoint_manager.py:2146] [process=3][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0425 20:51:44.245621 135106378921792 checkpoint_manager.py:2032] [process=3][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0425 20:51:44.245788 135106378921792 checkpoint_manager.py:1452] Waiting for previous save to complete took 15.899492 seconds. If this number is high, consider checkpointing less frequently.
I0425 20:51:44.248430 135106378921792 checkpoint_manager.py:1512] [process=3] Saving checkpoint at step 9
I0425 20:51:44.250561 135106378921792 event_tracking.py:70] [process=3] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260425_201227/linen_xpk_test_pipeline_scan_nnx_20260425_201227_13_scan_layers_false/checkpoints/9.
I0425 20:51:44.568980 135106378921792 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0425 20:51:44.569152 135106378921792 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0425 20:51:44.691519 135106378921792 base_pytree_checkpoint_handler.py:154] [process=3][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.124008s
I0425 20:51:44.691676 135106378921792 base_pytree_checkpoint_handler.py:130] [process=3] /jax/orbax/write/blocking_gbytes_per_sec: 9.811 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.1572248935699463 s) (per-host)
I0425 20:51:44.691725 135106378921792 base_pytree_checkpoint_handler.py:768] [process=3][thread=MainThread] Initiated Pytree async_save. Time taken: 0.157283s (batch_requests_ready=0.017523s, total_serialization_initiated=0.139695s, others=0.000065s)
I0425 20:51:44.692025 135106378921792 composite_checkpoint_handler.py:715] [process=3][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.161552s (all_items=0.000017s, per_item={'items': '0.00001693'}, temp_paths=0.161535)
I0425 20:51:44.692796 135106378921792 event_tracking.py:125] [process=3] [async] Finished blocking save in 0.44 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260425_201227/linen_xpk_test_pipeline_scan_nnx_20260425_201227_13_scan_layers_false/checkpoints/9.
I0425 20:51:44.693214 134980831708928 async_checkpointer.py:76] [process=3][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-25 21:11:44.693169
I0425 20:51:44.700992 135106378921792 checkpoint_manager.py:1560] [process=3][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0425 20:51:44.701286 134980326827776 async_checkpointer.py:280] [process=3][thread=save_finalize] Waiting for background save thread=async_save.
I0425 20:51:44.701448 135106378921792 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260425_201227/linen_xpk_test_pipeline_scan_nnx_20260425_201227_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777150288.3462672, 'wait_for_prev_duration_secs': 15.899491548538208, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1777150304.2484713, 'checkpointer_blocking_duration_secs': 0.44493770599365234, 'get_old_steps_start_time': 1777150304.693435, 'get_old_steps_duration_secs': 3.075599670410156e-05, 'checkpoint_manager_blocking_start_time': 1777150288.3426082, 'checkpoint_manager_blocking_duration_secs': 16.358805179595947}
I0425 20:51:44.701656 135106378921792 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0425 20:51:44.701703 135106378921792 checkpoint_manager.py:2020] [process=3][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0425 20:51:50.673218 134980840101632 array_metadata_store.py:203] [process=3][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260425_201227/linen_xpk_test_pipeline_scan_nnx_20260425_201227_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_3
I0425 20:52:27.496266 134980831708928 base_pytree_checkpoint_handler.py:130] [process=3] /jax/orbax/write/gbytes_per_sec: 36.767 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.961771965026855 s) (per-host)
I0425 20:52:27.496396 134980831708928 async_checkpointer.py:90] [process=3][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.803031s.
I0425 20:52:36.613482 134980831708928 async_checkpointer.py:160] [process=3][thread=async_save] Background save thread done. Time taken: 51.920101s.
I0425 20:52:36.613811 134980326827776 async_checkpointer.py:288] [process=3][thread=save_finalize] Done with waiting for background save thread=async_save.
I0425 20:52:36.613968 134980326827776 async_checkpointer.py:298] [process=3][thread=save_finalize] No errors found in background save thread=async_save.
I0425 20:52:36.614020 134980326827776 checkpoint_manager.py:2137] [process=3][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0425 20:52:36.615373 134980326827776 checkpoint_manager.py:2146] [process=3][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0425 20:52:36.615569 135106378921792 checkpoint_manager.py:2032] [process=3][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0425 20:52:36.615736 135106378921792 checkpoint_manager.py:2009] [process=3][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0425 20:52:36.616569 135106378921792 metric_logger.py:196] completed step: 9, seconds: 0.140, TFLOP/s/device: 97.036, Tokens/s/device: 14626.377, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3599.214
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
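
A hedged check of that 13.59 figure against the standard 6*N*T FLOPs estimate for a decoder-only transformer, with N from the "number parameters" line and T = total_weights tokens per step, split across 32 devices:

    n_params = 1.104e9   # from the metric_logger line above
    tokens = 65536       # total_weights per step
    print(6 * n_params * tokens / 32 / 1e12)  # ~13.57 TFLOPs/device/step
    # and at step 9: 13.59 TFLOPs / 0.140 s ~ 97.0 TFLOP/s/device, as logged
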
XPK End: Sat Apr 25 20:52:49 UTC 2026
EXIT_CODE=0