Log Summary

XPK Start: Sun Apr 19 03:59:20 UTC 2026
2026-04-19 03:59:24.359947: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1776571164.372555      10 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1776571164.376268      10 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1776571164.387524      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776571164.387542      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776571164.387545      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776571164.387547      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2026-04-19 03:59:43.521501: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0419 03:59:44.051907 133489086924608 max_utils.py:273] Attempting to initialize the jax distributed system...
INFO:2026-04-19 03:59:53,091:jax._src.distributed:140: Starting JAX distributed service on [::]:8482
I0419 03:59:53.091367 133489086924608 distributed.py:140] Starting JAX distributed service on [::]:8482
INFO:2026-04-19 03:59:53,093:jax._src.distributed:157: Connecting to JAX distributed service on mt-13-scan-layers-false-ox7ic-slice-job-0-0.mt-13-scan-layers-false-ox7ic:8482
I0419 03:59:53.093604 133489086924608 distributed.py:157] Connecting to JAX distributed service on mt-13-scan-layers-false-ox7ic-slice-job-0-0.mt-13-scan-layers-false-ox7ic:8482
I0419 03:59:54.734314 133489086924608 max_utils.py:284] Jax distributed system initialized!
I0419 04:00:00.462747 133489086924608 max_utils.py:800] System Information: Jax Version: 0.8.1
I0419 04:00:00.462855 133489086924608 max_utils.py:801] System Information: Jaxlib Version: 0.8.1
I0419 04:00:00.462896 133489086924608 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Nov 12 2025 14:16:36 (1762985796) cl/831091709
I0419 04:00:00.462932 133489086924608 train_utils.py:377] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0419 04:00:01.084418 133489086924608 maxtext_utils.py:1631] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0419 04:00:01.085014 133489086924608 maxtext_utils.py:1631] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
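An editorial aside on the mesh lines above (not part of the log): the 13-element shape is the device mesh, with all 32 devices on a single axis — the one the later "fsdp: 32" line identifies; the axis order is MaxText's internal convention and is not asserted here. The invariant the log reflects is that the product of the mesh axis sizes equals Num_devices, which can be checked in a few lines:

```python
import math

# Mesh shape copied from the log line above; all 32 devices sit on one
# axis (position and axis naming are MaxText-internal conventions).
mesh_shape = (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
num_devices = 32

# Invariant: mesh axis sizes multiply out to the total device count.
assert math.prod(mesh_shape) == num_devices
```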
I0419 04:00:01.085196 133489086924608 checkpointing.py:688] Setting up checkpoint logger...
I0419 04:00:01.085262 133489086924608 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0419 04:00:01.085309 133489086924608 pytree_checkpoint_handler.py:589] save_device_host_concurrent_bytes=None
I0419 04:00:01.085662 133489086924608 base_pytree_checkpoint_handler.py:415] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7967d5375160>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0419 04:00:04.185575 133489086924608 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0419 04:00:04.185931 133489086924608 checkpoint_manager.py:709] [process=3][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79554fe8f350>}, handler_registry=None
I0419 04:00:04.186167 133489086924608 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79554fe8f350>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0419 04:00:04.186217 133489086924608 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x79554fe967e0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0419 04:00:04.186268 133489086924608 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79554fe8f350>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79554fe8f350>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x79554fe967e0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x79554fe967e0>}).
I0419 04:00:04.186785 133489086924608 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.33
I0419 04:00:04.186872 133489086924608 async_checkpointer.py:177] [process=3][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x79554f5d4180> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0419 04:00:05.485335 133489086924608 checkpoint_manager.py:1818] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260419_025828/linen_xpk_feat_nnx_set_defaults_true_20260419_025828_13_scan_layers_false/checkpoints
I0419 04:00:05.898070 133489086924608 checkpoint_manager.py:929] [process=3][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260419_025828/linen_xpk_feat_nnx_set_defaults_true_20260419_025828_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x796806f440b0>
I0419 04:00:05.898276 133489086924608 checkpointing.py:302] Checkpoint manager created!
I0419 04:00:06.809573 133489086924608 nnx_wrappers.py:455] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0419 04:00:06.809758 133489086924608 nnx_wrappers.py:455] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0419 04:00:07.607756 133489086924608 attentions.py:1084] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0419 04:00:07.607896 133489086924608 attentions.py:1084] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0419 04:00:07.762270 133489086924608 attentions.py:1085] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0419 04:00:07.762404 133489086924608 attentions.py:1085] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0419 04:00:07.925734 133489086924608 attentions.py:1150] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0419 04:00:07.925869 133489086924608 attentions.py:1150] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0419 04:00:08.080817 133489086924608 attentions.py:1151] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0419 04:00:08.080957 133489086924608 attentions.py:1151] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0419 04:00:08.235675 133489086924608 attentions.py:1152] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0419 04:00:08.235803 133489086924608 attentions.py:1152] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0419 04:00:08.398132 133489086924608 attentions.py:1193] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0419 04:00:08.398290 133489086924608 attentions.py:1193] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0419 04:00:08.421357 133489086924608 linears.py:541] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0419 04:00:08.421453 133489086924608 linears.py:541] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0419 04:00:23.986982 133489086924608 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0419 04:00:23.987120 133489086924608 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
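An editorial aside (assumptions flagged in comments): the per-parameter sharding entries logged below resolve each logical PartitionSpec axis to a physical mesh axis. With fsdp=32 as the only non-trivial mesh axis, embed/batch-like axes land on 'fsdp' and every other axis is replicated (None). A minimal pure-Python sketch of that lookup — the rule table here is illustrative, inferred from the entries in this log, not MaxText's actual sharding-rule config:

```python
# Illustrative logical-axis -> mesh-axis rules, inferred from this log's
# sharding entries (NOT MaxText's real rule table).
LOGICAL_TO_MESH = {
    "embed": "fsdp",
    "activation_batch": "fsdp",
    "activation_kv_batch": "fsdp",
    # any axis with no rule resolves to None (replicated)
}

def resolve(logical_spec):
    """Map a tuple of logical axis names to physical mesh axes."""
    return tuple(LOGICAL_TO_MESH.get(axis) for axis in logical_spec)

print(resolve(("embed", "kv_heads", "kv_head_dim")))  # ('fsdp', None, None)
print(resolve(("mlp", "embed")))                      # (None, 'fsdp')
```

Applied to the entries below, this reproduces each logged Physical tuple from its Logical PartitionSpec.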
I0419 04:00:32 maxtext_utils.py:1740] Parameter sharding summary (condensed; every decoder layer in this excerpt — layers 0–4 and 10–15, listing truncated mid-entry at layers_4/pre_self_attention_layer_norm/scale — logged identical entries):

    Parameter (params/params/decoder/...)          Shape                 Logical PartitionSpec                 Physical
    decoder_norm/scale                             float32[2048]         ('norm',)                             (None,)
    layers_N/mlp/wi_0/kernel                       float32[2048,7168]    ('embed', 'mlp')                      ('fsdp', None)
    layers_N/mlp/wi_1/kernel                       float32[2048,7168]    ('embed', 'mlp')                      ('fsdp', None)
    layers_N/mlp/wo/kernel                         float32[7168,2048]    ('mlp', 'embed')                      (None, 'fsdp')
    layers_N/post_self_attention_layer_norm/scale  float32[2048]         ('norm',)                             (None,)
    layers_N/pre_self_attention_layer_norm/scale   float32[2048]         ('norm',)                             (None,)
    layers_N/self_attention/key/kernel             float32[2048,16,128]  ('embed', 'kv_heads', 'kv_head_dim')  ('fsdp', None, None)
    layers_N/self_attention/out/kernel             float32[16,128,2048]  ('heads', 'kv', 'embed')              (None, None, 'fsdp')
    layers_N/self_attention/query/kernel           float32[2048,16,128]  ('embed', 'q_heads', 'kv')            ('fsdp', None, None)
    layers_N/self_attention/value/kernel           float32[2048,16,128]  ('embed', 'kv_heads', 'kv_head_dim')  ('fsdp', None, None)
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0419 04:00:32.820417 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.820447 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 04:00:32.820478 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.820509 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.820539 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 04:00:32.820570 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 04:00:32.820600 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 04:00:32.820627 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0419 04:00:32.820655 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0419 04:00:32.820696 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.820728 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 04:00:32.820759 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.820790 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.820821 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 04:00:32.820851 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 04:00:32.820884 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 04:00:32.820914 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0419 04:00:32.820942 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0419 04:00:32.820974 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821005 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 04:00:32.821036 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821067 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821097 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 04:00:32.821131 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 04:00:32.821162 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 04:00:32.821189 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0419 04:00:32.821216 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0419 04:00:32.821258 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821291 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 04:00:32.821323 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821354 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821385 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 04:00:32.821415 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 04:00:32.821446 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 04:00:32.821473 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0419 04:00:32.821501 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0419 04:00:32.821532 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821563 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 04:00:32.821594 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821624 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821654 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 04:00:32.821685 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0419 04:00:32.821714 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0419 04:00:32.821742 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0419 04:00:32.821770 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0419 04:00:32.821800 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821831 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0419 04:00:32.821861 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821891 133489086924608 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0419 04:00:32.821941 133489086924608 maxtext_utils.py:1740]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   PartitionSpec('embed', 'vocab')
    Physical:  ('fsdp', None)
I0419 04:00:32.821986 133489086924608 maxtext_utils.py:1740]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   PartitionSpec('vocab', 'embed')
    Physical:  (None, 'fsdp')
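The sharding dump above pairs each parameter's logical axis names (from its `PartitionSpec`) with the physical mesh axes it is sharded over. In this run only the `embed` logical axis is mapped to a mesh axis (`fsdp`); every other logical axis is replicated. A minimal pure-Python sketch of that translation rule (not MaxText's actual implementation, and the rule table is inferred from this log):

```python
# Map logical axis names to physical mesh axes, as seen in the dump above:
# only 'embed' is sharded (over 'fsdp'); all other logical axes ('mlp',
# 'heads', 'kv', 'kv_heads', 'kv_head_dim', 'q_heads', 'vocab', 'norm')
# are replicated, i.e. map to None.
LOGICAL_TO_PHYSICAL = {
    "embed": "fsdp",
}

def physical_spec(logical_spec):
    """Translate a tuple of logical axis names into physical mesh axes."""
    return tuple(LOGICAL_TO_PHYSICAL.get(axis) for axis in logical_spec)

print(physical_spec(("embed", "mlp")))          # ('fsdp', None)
print(physical_spec(("heads", "kv", "embed")))  # (None, None, 'fsdp')
```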
I0419 04:00:36.755600 133489086924608 train.py:158] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0419 04:00:36.755696 133489086924608 train.py:158] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0419 04:00:36.770848 133489086924608 train.py:165] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0419 04:00:36.770907 133489086924608 train.py:165] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0419 04:01:33.458334 133489086924608 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.4 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0419 04:01:33.462163 133489086924608 metric_logger.py:289] number parameters: 1.104 billion
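The "1.104 billion" figure can be reproduced from the kernel shapes logged earlier. A sketch of that arithmetic, assuming 16 decoder layers (only layers 0-9 appear in this chunk) and ignoring any final decoder norm, whose ~2K parameters do not change the rounded total:

```python
# Dimensions taken from the shapes in the sharding dump above.
EMBED, MLP, HEADS, HEAD_DIM, VOCAB, LAYERS = 2048, 7168, 16, 128, 32000, 16

attn = 4 * EMBED * HEADS * HEAD_DIM   # query, key, value, out kernels
mlp = 2 * EMBED * MLP + MLP * EMBED   # wi_0, wi_1, wo kernels
norms = 2 * EMBED                     # pre/post self-attention layer-norm scales
per_layer = attn + mlp + norms

# Token embedding plus the untied logits_dense kernel.
total = LAYERS * per_layer + VOCAB * EMBED + EMBED * VOCAB
print(f"{total / 1e9:.3f} billion")   # 1.104 billion
```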
I0419 04:02:35.302855 133489086924608 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0419 04:02:35.561942 133489086924608 checkpointing.py:798] Waited 0.2590668201446533 seconds for step 0 to finish before starting checkpointing.
I0419 04:02:35.567160 133489086924608 checkpoint_manager.py:2013] [process=3][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0419 04:02:35.568971 133489086924608 checkpoint_manager.py:1518] [process=3] Saving checkpoint at step 0
I0419 04:02:35.571771 133489086924608 async_checkpointer.py:452] [process=3] Started async saving checkpoint to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260419_025828/linen_xpk_feat_nnx_set_defaults_true_20260419_025828_13_scan_layers_false/checkpoints/0.
I0419 04:02:37.030268 133489086924608 signaling_client.py:364] Using JaxDistributedSignalingClient
I0419 04:02:37.031454 133489086924608 jax_array_handlers.py:358] Scheduling D2H of 444 prioritized jax.Array.
I0419 04:02:37.032112 133489086924608 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0419 04:02:37.339119 133489086924608 base_pytree_checkpoint_handler.py:153] [process=3][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.309353s
I0419 04:02:37.339305 133489086924608 base_pytree_checkpoint_handler.py:129] [process=3] /jax/checkpoint/write/blocking_gbytes_per_sec: 1.498 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 1.0295953750610352 s) (per-host)
I0419 04:02:37.339357 133489086924608 base_pytree_checkpoint_handler.py:737] [process=3][thread=MainThread] Initiated Pytree async_save. Time taken: 1.029657s (batch_requests_ready=0.701431s, total_serialization_initiated=0.328157s, others=0.000069s)
I0419 04:02:37.339606 133489086924608 composite_checkpoint_handler.py:715] [process=3][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 1.035229s (all_items=0.000020s, per_item={'items': '0.00002003'}, temp_paths=1.035209)
I0419 04:02:37.340620 133368884729600 async_checkpointer.py:79] [process=3][thread=async_save] Background save thread started.
I0419 04:02:37.340790 133489086924608 async_checkpointer.py:561] Finished blocking save. Time taken: 1.771746s. Continuing background save to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260419_025828/linen_xpk_feat_nnx_set_defaults_true_20260419_025828_13_scan_layers_false/checkpoints/0.
I0419 04:02:37.795795 133489086924608 checkpoint_manager.py:1566] [process=3][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0419 04:02:37.796193 133364623333120 async_checkpointer.py:265] [process=3][thread=save_finalize] Waiting for background save thread=async_save.
I0419 04:02:37.796376 133489086924608 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260419_025828/linen_xpk_feat_nnx_set_defaults_true_20260419_025828_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776571355.567141, 'wait_for_prev_duration_secs': 6.461143493652344e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776571355.569011, 'checkpointer_blocking_duration_secs': 1.771890640258789, 'get_old_steps_start_time': 1776571357.3409216, 'get_old_steps_duration_secs': 2.6464462280273438e-05, 'checkpoint_manager_blocking_start_time': 1776571355.565384, 'checkpoint_manager_blocking_duration_secs': 2.2309529781341553}
I0419 04:02:37.796586 133489086924608 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
I0419 04:02:37.796641 133489086924608 max_utils.py:750] 
Memstats: After params initialized:
I0419 04:02:37.796699 133489086924608 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_10(process=3,(2,2,0,0))
I0419 04:02:37.796732 133489086924608 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_11(process=3,(3,2,0,0))
I0419 04:02:37.796760 133489086924608 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_14(process=3,(2,3,0,0))
I0419 04:02:37.796785 133489086924608 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_15(process=3,(3,3,0,0))
I0419 04:02:38.102200 133489086924608 metric_logger.py:185] completed step: 0, seconds: 61.841, TFLOP/s/device: 0.220, Tokens/s/device: 33.117, total_weights: 65536, loss: 10.877
I0419 04:02:38.342895 133489086924608 metric_logger.py:185] completed step: 1, seconds: 2.788, TFLOP/s/device: 4.874, Tokens/s/device: 734.595, total_weights: 65536, loss: 10.877
I0419 04:02:38.778726 133489086924608 metric_logger.py:185] completed step: 2, seconds: 0.116, TFLOP/s/device: 116.949, Tokens/s/device: 17627.819, total_weights: 65536, loss: 10.268
I0419 04:02:38.915339 133489086924608 metric_logger.py:185] completed step: 3, seconds: 0.436, TFLOP/s/device: 31.130, Tokens/s/device: 4692.211, total_weights: 65536, loss: 9.741
I0419 04:02:39.193283 133489086924608 metric_logger.py:185] completed step: 4, seconds: 0.143, TFLOP/s/device: 94.958, Tokens/s/device: 14313.070, total_weights: 65536, loss: 9.285
I0419 04:02:39.206248 133489086924608 metric_logger.py:185] completed step: 5, seconds: 0.136, TFLOP/s/device: 99.602, Tokens/s/device: 15013.122, total_weights: 65536, loss: 8.900
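The per-step throughput lines above follow directly from the step time: Tokens/s/device = total_weights / (seconds × device count). The device count is not printed in this chunk; 32 devices is an assumption that reproduces the logged numbers, with small differences attributable to rounding of the step time.

```python
def tokens_per_sec_per_device(total_tokens, step_seconds, num_devices):
    """Per-device token throughput for one training step."""
    return total_tokens / (step_seconds * num_devices)

# Step 2 above: 65536 tokens in 0.116 s; assuming 32 devices this lands
# within rounding error of the logged 17627.819 Tokens/s/device.
rate = tokens_per_sec_per_device(65536, 0.116, 32)
```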
I0419 04:03:01.702662    2664 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0419 04:03:01.829782 133489086924608 metric_logger.py:185] completed step: 6, seconds: 0.279, TFLOP/s/device: 48.679, Tokens/s/device: 7337.372, total_weights: 65536, loss: 8.602
I0419 04:03:01.966655 133489086924608 metric_logger.py:185] completed step: 7, seconds: 22.509, TFLOP/s/device: 0.604, Tokens/s/device: 90.986, total_weights: 65536, loss: 8.394
I0419 04:03:02.109590 133489086924608 metric_logger.py:185] completed step: 8, seconds: 0.126, TFLOP/s/device: 107.507, Tokens/s/device: 16204.583, total_weights: 65536, loss: 8.264
I0419 04:03:02.245784 133489086924608 checkpointing.py:794] Waiting for step 9 to finish before checkpoint...
I0419 04:03:02.249469 133489086924608 checkpointing.py:798] Waited 0.003703594207763672 seconds for step 9 to finish before starting checkpointing.
I0419 04:03:02.252994 133489086924608 checkpoint_manager.py:2024] [process=3][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0419 04:03:05.451015 133368893122304 array_metadata_store.py:203] [process=3][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260419_025828/linen_xpk_feat_nnx_set_defaults_true_20260419_025828_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_3
I0419 04:03:31.270906 133368884729600 base_pytree_checkpoint_handler.py:129] [process=3] /jax/checkpoint/write/gbytes_per_sec: 28.740 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 54.96114993095398 s) (per-host)
I0419 04:03:31.271032 133368884729600 async_checkpointer.py:90] [process=3][thread=async_save] 3 Handler Commit operations completed. Time taken: 53.930295s.
I0419 04:03:41.564152 133368884729600 async_checkpointer.py:144] [process=3][thread=async_save] Background save thread done. Time taken: 64.223398s.
I0419 04:03:41.564450 133364623333120 async_checkpointer.py:273] [process=3][thread=save_finalize] Done with waiting for background save thread=async_save.
I0419 04:03:41.564593 133364623333120 async_checkpointer.py:283] [process=3][thread=save_finalize] No errors found in background save thread=async_save.
I0419 04:03:41.564655 133364623333120 checkpoint_manager.py:2133] [process=3][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0419 04:03:41.572193 133364623333120 checkpoint_manager.py:2142] [process=3][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0419 04:03:41.572359 133489086924608 checkpoint_manager.py:2036] [process=3][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0419 04:03:41.572472 133489086924608 checkpoint_manager.py:1458] Waiting for previous save to complete took 39.319477 seconds. If this number is high, consider checkpointing less frequently.
I0419 04:03:41.574826 133489086924608 checkpoint_manager.py:1518] [process=3] Saving checkpoint at step 9
I0419 04:03:41.578319 133489086924608 async_checkpointer.py:452] [process=3] Started async saving checkpoint to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260419_025828/linen_xpk_feat_nnx_set_defaults_true_20260419_025828_13_scan_layers_false/checkpoints/9.
I0419 04:03:42.997824 133489086924608 jax_array_handlers.py:358] Scheduling D2H of 444 prioritized jax.Array.
I0419 04:03:42.997990 133489086924608 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0419 04:03:43.139579 133489086924608 base_pytree_checkpoint_handler.py:153] [process=3][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.143210s
I0419 04:03:43.139741 133489086924608 base_pytree_checkpoint_handler.py:129] [process=3] /jax/checkpoint/write/blocking_gbytes_per_sec: 1.793 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.8601913452148438 s) (per-host)
I0419 04:03:43.139792 133489086924608 base_pytree_checkpoint_handler.py:737] [process=3][thread=MainThread] Initiated Pytree async_save. Time taken: 0.860250s (batch_requests_ready=0.699300s, total_serialization_initiated=0.160884s, others=0.000066s)
I0419 04:03:43.140073 133489086924608 composite_checkpoint_handler.py:715] [process=3][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.864576s (all_items=0.000015s, per_item={'items': '0.00001526'}, temp_paths=0.864561)
I0419 04:03:43.141041 133365666113280 async_checkpointer.py:79] [process=3][thread=async_save] Background save thread started.
I0419 04:03:43.141174 133489086924608 async_checkpointer.py:561] Finished blocking save. Time taken: 1.566274s. Continuing background save to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260419_025828/linen_xpk_feat_nnx_set_defaults_true_20260419_025828_13_scan_layers_false/checkpoints/9.
I0419 04:03:43.156984 133489086924608 checkpoint_manager.py:1566] [process=3][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0419 04:03:43.157272 133364623333120 async_checkpointer.py:265] [process=3][thread=save_finalize] Waiting for background save thread=async_save.
I0419 04:03:43.157411 133489086924608 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260419_025828/linen_xpk_feat_nnx_set_defaults_true_20260419_025828_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776571382.2529635, 'wait_for_prev_duration_secs': 39.31947684288025, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776571421.5748665, 'checkpointer_blocking_duration_secs': 1.566509485244751, 'get_old_steps_start_time': 1776571423.1413975, 'get_old_steps_duration_secs': 2.8133392333984375e-05, 'checkpoint_manager_blocking_start_time': 1776571382.2510169, 'checkpoint_manager_blocking_duration_secs': 40.906357526779175}
I0419 04:03:43.157612 133489086924608 checkpointing.py:409] Started an asynchronous checkpoint save for step 9
I0419 04:03:43.157658 133489086924608 checkpoint_manager.py:2024] [process=3][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0419 04:03:49.584113 133368893122304 array_metadata_store.py:203] [process=3][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260419_025828/linen_xpk_feat_nnx_set_defaults_true_20260419_025828_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_3
I0419 04:04:25.217004 133365666113280 base_pytree_checkpoint_handler.py:129] [process=3] /jax/checkpoint/write/gbytes_per_sec: 36.788 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.93741011619568 s) (per-host)
I0419 04:04:25.217125 133365666113280 async_checkpointer.py:90] [process=3][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.075849s.
I0419 04:04:34.609208 133365666113280 async_checkpointer.py:144] [process=3][thread=async_save] Background save thread done. Time taken: 51.467916s.
I0419 04:04:34.609502 133364623333120 async_checkpointer.py:273] [process=3][thread=save_finalize] Done with waiting for background save thread=async_save.
I0419 04:04:34.609622 133364623333120 async_checkpointer.py:283] [process=3][thread=save_finalize] No errors found in background save thread=async_save.
I0419 04:04:34.609673 133364623333120 checkpoint_manager.py:2133] [process=3][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0419 04:04:34.612270 133364623333120 checkpoint_manager.py:2142] [process=3][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0419 04:04:34.612446 133489086924608 checkpoint_manager.py:2036] [process=3][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0419 04:04:34.612586 133489086924608 checkpoint_manager.py:2013] [process=3][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0419 04:04:34.613262 133489086924608 metric_logger.py:185] completed step: 9, seconds: 0.141, TFLOP/s/device: 96.591, Tokens/s/device: 14559.207, total_weights: 65536, loss: 8.188
Per train step:
 Total TFLOPs: 13.59
 split as 93.93% learnable weight flops and 6.07% attention flops
XPK End: Sun Apr 19 04:04:45 UTC 2026
EXIT_CODE=0