MaxView

Case: 13_scan_layers_false

Metrics: main (574ad3fb9) vs feat/nnx-post-train-fixes (574ad3fb9)

| Metric     | main (574ad3fb9) | feat/nnx-post-train-fixes (574ad3fb9) | Diff (feat/nnx-post-train-fixes − main) |
|------------|------------------|----------------------------------------|------------------------------------------|
| Parameters | 1.104 billion    | 1.104 billion                          |                                          |
| Final loss | 8.1880           | 8.1880                                 | 0                                        |
| TFLOP/s    | 99.838           | 99.347                                 | -0.491                                   |
| Tok/s      | 15048.6          | 14974.7                                | -73.943                                  |
| Avg s/step | 2.941            | 2.993                                  | +0.052                                   |
| Memory %   | 2.62             | 2.62                                   | 0                                        |
| JAX        | 0.9.2            | 0.9.2                                  |                                          |

Diff = branch value − main value; in the rendered view, green marks a branch improvement and red a regression. Here the branch reaches the same final loss but runs slightly slower (lower TFLOP/s and Tok/s, higher s/step).
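
As a quick sanity check, the Diff column can be reproduced from the two value columns. A minimal Python sketch (the dict below just re-keys the rounded table values; the dashboard evidently diffs unrounded metrics, which is why Tok/s shows -73.943 rather than -73.9):

```python
# Recompute Diff = branch value - main value from the rounded table values.
metrics = {
    # metric: (main, feat/nnx-post-train-fixes)
    "TFLOP/s": (99.838, 99.347),
    "Tok/s": (15048.6, 14974.7),
    "Avg s/step": (2.941, 2.993),
}
for name, (main_val, branch_val) in metrics.items():
    print(f"{name}: {branch_val - main_val:+.3f}")
# TFLOP/s: -0.491
# Tok/s: -73.900   (table: -73.943, diffed before rounding)
# Avg s/step: +0.052
```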

main  ·  574ad3fb9  ·  main_20260420_060028  ·  full log
XPK Start: Mon Apr 20 06:41:49 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
2026-04-20 06:42:14.133275: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0420 06:42:14.313203 138468681766720 max_utils.py:273] Attempting to initialize the jax distributed system...
I0420 06:42:23.354364 138468681766720 distributed.py:149] Starting JAX distributed service on [::]:8482
I0420 06:42:23.356749 138468681766720 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-kdvz0-slice-job-0-0.mt-13-scan-layers-false-kdvz0:8482
I0420 06:42:24.384992 138468681766720 max_utils.py:284] Jax distributed system initialized!
I0420 06:42:30.293484 138468681766720 max_utils.py:800] System Information: Jax Version: 0.9.2
I0420 06:42:30.293585 138468681766720 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0420 06:42:30.293625 138468681766720 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0420 06:42:30.293660 138468681766720 train_utils.py:348] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0420 06:42:30.992797 138468681766720 maxtext_utils.py:1551] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0420 06:42:30.993067 138468681766720 checkpointing.py:677] Setting up checkpoint logger...
I0420 06:42:30.993122 138468681766720 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0420 06:42:30.993166 138468681766720 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0420 06:42:30.993503 138468681766720 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7def1f91b4d0>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0420 06:42:33.853060 138468681766720 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0420 06:42:33.853291 138468681766720 checkpoint_manager.py:708] [process=1][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7ddb141fac30>}, handler_registry=None
I0420 06:42:33.853528 138468681766720 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7ddb141fac30>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0420 06:42:33.853577 138468681766720 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7ddb0c644e00>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0420 06:42:33.853613 138468681766720 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7ddb141fac30>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7ddb141fac30>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7ddb0c644e00>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7ddb0c644e00>}).
I0420 06:42:33.853946 138468681766720 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0420 06:42:33.854018 138468681766720 async_checkpointer.py:192] [process=1][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7ddb0c3cfd80> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0420 06:42:34.537691 138468681766720 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_main_20260420_060028/linen_xpk_main_20260420_060028_13_scan_layers_false/checkpoints
I0420 06:42:34.857793 138468681766720 checkpoint_manager.py:929] [process=1][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260420_060028/linen_xpk_main_20260420_060028_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7ddb0c644320>
I0420 06:42:34.857958 138468681766720 checkpointing.py:301] Checkpoint manager created!
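
For context on the manager configured above: a minimal Orbax sketch of async checkpointing with a fixed save interval, mirroring the options visible in the log (enable_async_checkpointing=True, FixedIntervalPolicy(interval=10), a GCS root). The state pytree and training loop are stand-ins, not MaxText's checkpointing.py:

```python
import numpy as np
import orbax.checkpoint as ocp

options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,           # log: save_decision_policy=FixedIntervalPolicy(interval=10)
    enable_async_checkpointing=True,  # log: enable_async_checkpointing=True
)
mngr = ocp.CheckpointManager("gs://.../checkpoints", options=options)  # path elided; see log

state = {"w": np.zeros(4)}  # stand-in for the real train-state pytree
for step in range(20):
    state = {"w": state["w"] + 1.0}  # stand-in for a train step
    # save() blocks only for the device-to-host copy (~0.3-1.1 s in this log);
    # the GCS commit (~30-50 s here) continues on the async_save background thread.
    mngr.save(step, args=ocp.args.StandardSave(state))
mngr.wait_until_finished()  # join background saves before exiting
```

The 16.8-second "Waiting for previous save to complete" warning later in this log is the visible cost of this design: a new save at step 9 had to wait for the step-0 background commit to finish.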
I0420 06:42:35.787881 138468681766720 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0420 06:42:35.787982 138468681766720 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0420 06:42:36.168169 138468681766720 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0420 06:42:36.168260 138468681766720 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0420 06:42:36.184138 138468681766720 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0420 06:42:36.184208 138468681766720 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0420 06:42:36.213417 138468681766720 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0420 06:42:36.213483 138468681766720 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0420 06:42:36.229807 138468681766720 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0420 06:42:36.229869 138468681766720 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0420 06:42:36.245826 138468681766720 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0420 06:42:36.245889 138468681766720 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0420 06:42:36.275402 138468681766720 attentions.py:1197] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0420 06:42:36.275475 138468681766720 attentions.py:1197] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0420 06:42:36.297620 138468681766720 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0420 06:42:36.297684 138468681766720 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0420 06:42:39.157586 138468681766720 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0420 06:42:39.157722 138468681766720 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0420 06:42:46.331203 138468681766720 maxtext_utils.py:1654]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.331333 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.331385 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.331443 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.331484 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.331520 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.331574 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.331626 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.331666 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.331703 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.331757 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.331792 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.331825 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.331855 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.331885 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.331917 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.331960 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.331995 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.332027 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.332059 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_10/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.332089 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_10/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.332120 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_10/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.332148 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_10/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.332177 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.332207 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_10/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.332238 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_10/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.332268 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_10/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.332298 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_10/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.332328 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_11/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.332358 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_11/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.332389 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_11/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.332418 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_11/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.332448 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.332478 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_11/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.332509 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_11/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.332539 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_11/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.332570 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_11/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.332602 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_12/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.332631 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_12/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.332660 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_12/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.332689 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_12/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.332749 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.332787 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_12/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.332818 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_12/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.332849 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_12/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.332879 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_12/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.332910 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_13/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.332938 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_13/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.332968 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_13/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.332997 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_13/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.333025 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.333055 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_13/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333085 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_13/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.333114 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_13/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333144 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_13/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333174 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_14/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.333204 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_14/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.333234 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_14/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.333262 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_14/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.333291 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.333321 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_14/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333351 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_14/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.333381 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_14/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333410 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_14/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333441 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_15/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.333471 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_15/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.333500 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_15/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.333528 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_15/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.333555 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.333584 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_15/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333613 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_15/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.333642 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_15/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333672 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_15/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333700 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_2/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.333747 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_2/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.333777 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_2/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.333805 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_2/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.333833 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.333864 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_2/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333896 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_2/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.333927 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_2/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333956 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_2/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.333986 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_3/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.334015 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_3/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.334044 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_3/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.334071 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_3/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.334099 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.334128 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_3/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.334158 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_3/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.334188 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_3/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.334219 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_3/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.334249 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_4/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.334278 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_4/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.334307 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_4/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.334336 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_4/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.334363 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.334392 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.334422 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.334451 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.334480 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.334510 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.334539 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.334568 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.334596 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.334623 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.334652 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.334681 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.334722 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.334759 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.334789 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.334818 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.334847 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.334874 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.334902 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.334931 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.334960 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.334989 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.335020 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.335048 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.335078 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.335108 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.335135 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.335162 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.335191 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.335219 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.335248 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.335278 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.335306 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.335335 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.335364 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.335392 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.335420 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.335448 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.335479 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.335508 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.335537 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.335566 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.335595 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 06:42:46.335624 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 06:42:46.335652 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.335679 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 06:42:46.335720 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.335756 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 06:42:46.335788 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.335817 138468681766720 maxtext_utils.py:1654]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 06:42:46.335866 138468681766720 maxtext_utils.py:1654]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0420 06:42:46.335909 138468681766720 maxtext_utils.py:1654]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
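
The Logical → Physical pairs above come from a fixed table of logical-axis rules applied over the device mesh (fsdp: 32). A minimal JAX sketch of that translation; the rules dict is inferred from this dump alone (an assumed subset, not MaxText's full logical_axis_rules):

```python
from jax.experimental import mesh_utils
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Mesh matching the log: all 32 devices on the single 'fsdp' axis.
devices = mesh_utils.create_device_mesh((32,))  # requires 32 devices, as in this run
mesh = Mesh(devices, axis_names=("fsdp",))

# Logical -> physical axis rules read off the dump: 'embed' and 'embed_vocab'
# shard over fsdp; every other logical axis is replicated (None).
rules = {"embed": "fsdp", "embed_vocab": "fsdp"}  # unlisted axes map to None

def to_physical(*logical_axes):
    """Translate logical axis names into a physical PartitionSpec."""
    return P(*(rules.get(a) for a in logical_axes))

spec = to_physical("embed", "mlp")    # PartitionSpec('fsdp', None), as logged for wi_0
sharding = NamedSharding(mesh, spec)  # usable with jax.device_put / jit shardings
print(to_physical("mlp", "embed"))    # PartitionSpec(None, 'fsdp'), as logged for wo
```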

I0420 06:42:50.879448 138468681766720 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0420 06:42:50.879543 138468681766720 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0420 06:42:50.894832 138468681766720 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0420 06:42:50.894889 138468681766720 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0420 06:43:47.505022 138468681766720 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0420 06:43:47.506234 138468681766720 metric_logger.py:301] number parameters: 1.104 billion
I0420 06:44:48.170660 138468681766720 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0420 06:44:48.410920 138468681766720 checkpointing.py:776] Waited 0.24024033546447754 seconds for step 0 to finish before starting checkpointing.
I0420 06:44:48.414790 138468681766720 checkpoint_manager.py:2009] [process=1][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0420 06:44:48.416608 138468681766720 checkpoint_manager.py:1512] [process=1] Saving checkpoint at step 0
I0420 06:44:48.417960 138468681766720 event_tracking.py:70] [process=1] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260420_060028/linen_xpk_main_20260420_060028_13_scan_layers_false/checkpoints/0.
I0420 06:44:49.180703 138468681766720 signaling_client.py:364] Using JaxDistributedSignalingClient
I0420 06:44:49.181722 138468681766720 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0420 06:44:49.181845 138468681766720 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0420 06:44:49.483228 138468681766720 base_pytree_checkpoint_handler.py:154] [process=1][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.303036s
I0420 06:44:49.483399 138468681766720 base_pytree_checkpoint_handler.py:130] [process=1] /jax/orbax/write/blocking_gbytes_per_sec: 4.571 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.33750343322753906 s) (per-host)
I0420 06:44:49.483450 138468681766720 base_pytree_checkpoint_handler.py:768] [process=1][thread=MainThread] Initiated Pytree async_save. Time taken: 0.337564s (batch_requests_ready=0.018203s, total_serialization_initiated=0.319292s, others=0.000070s)
I0420 06:44:49.483732 138468681766720 composite_checkpoint_handler.py:715] [process=1][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.341804s (all_items=0.000018s, per_item={'items': '0.00001836'}, temp_paths=0.341786)
I0420 06:44:49.484615 138468681766720 event_tracking.py:125] [process=1] [async] Finished blocking save in 1.07 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260420_060028/linen_xpk_main_20260420_060028_13_scan_layers_false/checkpoints/0.
I0420 06:44:49.484981 138342181537536 async_checkpointer.py:76] [process=1][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-20 07:04:49.484942
I0420 06:44:49.526901 138468681766720 checkpoint_manager.py:1560] [process=1][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0420 06:44:49.527215 138341693515520 async_checkpointer.py:280] [process=1][thread=save_finalize] Waiting for background save thread=async_save.
I0420 06:44:49.527375 138468681766720 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260420_060028/linen_xpk_main_20260420_060028_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776667488.4147706, 'wait_for_prev_duration_secs': 5.984306335449219e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776667488.416645, 'checkpointer_blocking_duration_secs': 1.068490982055664, 'get_old_steps_start_time': 1776667489.485162, 'get_old_steps_duration_secs': 3.0517578125e-05, 'checkpoint_manager_blocking_start_time': 1776667488.4124584, 'checkpoint_manager_blocking_duration_secs': 1.1148765087127686}
I0420 06:44:49.527568 138468681766720 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0420 06:44:49.527620 138468681766720 max_utils.py:750] 
Memstats: After params initialized:
I0420 06:44:49.527671 138468681766720 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_2(process=1,(2,0,0,0))
I0420 06:44:49.527723 138468681766720 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_3(process=1,(3,0,0,0))
I0420 06:44:49.527753 138468681766720 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_6(process=1,(2,1,0,0))
I0420 06:44:49.527778 138468681766720 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_7(process=1,(3,1,0,0))
I0420 06:44:49.894740 138468681766720 metric_logger.py:196] completed step: 0, seconds: 60.664, TFLOP/s/device: 0.224, Tokens/s/device: 33.760, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0420 06:44:50.036060 138468681766720 metric_logger.py:196] completed step: 1, seconds: 1.715, TFLOP/s/device: 7.921, Tokens/s/device: 1193.962, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0420 06:44:50.453330 138468681766720 metric_logger.py:196] completed step: 2, seconds: 0.015, TFLOP/s/device: 932.734, Tokens/s/device: 140591.748, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28792.170
I0420 06:44:50.591086 138468681766720 metric_logger.py:196] completed step: 3, seconds: 0.417, TFLOP/s/device: 32.589, Tokens/s/device: 4912.190, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 17000.162
I0420 06:44:50.870464 138468681766720 metric_logger.py:196] completed step: 4, seconds: 0.143, TFLOP/s/device: 94.927, Tokens/s/device: 14308.471, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10778.646
I0420 06:44:50.882442 138468681766720 metric_logger.py:196] completed step: 5, seconds: 0.138, TFLOP/s/device: 98.375, Tokens/s/device: 14828.115, total_weights: 65536, loss: 8.900, lm_loss: 8.900, perplexity: 7335.217
I0420 06:44:52.590640    2809 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0420 06:44:54.700959 138345405863680 array_metadata_store.py:203] [process=1][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260420_060028/linen_xpk_main_20260420_060028_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_1
I0420 06:45:14.499557 138468681766720 metric_logger.py:196] completed step: 6, seconds: 0.280, TFLOP/s/device: 48.528, Tokens/s/device: 7314.730, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.142
I0420 06:45:14.635888 138468681766720 metric_logger.py:196] completed step: 7, seconds: 23.485, TFLOP/s/device: 0.579, Tokens/s/device: 87.203, total_weights: 65536, loss: 8.393, lm_loss: 8.393, perplexity: 4417.804
I0420 06:45:14.771932 138468681766720 metric_logger.py:196] completed step: 8, seconds: 0.143, TFLOP/s/device: 95.221, Tokens/s/device: 14352.793, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.722
I0420 06:45:14.907588 138468681766720 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0420 06:45:14.910975 138468681766720 checkpointing.py:776] Waited 0.0034062862396240234 seconds for step 9 to finish before starting checkpointing.
I0420 06:45:14.913746 138468681766720 checkpoint_manager.py:2020] [process=1][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0420 06:45:22.696106 138342181537536 base_pytree_checkpoint_handler.py:130] [process=1] /jax/orbax/write/gbytes_per_sec: 47.082 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 33.5501754283905 s) (per-host)
I0420 06:45:22.696230 138342181537536 async_checkpointer.py:90] [process=1][thread=async_save] 3 Handler Commit operations completed. Time taken: 33.211138s.
I0420 06:45:31.695575 138342181537536 async_checkpointer.py:160] [process=1][thread=async_save] Background save thread done. Time taken: 42.210466s.
I0420 06:45:31.695917 138341693515520 async_checkpointer.py:288] [process=1][thread=save_finalize] Done with waiting for background save thread=async_save.
I0420 06:45:31.696043 138341693515520 async_checkpointer.py:298] [process=1][thread=save_finalize] No errors found in background save thread=async_save.
I0420 06:45:31.696094 138341693515520 checkpoint_manager.py:2137] [process=1][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0420 06:45:31.697567 138341693515520 checkpoint_manager.py:2146] [process=1][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0420 06:45:31.697777 138468681766720 checkpoint_manager.py:2032] [process=1][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0420 06:45:31.697931 138468681766720 checkpoint_manager.py:1452] Waiting for previous save to complete took 16.784194 seconds. If this number is high, consider checkpointing less frequently.
I0420 06:45:31.700606 138468681766720 checkpoint_manager.py:1512] [process=1] Saving checkpoint at step 9
I0420 06:45:31.702599 138468681766720 event_tracking.py:70] [process=1] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260420_060028/linen_xpk_main_20260420_060028_13_scan_layers_false/checkpoints/9.
I0420 06:45:32.013494 138468681766720 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0420 06:45:32.013667 138468681766720 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0420 06:45:32.145973 138468681766720 base_pytree_checkpoint_handler.py:154] [process=1][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.133805s
I0420 06:45:32.146134 138468681766720 base_pytree_checkpoint_handler.py:130] [process=1] /jax/orbax/write/blocking_gbytes_per_sec: 9.226 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.16719698905944824 s) (per-host)
I0420 06:45:32.146183 138468681766720 base_pytree_checkpoint_handler.py:768] [process=1][thread=MainThread] Initiated Pytree async_save. Time taken: 0.167255s (batch_requests_ready=0.017935s, total_serialization_initiated=0.149254s, others=0.000066s)
I0420 06:45:32.146477 138468681766720 composite_checkpoint_handler.py:715] [process=1][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.171475s (all_items=0.000016s, per_item={'items': '0.00001597'}, temp_paths=0.171459)
I0420 06:45:32.147259 138468681766720 event_tracking.py:125] [process=1] [async] Finished blocking save in 0.45 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260420_060028/linen_xpk_main_20260420_060028_13_scan_layers_false/checkpoints/9.
I0420 06:45:32.149056 138341693515520 async_checkpointer.py:76] [process=1][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-20 07:05:32.149019
I0420 06:45:32.151212 138468681766720 checkpoint_manager.py:1560] [process=1][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0420 06:45:32.151453 138341701908224 async_checkpointer.py:280] [process=1][thread=save_finalize] Waiting for background save thread=async_save.
I0420 06:45:32.151587 138468681766720 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260420_060028/linen_xpk_main_20260420_060028_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776667514.913701, 'wait_for_prev_duration_secs': 16.784193754196167, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776667531.7006464, 'checkpointer_blocking_duration_secs': 0.44855213165283203, 'get_old_steps_start_time': 1776667532.1492233, 'get_old_steps_duration_secs': 3.2901763916015625e-05, 'checkpoint_manager_blocking_start_time': 1776667514.911834, 'checkpoint_manager_blocking_duration_secs': 17.239718914031982}
I0420 06:45:32.151862 138468681766720 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0420 06:45:32.151911 138468681766720 checkpoint_manager.py:2020] [process=1][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0420 06:45:37.178866 138345405863680 array_metadata_store.py:203] [process=1][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260420_060028/linen_xpk_main_20260420_060028_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_1
I0420 06:46:13.947937 138341693515520 base_pytree_checkpoint_handler.py:130] [process=1] /jax/orbax/write/gbytes_per_sec: 37.637 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 41.96895718574524 s) (per-host)
I0420 06:46:13.948061 138341693515520 async_checkpointer.py:90] [process=1][thread=async_save] 3 Handler Commit operations completed. Time taken: 41.798891s.
I0420 06:46:23.371451 138341693515520 async_checkpointer.py:160] [process=1][thread=async_save] Background save thread done. Time taken: 51.222266s.
I0420 06:46:23.371801 138341701908224 async_checkpointer.py:288] [process=1][thread=save_finalize] Done with waiting for background save thread=async_save.
I0420 06:46:23.371925 138341701908224 async_checkpointer.py:298] [process=1][thread=save_finalize] No errors found in background save thread=async_save.
I0420 06:46:23.371975 138341701908224 checkpoint_manager.py:2137] [process=1][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0420 06:46:23.373721 138341701908224 checkpoint_manager.py:2146] [process=1][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0420 06:46:23.373915 138468681766720 checkpoint_manager.py:2032] [process=1][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0420 06:46:23.374068 138468681766720 checkpoint_manager.py:2009] [process=1][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0420 06:46:23.375086 138468681766720 metric_logger.py:196] completed step: 9, seconds: 0.136, TFLOP/s/device: 99.838, Tokens/s/device: 15048.644, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3599.216
Per train step:
  Total TFLOPs: 13.59
  split as 93.93% learnable weight flops and 6.07% attention flops
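Note: this summary squares with the step-9 metric line above. Dividing the per-step, per-device TFLOPs by the measured step time reproduces the reported TFLOP/s/device, and the same arithmetic on total_weights gives the tokens/s figure. A quick consistency check, with the numbers copied from the log (small differences come from the step time being rounded to 0.136 s):

```python
# Quick consistency check using values copied from the log above.
tflops_per_step = 13.59    # "Total TFLOPs" per device per train step
step_seconds = 0.136       # "completed step: 9, seconds: 0.136"
tokens_per_step = 65536    # total_weights (32 global batch x 2048 seq len)
num_devices = 32           # "Num_devices: 32"

print(tflops_per_step / step_seconds)                # ~99.9  TFLOP/s/device
print(tokens_per_step / num_devices / step_seconds)  # ~15059 tokens/s/device
```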
XPK End: Mon Apr 20 06:46:35 UTC 2026
EXIT_CODE=0
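Note: the checkpoint lines in the run above trace Orbax's async-save lifecycle. Only the device-to-host transfer blocks the train loop (blocking_gbytes_per_sec ~9.2 GiB/s, ~0.45 s), while the GCS commit runs on a background thread at ~37.6 MiB/s (~42 s), and a save_finalize thread joins it afterwards. Below is a minimal sketch of that usage pattern, not MaxText's checkpointing.py; the path and the toy state are placeholders, and only the plain orbax.checkpoint API that produced the CheckpointManager messages above is used:

```python
import orbax.checkpoint as ocp

# Toy stand-in for the real train-state pytree; the path is a placeholder.
state = {"step": 0, "w": [0.0, 1.0, 2.0, 3.0]}

options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,           # fixed-interval policy, every 10 steps
    enable_async_checkpointing=True,  # save() returns after the blocking D2H part
)
with ocp.CheckpointManager("/tmp/ckpts", options=options) as mngr:
    for step in range(20):
        state["step"] = step                                # pretend training
        mngr.save(step, args=ocp.args.StandardSave(state))  # commit continues in background
    mngr.wait_until_finished()  # join the background save, like save_finalize above
```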
XPK Start: Mon Apr 20 16:36:36 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
2026-04-20 16:37:00.309762: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0420 16:37:00.487686 137747061356352 max_utils.py:273] Attempting to initialize the jax distributed system...
I0420 16:37:09.528883 137747061356352 distributed.py:149] Starting JAX distributed service on [::]:8482
I0420 16:37:09.531217 137747061356352 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-kdago-slice-job-0-0.mt-13-scan-layers-false-kdago:8482
I0420 16:37:10.946591 137747061356352 max_utils.py:284] Jax distributed system initialized!
I0420 16:37:16.956937 137747061356352 max_utils.py:800] System Information: Jax Version: 0.9.2
I0420 16:37:16.957034 137747061356352 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0420 16:37:16.957076 137747061356352 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0420 16:37:16.957132 137747061356352 train_utils.py:378] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0420 16:37:18.013408 137747061356352 maxtext_utils.py:1718] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0420 16:37:18.013982 137747061356352 maxtext_utils.py:1718] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0420 16:37:18.014174 137747061356352 checkpointing.py:688] Setting up checkpoint logger...
I0420 16:37:18.014227 137747061356352 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0420 16:37:18.014272 137747061356352 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0420 16:37:18.014585 137747061356352 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7d4705150110>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0420 16:37:20.637231 137747061356352 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0420 16:37:20.637465 137747061356352 checkpoint_manager.py:708] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d45f92def30>}, handler_registry=None
I0420 16:37:20.637704 137747061356352 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d45f92def30>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0420 16:37:20.637752 137747061356352 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d45f9247710>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0420 16:37:20.637786 137747061356352 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d45f92def30>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d45f92def30>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d45f9247710>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d45f9247710>}).
I0420 16:37:20.638109 137747061356352 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0420 16:37:20.638179 137747061356352 async_checkpointer.py:192] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7d32d45dec00> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0420 16:37:21.988908 137747061356352 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints
I0420 16:37:21.991227 137747061356352 checkpoint_manager.py:929] [process=0][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7d45f92dcef0>
I0420 16:37:21.991335 137747061356352 checkpointing.py:302] Checkpoint manager created!
I0420 16:37:22.927017 137747061356352 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0420 16:37:22.927135 137747061356352 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0420 16:37:23.310495 137747061356352 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0420 16:37:23.310583 137747061356352 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0420 16:37:23.326418 137747061356352 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0420 16:37:23.326478 137747061356352 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0420 16:37:23.356000 137747061356352 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0420 16:37:23.356067 137747061356352 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0420 16:37:23.372209 137747061356352 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0420 16:37:23.372271 137747061356352 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0420 16:37:23.388178 137747061356352 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0420 16:37:23.388236 137747061356352 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0420 16:37:23.417472 137747061356352 attentions.py:1197] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0420 16:37:23.417543 137747061356352 attentions.py:1197] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0420 16:37:23.439587 137747061356352 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0420 16:37:23.439650 137747061356352 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0420 16:37:26.333096 137747061356352 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0420 16:37:26.333215 137747061356352 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0420 16:37:33.585852 137747061356352 maxtext_utils.py:1827]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.585978 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.586030 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.586101 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.586145 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.586182 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.586233 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.586284 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.586323 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.586357 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.586391 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.586425 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.586460 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.586492 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.586523 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.586556 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.586591 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.586628 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.586660 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.586690 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_10/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.586720 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_10/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.586750 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_10/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.586778 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_10/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.586806 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.586836 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_10/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.586866 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_10/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.586896 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_10/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.586926 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_10/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.586957 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_11/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.586989 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_11/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.587020 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_11/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.587049 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_11/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.587089 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.587125 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_11/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.587157 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_11/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.587188 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_11/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.587217 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_11/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.587247 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_12/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.587275 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_12/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.587305 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_12/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.587334 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_12/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.587362 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.587393 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_12/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.587425 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_12/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.587457 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_12/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.587487 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_12/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.587516 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_13/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.587544 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_13/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.587574 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_13/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.587601 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_13/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.587638 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.587668 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_13/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.587698 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_13/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.587728 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_13/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.587757 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_13/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.587786 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_14/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.587815 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_14/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.587845 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_14/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.587873 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_14/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.587900 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.587930 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_14/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.587960 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_14/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.587990 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_14/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.588020 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_14/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.588050 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_15/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.588105 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_15/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.588143 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_15/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.588172 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_15/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.588200 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.588230 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_15/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.588259 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_15/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.588289 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_15/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.588320 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_15/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.588350 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_2/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.588380 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_2/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.588409 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_2/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.588437 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_2/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.588464 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.588494 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_2/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.588525 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_2/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.588555 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_2/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.588586 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_2/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.588619 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_3/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.588649 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_3/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.588678 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_3/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.588705 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_3/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.588732 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.588761 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_3/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.588790 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_3/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.588820 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_3/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.588849 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_3/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.588878 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_4/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.588907 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_4/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.588936 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_4/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.588964 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_4/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.588991 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.589020 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589049 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.589116 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589159 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589192 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.589223 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.589254 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.589282 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.589310 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.589339 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589368 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.589396 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589426 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589454 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.589483 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.589511 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.589539 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.589566 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.589596 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589631 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.589660 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589688 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589716 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.589745 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.589773 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.589800 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.589827 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.589855 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589884 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.589913 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589941 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.589969 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.589998 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.590025 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.590052 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.590088 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.590123 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.590151 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.590181 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.590210 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.590240 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.590268 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0420 16:37:33.590296 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0420 16:37:33.590323 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.590350 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 16:37:33.590378 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.590407 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 16:37:33.590435 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.590464 137747061356352 maxtext_utils.py:1827]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0420 16:37:33.590510 137747061356352 maxtext_utils.py:1827]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0420 16:37:33.590553 137747061356352 maxtext_utils.py:1827]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
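Note: every entry in the dump above is the same translation. The logical PartitionSpec names ('embed', 'mlp', 'heads', 'norm', ...) are resolved through sharding rules onto the one-axis device mesh; in this config only embedding-like axes land on 'fsdp' and everything else is replicated. A minimal sketch of that resolution follows; RULES here is an illustrative subset inferred from the dump, not MaxText's full logical_axis_rules table:

```python
import numpy as np
import jax
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Illustrative subset of the logical->physical rules implied by the dump.
RULES = {"embed": "fsdp", "embed_vocab": "fsdp"}

def logical_to_physical(logical_axes):
    """Map logical axis names to mesh axes; unmatched names stay unsharded."""
    return P(*(RULES.get(name) for name in logical_axes))

# One-dimensional 'fsdp' mesh over all devices (32 in the run above).
mesh = Mesh(np.asarray(jax.devices()), axis_names=("fsdp",))
spec = logical_to_physical(("embed", "mlp"))  # -> PartitionSpec('fsdp', None)
sharding = NamedSharding(mesh, spec)          # placement for a [2048, 7168] wi_0 kernel
print(spec)
```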

I0420 16:37:38.216310 137747061356352 train.py:157] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0420 16:37:38.216409 137747061356352 train.py:157] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0420 16:37:38.231681 137747061356352 train.py:164] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0420 16:37:38.231740 137747061356352 train.py:164] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0420 16:38:34.918634 137747061356352 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0420 16:38:34.926962 137747061356352 metric_logger.py:301] number parameters: 1.104 billion
I0420 16:39:37.193990 137747061356352 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0420 16:39:37.434150 137747061356352 checkpointing.py:798] Waited 0.24014067649841309 seconds for step 0 to finish before starting checkpointing.
I0420 16:39:37.437605 137747061356352 checkpoint_manager.py:2009] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0420 16:39:37.439602 137747061356352 checkpoint_manager.py:1512] [process=0] Saving checkpoint at step 0
I0420 16:39:37.441328 137747061356352 event_tracking.py:70] [process=0] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0.
I0420 16:39:38.209967 137747061356352 signaling_client.py:364] Using JaxDistributedSignalingClient
I0420 16:39:38.257089 137747061356352 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0420 16:39:38.257233 137747061356352 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0420 16:39:38.497088 137618595505920 atomicity.py:140] Creating tmp directory gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0
I0420 16:39:38.570265 137747061356352 base_pytree_checkpoint_handler.py:154] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.314501s
I0420 16:39:38.571164 137747061356352 base_pytree_checkpoint_handler.py:130] [process=0] /jax/orbax/write/blocking_gbytes_per_sec: 4.376 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.35251522064208984 s) (per-host)
I0420 16:39:38.571241 137747061356352 base_pytree_checkpoint_handler.py:768] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.352612s (batch_requests_ready=0.018075s, total_serialization_initiated=0.333708s, others=0.000829s)
I0420 16:39:38.571397 137747061356352 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.356673s (all_items=0.000016s, per_item={'items': '0.00001621'}, temp_paths=0.356657)
I0420 16:39:38.572280 137747061356352 event_tracking.py:125] [process=0] [async] Finished blocking save in 1.13 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0.
I0420 16:39:38.572735 137618603898624 async_checkpointer.py:76] [process=0][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-20 16:59:38.572696
I0420 16:39:38.595466 137747061356352 checkpoint_manager.py:1560] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0420 16:39:38.595748 137618552493824 async_checkpointer.py:280] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0420 16:39:38.595918 137747061356352 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776703177.4375865, 'wait_for_prev_duration_secs': 6.079673767089844e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776703177.4396846, 'checkpointer_blocking_duration_secs': 1.1331977844238281, 'get_old_steps_start_time': 1776703178.572912, 'get_old_steps_duration_secs': 2.9325485229492188e-05, 'checkpoint_manager_blocking_start_time': 1776703177.435787, 'checkpoint_manager_blocking_duration_secs': 1.160088062286377}
I0420 16:39:38.596127 137747061356352 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
I0420 16:39:38.596179 137747061356352 max_utils.py:750] 
Memstats: After params initialized:
I0420 16:39:38.596230 137747061356352 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_0(process=0,(0,0,0,0))
I0420 16:39:38.596263 137747061356352 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_1(process=0,(1,0,0,0))
I0420 16:39:38.596291 137747061356352 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_4(process=0,(0,1,0,0))
I0420 16:39:38.596316 137747061356352 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_5(process=0,(1,1,0,0))
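Note: the 2.624% in the Memstats block is what the summary table's Memory % row reports, i.e. 0.82 GB in use out of 31.25 GB of HBM per chip. A hedged sketch of reading the same figure directly from JAX; Device.memory_stats() is a real method on TPU/GPU backends, but the key names used below are backend-dependent assumptions:

```python
import jax

# Per-device HBM usage, formatted like the Memstats lines above.
# memory_stats() may return None (e.g. on CPU); 'bytes_in_use' and
# 'bytes_limit' are assumed key names that can vary by backend.
for d in jax.local_devices():
    stats = d.memory_stats()
    if stats and "bytes_in_use" in stats and "bytes_limit" in stats:
        used = stats["bytes_in_use"] / 1024**3
        limit = stats["bytes_limit"] / 1024**3
        print(f"Using (GB) {used:.2f} / {limit:.2f} ({100 * used / limit:.6f}%) on {d}")
```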
I0420 16:39:38.977288 137747061356352 metric_logger.py:196] completed step: 0, seconds: 61.667, TFLOP/s/device: 0.220, Tokens/s/device: 33.210, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0420 16:39:38.980216 137747061356352 metric_logger.py:281] To see full metrics 'tensorboard --logdir=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/tensorboard/'
I0420 16:39:39.424847 137747061356352 metric_logger.py:196] completed step: 1, seconds: 1.773, TFLOP/s/device: 7.664, Tokens/s/device: 1155.240, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0420 16:39:39.556490 137747061356352 metric_logger.py:196] completed step: 2, seconds: 0.455, TFLOP/s/device: 29.884, Tokens/s/device: 4504.415, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28794.738
I0420 16:39:39.692589 137747061356352 metric_logger.py:196] completed step: 3, seconds: 0.013, TFLOP/s/device: 1057.776, Tokens/s/device: 159439.471, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 16999.881
I0420 16:39:40.494576 137618067027712 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776703180071388905, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776703180071388905, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0420 16:39:40.767593 137618578720512 atomicity.py:140] Creating tmp directory gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0/items
I0420 16:39:42.172197    2870 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0420 16:39:44.655903 137622318978816 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_0
I0420 16:40:03.835075 137747061356352 metric_logger.py:196] completed step: 4, seconds: 0.133, TFLOP/s/device: 102.064, Tokens/s/device: 15384.153, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10778.902
I0420 16:40:03.850119 137747061356352 metric_logger.py:196] completed step: 5, seconds: 0.136, TFLOP/s/device: 99.875, Tokens/s/device: 15054.285, total_weights: 65536, loss: 8.900, lm_loss: 8.900, perplexity: 7335.425
I0420 16:40:03.980382 137747061356352 metric_logger.py:196] completed step: 6, seconds: 24.143, TFLOP/s/device: 0.563, Tokens/s/device: 84.828, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.912
I0420 16:40:04.116752 137747061356352 metric_logger.py:196] completed step: 7, seconds: 0.012, TFLOP/s/device: 1143.024, Tokens/s/device: 172289.055, total_weights: 65536, loss: 8.393, lm_loss: 8.393, perplexity: 4418.049
I0420 16:40:04.252856 137747061356352 metric_logger.py:196] completed step: 8, seconds: 0.132, TFLOP/s/device: 103.062, Tokens/s/device: 15534.688, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.437
I0420 16:40:04.388410 137747061356352 checkpointing.py:794] Waiting for step 9 to finish before checkpoint...
I0420 16:40:04.391722 137747061356352 checkpointing.py:798] Waited 0.00333404541015625 seconds for step 9 to finish before starting checkpointing.
I0420 16:40:04.395775 137747061356352 checkpoint_manager.py:2020] [process=0][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0420 16:40:16.734821 137618560886528 base_pytree_checkpoint_handler.py:1282] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 34.594613s (commit=32.914058s, array_metadata_write=1.680554s)
I0420 16:40:16.736640 137618603898624 base_pytree_checkpoint_handler.py:130] [process=0] /jax/orbax/write/gbytes_per_sec: 41.009 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 38.51794481277466 s) (per-host)
I0420 16:40:16.736758 137618603898624 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 38.163908s.
I0420 16:40:17.514000 137618603898624 checkpoint.py:228] Read Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776703180071388905, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} from gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0420 16:40:18.345644 137618067027712 checkpoint.py:247] Updated Metadata={'item_handlers': {'items': 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler'}, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776703180071388905, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0420 16:40:18.641616 137618603898624 ocdbt_utils.py:49] Param validation support for Zarr3 will be added later (b/362328389).
I0420 16:40:18.760635 137618603898624 array_metadata_store.py:411] [process=0][thread=async_save] Validated ArrayMetadata from all 8 hosts. Time taken: 0.002184s.
I0420 16:40:20.099327 137618603898624 base_pytree_checkpoint_handler.py:1406] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 2.455696s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0/items
I0420 16:40:20.100896 137618603898624 atomicity.py:666] Finalizing gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0/items
I0420 16:40:20.669412 137618603898624 atomicity.py:666] Finalizing gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0
I0420 16:40:22.097658 137618603898624 atomicity.py:847] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0`.
I0420 16:40:22.098567 137618603898624 event_tracking.py:138] [process=0] [async] Finished save (blocking + background) in 44.66 seconds @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/0
I0420 16:40:22.100112 137618603898624 async_checkpointer.py:160] [process=0][thread=async_save] Background save thread done. Time taken: 43.527261s.
I0420 16:40:22.100269 137618552493824 async_checkpointer.py:288] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0420 16:40:22.100365 137618552493824 async_checkpointer.py:298] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0420 16:40:22.100428 137618552493824 checkpoint_manager.py:2137] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0420 16:40:22.102585 137618552493824 checkpoint_manager.py:2146] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0420 16:40:22.102788 137747061356352 checkpoint_manager.py:2032] [process=0][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0420 16:40:22.102931 137747061356352 checkpoint_manager.py:1452] Waiting for previous save to complete took 17.707156 seconds. If this number is high, consider checkpointing less frequently.
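This warning is Orbax's standard advice when a new save has to wait on the previous asynchronous one (17.7 s here, because step 0's background commit was still running). A minimal sketch of how the save cadence is usually widened through orbax.checkpoint's CheckpointManagerOptions; the bucket path, toy state, and step count are hypothetical stand-ins:

```python
import numpy as np
import orbax.checkpoint as ocp

# Sketch: widen the save cadence so a new save rarely blocks on the
# previous async one. Bucket path and the toy state are hypothetical.
options = ocp.CheckpointManagerOptions(
    save_interval_steps=1000,  # save every 1000 steps instead of every few
    max_to_keep=3,             # retain only the three most recent saves
)
mngr = ocp.CheckpointManager("gs://my-bucket/ckpts", options=options)

state = {"step": np.asarray(0)}        # stand-in for a real train-state pytree
for step in range(10_000):
    state["step"] = np.asarray(step)   # stand-in for an actual train step
    mngr.save(step, args=ocp.args.StandardSave(state))  # no-op off-interval
mngr.wait_until_finished()             # block until the last save commits
```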
I0420 16:40:22.105574 137747061356352 checkpoint_manager.py:1512] [process=0] Saving checkpoint at step 9
I0420 16:40:22.107633 137747061356352 event_tracking.py:70] [process=0] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/9.
I0420 16:40:22.436757 137747061356352 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0420 16:40:22.436920 137747061356352 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
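The use_replica_parallel option mentioned above refers to Orbax splitting the device-to-host copy of replicated arrays across replicas, so each of R replicas transfers only 1/R of the data. A conceptual illustration only, in plain NumPy, not Orbax's internal API:

```python
import numpy as np

# Conceptual sketch of replica-parallel D2H: instead of one replica copying
# the whole replicated array, each of R replicas copies a 1/R slice.
replicated = np.arange(32, dtype=np.float32).reshape(8, 4)  # toy parameter
n_replicas = 4
for rank, chunk in enumerate(np.array_split(replicated, n_replicas, axis=0)):
    print(f"replica {rank} transfers a {chunk.shape} slice")
# Each replica moves a 2x4 slice of the 8x4 array, so in the ideal case
# the per-host copy time drops by a factor of n_replicas.
```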
I0420 16:40:22.559861 137747061356352 base_pytree_checkpoint_handler.py:154] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.124372s
I0420 16:40:22.560655 137747061356352 base_pytree_checkpoint_handler.py:130] [process=0] /jax/orbax/write/blocking_gbytes_per_sec: 9.557 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.16140484809875488 s) (per-host)
I0420 16:40:22.560723 137747061356352 base_pytree_checkpoint_handler.py:768] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.161495s (batch_requests_ready=0.017512s, total_serialization_initiated=0.143272s, others=0.000711s)
I0420 16:40:22.560846 137747061356352 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.165719s (all_items=0.000016s, per_item={'items': '0.00001645'}, temp_paths=0.165702)
I0420 16:40:22.561620 137747061356352 event_tracking.py:125] [process=0] [async] Finished blocking save in 0.46 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/9.
I0420 16:40:22.561929 137618552493824 async_checkpointer.py:76] [process=0][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-20 17:00:22.561897
I0420 16:40:22.564151 137747061356352 checkpoint_manager.py:1560] [process=0][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0420 16:40:22.564400 137618041849600 async_checkpointer.py:280] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0420 16:40:22.564561 137747061356352 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776703204.395742, 'wait_for_prev_duration_secs': 17.70715594291687, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776703222.1056142, 'checkpointer_blocking_duration_secs': 0.4564528465270996, 'get_old_steps_start_time': 1776703222.5621064, 'get_old_steps_duration_secs': 2.9802322387695312e-05, 'checkpoint_manager_blocking_start_time': 1776703204.3925583, 'checkpoint_manager_blocking_duration_secs': 18.171971321105957}
I0420 16:40:22.564749 137747061356352 checkpointing.py:409] Started an asynchronous checkpoint save for step 9
I0420 16:40:22.564797 137747061356352 checkpoint_manager.py:2020] [process=0][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
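Worth noting in the save just initiated: only the device-to-host copy blocks the training loop (0.46 s at a logged 9.557 GiB/s for step 9), while the commit to GCS runs in a background thread for tens of seconds (41.009 MiB/s for the step-0 save above). A quick recomputation from the logged figures; the logged "1.5 GiB" totals are rounded, so the derived rates land slightly low:

```python
# Recompute the two write-phase rates from the figures logged above.
# The "1.5 GiB" totals are rounded in the log, so these land slightly low.
blocking_gib, blocking_s = 1.5, 0.1614      # step-9 D2H copy (blocks training)
background_gib, background_s = 1.5, 38.518  # step-0 GCS commit (background)

print(f"blocking:   {blocking_gib / blocking_s:.2f} GiB/s  (logged 9.557 GiB/s)")
print(f"background: {background_gib * 1024 / background_s:.1f} MiB/s (logged 41.009 MiB/s)")
# The train loop stalls ~0.46 s even though the full save takes ~45-52 s.
```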
I0420 16:40:22.653108 137618603898624 atomicity.py:140] Creating tmp directory gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/9
I0420 16:40:24.453148 137618578720512 atomicity.py:140] Creating tmp directory gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/9/items
I0420 16:40:27.583672 137618569279232 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_0
I0420 16:41:08.303265 137618560886528 base_pytree_checkpoint_handler.py:1282] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 42.705872s (commit=41.062258s, array_metadata_write=1.643614s)
I0420 16:41:08.305063 137618552493824 base_pytree_checkpoint_handler.py:130] [process=0] /jax/orbax/write/gbytes_per_sec: 34.410 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 45.90579557418823 s) (per-host)
I0420 16:41:08.305203 137618552493824 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 45.743160s.
I0420 16:41:10.196489 137618552493824 ocdbt_utils.py:49] Param validation support for Zarr3 will be added later (b/362328389).
I0420 16:41:10.295171 137618552493824 array_metadata_store.py:411] [process=0][thread=async_save] Validated ArrayMetadata from all 8 hosts. Time taken: 0.002174s.
I0420 16:41:11.656582 137618552493824 base_pytree_checkpoint_handler.py:1406] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 2.537121s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/9/items
I0420 16:41:11.658366 137618552493824 atomicity.py:666] Finalizing gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/9/items
I0420 16:41:12.240785 137618552493824 atomicity.py:666] Finalizing gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/9
I0420 16:41:14.108548 137618552493824 atomicity.py:847] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/9`.
I0420 16:41:14.109456 137618552493824 event_tracking.py:138] [process=0] [async] Finished save (blocking + background) in 52.00 seconds @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260420_153543/linen_xpk_feat_nnx_post_train_fixes_20260420_153543_13_scan_layers_false/checkpoints/9
I0420 16:41:14.111189 137618552493824 async_checkpointer.py:160] [process=0][thread=async_save] Background save thread done. Time taken: 51.549145s.
I0420 16:41:14.111401 137618041849600 async_checkpointer.py:288] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0420 16:41:14.111519 137618041849600 async_checkpointer.py:298] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0420 16:41:14.111588 137618041849600 checkpoint_manager.py:2137] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0420 16:41:14.113368 137618041849600 checkpoint_manager.py:2146] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0420 16:41:14.113546 137747061356352 checkpoint_manager.py:2032] [process=0][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0420 16:41:14.113694 137747061356352 checkpoint_manager.py:2009] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0420 16:41:14.114683 137747061356352 metric_logger.py:196] completed step: 9, seconds: 0.137, TFLOP/s/device: 99.347, Tokens/s/device: 14974.701, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3598.885
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
XPK End: Mon Apr 20 16:41:27 UTC 2026
EXIT_CODE=0
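The closing summary squares with the step-9 metric line: TFLOP/s/device is the per-step, per-device 13.59 TFLOPs divided by the step time, and Tok/s/device follows from total_weights spread over the devices. A quick check, assuming total_weights (65536) is the global token count over a 32-device slice, which the per-device figures imply; the residual gap comes from the step time being logged to three decimals:

```python
# Cross-check the closing summary against the step-9 metric line above.
# Assumes total_weights = 65536 is the global token count over 32 devices.
tflops_per_step_per_device = 13.59   # "Total TFLOPs" from the summary
step_seconds = 0.137                 # step-9 wall time (logged to 3 decimals)
tokens_global, n_devices = 65536, 32

print(f"TFLOP/s/device: {tflops_per_step_per_device / step_seconds:.1f}  (logged 99.347)")
print(f"Tok/s/device:   {tokens_global / n_devices / step_seconds:.0f}  (logged 14974.701)")
# Prints 99.2 TFLOP/s and 14949 tok/s -- consistent with the logged values.
```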