Log Summary

2026-04-23 02:01:26.344730: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1776909686.358063 2300539 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1776909686.361952 2300539 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1776909686.373473 2300539 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776909686.373488 2300539 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776909686.373490 2300539 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776909686.373491 2300539 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
Unrecognized keys in `rope_scaling` for 'rope_type'='yarn': {'rope_theta'}
`rope_scaling`'s factor field must be a float >= 1, got 40
`rope_scaling`'s beta_fast field must be a float, got 32
`rope_scaling`'s beta_slow field must be a float, got 1
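
The `rope_scaling` warnings above come from the model config's YaRN validation: `rope_theta` belongs at the top level of the config rather than inside `rope_scaling`, and `factor`, `beta_fast`, and `beta_slow` are expected to be floats. A minimal sketch of a config patch that would satisfy the validator (field values are taken from the warnings; the intended `rope_theta` value is an assumption):

    # Hypothetical config patch for the rope_scaling warnings (assumed values).
    # The validator wants float fields and no rope_theta key nested inside
    # rope_scaling; rope_theta is a top-level config field.
    config_patch = {
        "rope_theta": 10000.0,   # assumed default; moved out of rope_scaling
        "rope_scaling": {
            "rope_type": "yarn",
            "factor": 40.0,      # float, not int 40
            "beta_fast": 32.0,   # float, not int 32
            "beta_slow": 1.0,    # float, not int 1
        },
    }
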
2026-04-23 02:01:30.048109: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0423 02:01:30.183031 133867303257920 max_utils.py:238] Skipping jax distributed system due to skip_jax_distributed_system=True flag.
I0423 02:02:01.774880 133867303257920 max_utils.py:800] System Information: Jax Version: 0.8.1
I0423 02:02:01.775075 133867303257920 max_utils.py:801] System Information: Jaxlib Version: 0.8.1
I0423 02:02:01.775116 133867303257920 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Nov 12 2025 14:16:36 (1762985796) cl/831091709
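
The system-information lines can be reproduced with a few JAX calls; a minimal sketch (output is host-specific):

    import jax
    import jaxlib

    # Reproduces the version/backend lines logged above.
    print("Jax Version:", jax.__version__)        # 0.8.1 in this run
    print("Jaxlib Version:", jaxlib.__version__)  # 0.8.1 in this run
    print("Jax Backend:", jax.default_backend())  # 'tpu' for the PJRT TPU plugin
    print("Device kind:", jax.devices()[0].device_kind)
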
I0423 02:02:01.775148 133867303257920 train_utils.py:348] WARNING: 'dataset_path' might be pointing to your local file system
I0423 02:02:01.775170 133867303257920 train_utils.py:361] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to benefit from sequence packing.
I0423 02:02:01.775280 133867303257920 train.py:703] [DECOUPLED NO-OP] skipping cloud diagnostics wrapper.
I0423 02:02:02.430383 133867303257920 maxtext_utils.py:1604] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
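
The (1, 1, 1, 8, 1, ...) shape is MaxText's multi-axis device mesh with every axis trivial except the one of size 8, which the later `fsdp: 8` line identifies as the FSDP axis. A simplified sketch of the effective mesh (collapsing the trivial axes into a 1-D layout is an assumption, not MaxText's actual construction):

    import jax
    import numpy as np
    from jax.sharding import Mesh

    # Effective layout for this run: all 8 devices on a single 'fsdp' axis.
    devices = np.array(jax.devices()).reshape(8)
    mesh = Mesh(devices, axis_names=("fsdp",))
    print(mesh)  # Mesh('fsdp': 8)
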
I0423 02:02:02.430642 133867303257920 checkpointing.py:677] Setting up checkpoint logger...
I0423 02:02:02.430686 133867303257920 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0423 02:02:02.430721 133867303257920 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0423 02:02:02.431171 133867303257920 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x79c01f7a7350>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0423 02:02:04.852381 133867303257920 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0423 02:02:04.852955 133867303257920 checkpoint_manager.py:702] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79bec2d63170>}, handler_registry=None
I0423 02:02:04.853380 133867303257920 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79bec2d63170>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0423 02:02:04.853428 133867303257920 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x79bec236c920>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0423 02:02:04.853462 133867303257920 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79bec2d63170>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x79bec2d63170>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x79bec236c920>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x79bec236c920>}).
I0423 02:02:04.854016 133867303257920 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28
I0423 02:02:04.854119 133867303257920 async_checkpointer.py:177] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>.<lambda> at 0x79be547e8860> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0423 02:02:04.985792 133867303257920 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints
I0423 02:02:04.986065 133867303257920 checkpoint_manager.py:921] [process=0][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x79bec236e360>
I0423 02:02:04.986356 133867303257920 checkpointing.py:301] Checkpoint manager created!
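
The manager configuration logged above (OCDBT + Zarr3 storage, async writes, a fixed save interval) corresponds to a fairly standard Orbax setup; a minimal sketch with a placeholder path (not MaxText's actual wiring in checkpointing.py):

    import orbax.checkpoint as ocp

    # Placeholder directory; the run writes to the gs:// path shown above.
    options = ocp.CheckpointManagerOptions(
        save_interval_steps=10,           # fixed-interval policy, as in the log
        enable_async_checkpointing=True,  # async writes via a background thread
    )
    mngr = ocp.CheckpointManager("gs://<bucket>/<run>/checkpoints", options=options)
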
I0423 02:02:05.402097 133867303257920 nnx_wrappers.py:437] Unknown Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0423 02:02:05.402238 133867303257920 nnx_wrappers.py:437] Unknown Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0423 02:02:06.044930 133867303257920 attentions.py:1088] attentions/inputs_q Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0423 02:02:06.045060 133867303257920 attentions.py:1088] attentions/inputs_q Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0423 02:02:06.057454 133867303257920 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0423 02:02:06.057531 133867303257920 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0423 02:02:06.078786 133867303257920 attentions.py:1154] attentions/query Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0423 02:02:06.078871 133867303257920 attentions.py:1154] attentions/query Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0423 02:02:06.091385 133867303257920 attentions.py:1155] attentions/key Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0423 02:02:06.091466 133867303257920 attentions.py:1155] attentions/key Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0423 02:02:06.103853 133867303257920 attentions.py:1156] attentions/value Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0423 02:02:06.103929 133867303257920 attentions.py:1156] attentions/value Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0423 02:02:06.124775 133867303257920 attentions.py:1198] attentions/out Logical: bfloat16[8,2048,16,128]..................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0423 02:02:06.124863 133867303257920 attentions.py:1198] attentions/out Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0423 02:02:06.143637 133867303257920 linears.py:525] linears/x Logical: bfloat16[8,2048,7168]....................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0423 02:02:06.143726 133867303257920 linears.py:525] linears/x Physical: bfloat16[8,2048,7168]....................................... ('fsdp', None, None).
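
The paired Logical/Physical lines above are produced by looking each logical activation axis up in the run's logical-to-mesh axis rules; axes without a rule stay unsharded. A toy re-implementation of that lookup (MaxText actually uses Flax's logical axis rules; the rule table here is inferred from the log and is partial):

    from jax.sharding import PartitionSpec

    # Inferred rule table: only batch-like axes map to a mesh axis here.
    AXIS_RULES = {"activation_batch": "fsdp", "activation_kv_batch": "fsdp"}

    def logical_to_physical(logical_axes):
        # Unmapped logical names shard to None (replicated on that dim).
        return PartitionSpec(*(AXIS_RULES.get(name) for name in logical_axes))

    print(logical_to_physical(
        ("activation_batch", "activation_attn_length", "activation_attn_embed")))
    # PartitionSpec('fsdp', None, None)
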
I0423 02:02:08.568816 133867303257920 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0423 02:02:08.568936 133867303257920 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
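
The restore-or-initialize decision logged here follows the usual Orbax pattern; a sketch continuing the manager example above (`init_train_state` is a hypothetical initializer, not a MaxText function):

    # Continue from the CheckpointManager sketch above.
    latest = mngr.latest_step()
    if latest is None:
        state = init_train_state()    # hypothetical: no checkpoint found, start fresh
    else:
        state = mngr.restore(latest)  # resume from the newest committed step
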
[DECOUPLED NO-OP] gcs_storage: using stubs.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] workload_monitor: using stub.
[DECOUPLED NO-OP] vertex_tensorboard: using stub.
fsdp: 8
I0423 02:02:13.897916 133867303257920 maxtext_utils.py:1707]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.898143 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.898188 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.898234 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.898264 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.898291 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.898331 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.898370 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.898400 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.898426 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.898452 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.898478 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.898503 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.898527 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.898549 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.898571 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.898595 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.898619 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.898643 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.898665 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_10/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.898687 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_10/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.898709 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_10/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.898729 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_10/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.898750 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.898773 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_10/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.898794 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_10/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.898817 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_10/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.898846 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_10/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.898869 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_11/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.898890 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_11/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.898912 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_11/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.898932 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_11/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.898953 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.898975 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_11/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.898997 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_11/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.899018 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_11/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899039 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_11/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899078 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_12/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.899099 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_12/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.899122 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_12/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.899145 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_12/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.899166 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.899188 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_12/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899210 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_12/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.899232 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_12/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899254 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_12/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899278 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_13/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.899299 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_13/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.899320 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_13/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.899341 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_13/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.899362 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.899384 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_13/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899406 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_13/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.899427 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_13/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899448 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_13/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899470 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_14/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.899491 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_14/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.899512 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_14/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.899532 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_14/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.899552 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.899573 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_14/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899594 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_14/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.899615 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_14/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899636 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_14/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899657 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_15/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.899680 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_15/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.899702 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_15/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.899723 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_15/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.899743 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.899765 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_15/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899786 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_15/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.899808 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_15/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899833 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_15/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899855 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_2/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.899878 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_2/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.899900 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_2/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.899920 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_2/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.899940 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.899961 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_2/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.899984 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_2/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.900005 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_2/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900027 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_2/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900059 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_3/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.900085 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_3/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.900106 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_3/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.900126 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_3/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.900146 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.900169 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_3/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900190 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_3/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.900212 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_3/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900233 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_3/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900254 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_4/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.900275 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_4/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.900296 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_4/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.900316 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_4/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.900336 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.900357 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900379 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.900400 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900422 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900444 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.900465 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.900486 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.900505 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.900525 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.900547 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900567 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.900589 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900610 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900630 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.900651 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.900672 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.900692 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.900712 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.900734 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900756 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.900778 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900799 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900820 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.900844 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.900865 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.900885 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.900905 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.900928 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900949 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.900971 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.900992 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.901013 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.901034 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.901071 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.901093 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.901112 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.901133 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.901154 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.901175 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.901197 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.901218 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.901240 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0423 02:02:13.901261 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0423 02:02:13.901280 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.901300 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0423 02:02:13.901321 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.901342 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0423 02:02:13.901364 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.901386 133867303257920 maxtext_utils.py:1707]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0423 02:02:13.901425 133867303257920 maxtext_utils.py:1707]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   PartitionSpec('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0423 02:02:13.901459 133867303257920 maxtext_utils.py:1707]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   PartitionSpec('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
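
Each Physical spec in the dump above describes how one weight is laid out across the 8-device mesh. A sketch of what ('fsdp', None) means for a [2048, 7168] MLP kernel (assuming the 1-D fsdp mesh from earlier; shapes are taken from the dump):

    import jax
    import numpy as np
    from jax.sharding import Mesh, NamedSharding, PartitionSpec

    mesh = Mesh(np.array(jax.devices()).reshape(8), axis_names=("fsdp",))
    sharding = NamedSharding(mesh, PartitionSpec("fsdp", None))

    # The wi_0 kernel splits its embed dim 8 ways: [2048, 7168] -> [256, 7168].
    kernel = jax.device_put(np.zeros((2048, 7168), np.float32), sharding)
    print(kernel.sharding.shard_shape(kernel.shape))  # (256, 7168) per device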

I0423 02:02:16.912881 133867303257920 train.py:155] train/xent Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0423 02:02:16.912974 133867303257920 train.py:155] train/xent Physical: float32[8,2048]............................................. ('fsdp', None).
I0423 02:02:16.924562 133867303257920 train.py:162] train/z_loss Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0423 02:02:16.924610 133867303257920 train.py:162] train/z_loss Physical: float32[8,2048]............................................. ('fsdp', None).
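
The train/xent and train/z_loss tensors logged above are the per-token cross-entropy and the auxiliary z-loss; a sketch of the standard formulation (the coefficient and any masking are assumptions, not MaxText's exact code):

    import jax
    import jax.numpy as jnp

    def xent_and_z_loss(logits, targets, z_loss_coeff=1e-4):
        # Per-token cross-entropy: -log p(target).
        log_probs = jax.nn.log_softmax(logits, axis=-1)
        xent = -jnp.take_along_axis(log_probs, targets[..., None], axis=-1)[..., 0]
        # z-loss regularizes log Z = logsumexp(logits) toward zero.
        log_z = jax.nn.logsumexp(logits, axis=-1)
        return xent, z_loss_coeff * jnp.square(log_z)
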
I0423 02:02:30.031201 133867303257920 max_utils.py:791] Total memory size: 3.3 GB, Output size: 1.5 GB, Temp size: 1.8 GB, Argument size: 1.5 GB, Host temp size: 0.0 GB.
I0423 02:02:30.032656 133867303257920 max_utils.py:194] tensorboardX not available; using no-op SummaryWriter.
I0423 02:02:30.033349 133867303257920 metric_logger.py:301] number parameters: 1.104 billion
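
The 1.104-billion figure is consistent with the shape dump above; a quick arithmetic check:

    # Parameter count reconstructed from the shapes dumped above.
    embed, mlp, vocab, heads, head_dim, n_layers = 2048, 7168, 32000, 16, 128, 16

    per_layer = (
        3 * embed * mlp                 # wi_0, wi_1, wo
        + 4 * embed * heads * head_dim  # query, key, value, out projections
        + 2 * embed                     # pre/post self-attention layer norms
    )
    total = (n_layers * per_layer
             + 2 * vocab * embed        # token_embedder + logits_dense
             + embed)                   # final decoder_norm scale
    print(f"{total / 1e9:.3f} billion") # 1.104 billion
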
I0423 02:02:59.165156 133867303257920 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0423 02:02:59.273691 133867303257920 checkpointing.py:776] Waited 0.10860276222229004 seconds for step 0 to finish before starting checkpointing.
I0423 02:02:59.276707 133867303257920 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0423 02:02:59.277007 133867303257920 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 0
I0423 02:02:59.277896 133867303257920 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0.
I0423 02:02:59.375822 133867303257920 signaling_client.py:373] Using ThreadSafeKeyValueSignalingClient
I0423 02:02:59.511327 133836268635712 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0
I0423 02:02:59.512000 133867303257920 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
I0423 02:02:59.512087 133867303257920 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0423 02:03:00.233787 133836258149952 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0/items
I0423 02:03:00.362905 133867303257920 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.852144s
I0423 02:03:00.363398 133867303257920 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 12.517 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 985 milliseconds) (per-host)
I0423 02:03:00.363462 133867303257920 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.986056s (batch_requests_ready=0.105284s, total_serialization_initiated=0.880417s, others=0.000355s)
I0423 02:03:00.363604 133867303257920 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.987148s (all_items=0.000047s, per_item={'items': '0.00004745'}, temp_paths=0.987101)
I0423 02:03:00.364626 133836033754688 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0423 02:03:00.364810 133867303257920 async_checkpointer.py:561] Finished blocking save. Time taken: 1.087748s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0.
I0423 02:03:00.365019 133867303257920 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0423 02:03:00.365278 133867303257920 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776909779.276684, 'wait_for_prev_duration_secs': 5.936622619628906e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776909779.2770302, 'checkpointer_blocking_duration_secs': 1.087853193283081, 'get_old_steps_start_time': 1776909780.3649116, 'get_old_steps_duration_secs': 5.340576171875e-05, 'checkpoint_manager_blocking_start_time': 1776909779.276432, 'checkpoint_manager_blocking_duration_secs': 1.0888104438781738}
I0423 02:03:00.365466 133867303257920 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0423 02:03:00.365546 133867303257920 max_utils.py:750] 
Memstats: After params initialized:
I0423 02:03:00.365861 133867303257920 max_utils.py:756] 	Using (GB) 2.14 / 31.25 (6.848000%) on TPU_0(process=0,(0,0,0,0))
I0423 02:03:00.365893 133867303257920 max_utils.py:756] 	Using (GB) 2.14 / 31.25 (6.848000%) on TPU_1(process=0,(1,0,0,0))
I0423 02:03:00.365913 133867303257920 max_utils.py:756] 	Using (GB) 2.14 / 31.25 (6.848000%) on TPU_2(process=0,(0,1,0,0))
I0423 02:03:00.365934 133867303257920 max_utils.py:756] 	Using (GB) 2.14 / 31.25 (6.848000%) on TPU_3(process=0,(1,1,0,0))
I0423 02:03:00.365952 133867303257920 max_utils.py:756] 	Using (GB) 2.14 / 31.25 (6.848000%) on TPU_4(process=0,(0,2,0,0))
I0423 02:03:00.365971 133867303257920 max_utils.py:756] 	Using (GB) 2.14 / 31.25 (6.848000%) on TPU_5(process=0,(1,2,0,0))
I0423 02:03:00.365988 133867303257920 max_utils.py:756] 	Using (GB) 2.14 / 31.25 (6.848000%) on TPU_6(process=0,(0,3,0,0))
I0423 02:03:00.366005 133867303257920 max_utils.py:756] 	Using (GB) 2.14 / 31.25 (6.848000%) on TPU_7(process=0,(1,3,0,0))
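
The per-device Memstats lines above can be queried directly; a sketch (memory_stats() keys are backend-dependent, but bytes_in_use and bytes_limit are what TPU PJRT reports):

    import jax

    for d in jax.local_devices():
        stats = d.memory_stats()
        used, limit = stats["bytes_in_use"], stats["bytes_limit"]
        print(f"{d}: {used / 2**30:.2f} / {limit / 2**30:.2f} GB "
              f"({100 * used / limit:.3f}%)")
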
I0423 02:03:00.373661 133836226692672 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776909780118505966, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776909780118505966, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0423 02:03:00.373932 133836279121472 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0423 02:03:00.799038 2302660 google_auth_provider.cc:149] Using credentials at ~/.config/gcloud/application_default_credentials.json
I0423 02:03:00.799126 2302660 google_auth_provider.cc:156] Using OAuth2 AuthProvider
I0423 02:03:01.129373 133867303257920 metric_logger.py:196] completed step: 0, seconds: 29.131, TFLOP/s/device: 0.466, Tokens/s/device: 70.304, total_weights: 16384, loss: 10.887, lm_loss: 10.887, perplexity: 53501.250
I0423 02:03:01.132282 133867303257920 metric_logger.py:281] To see full metrics 'tensorboard --logdir=gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/tensorboard/'
I0423 02:03:01.149579 133867303257920 metric_logger.py:196] completed step: 1, seconds: 1.671, TFLOP/s/device: 8.130, Tokens/s/device: 1225.390, total_weights: 16384, loss: 10.887, lm_loss: 10.887, perplexity: 53501.250
I0423 02:03:01.250208 133867303257920 metric_logger.py:196] completed step: 2, seconds: 0.307, TFLOP/s/device: 44.205, Tokens/s/device: 6663.131, total_weights: 16384, loss: 9.787, lm_loss: 9.787, perplexity: 17806.746
I0423 02:03:01.362791 133867303257920 metric_logger.py:196] completed step: 3, seconds: 0.016, TFLOP/s/device: 864.651, Tokens/s/device: 130329.642, total_weights: 16384, loss: 8.853, lm_loss: 8.853, perplexity: 6997.778
I0423 02:03:01.576284 133836044240448 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_0
I0423 02:03:23.010005 133867303257920 metric_logger.py:196] completed step: 4, seconds: 0.103, TFLOP/s/device: 131.658, Tokens/s/device: 19844.961, total_weights: 16384, loss: 7.982, lm_loss: 7.982, perplexity: 2927.066
I0423 02:03:27.314468 133867303257920 metric_logger.py:196] completed step: 5, seconds: 0.112, TFLOP/s/device: 121.559, Tokens/s/device: 18322.687, total_weights: 16384, loss: 7.248, lm_loss: 7.248, perplexity: 1405.074
I0423 02:03:27.590990 133867303257920 metric_logger.py:196] completed step: 6, seconds: 24.970, TFLOP/s/device: 0.544, Tokens/s/device: 82.017, total_weights: 16384, loss: 6.690, lm_loss: 6.690, perplexity: 804.397
I0423 02:03:27.776598 133867303257920 metric_logger.py:196] completed step: 7, seconds: 1.123, TFLOP/s/device: 12.098, Tokens/s/device: 1823.532, total_weights: 16384, loss: 6.307, lm_loss: 6.307, perplexity: 548.399
I0423 02:03:27.995761 133867303257920 metric_logger.py:196] completed step: 8, seconds: 0.302, TFLOP/s/device: 45.004, Tokens/s/device: 6783.523, total_weights: 16384, loss: 6.070, lm_loss: 6.070, perplexity: 432.506
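
The Tokens/s/device figures above follow directly from total_weights (tokens per step), the 8 devices, and the step time; the logged seconds are rounded, so the reconstruction is only approximate:

    # Tokens/s/device = total_weights / (num_devices * step_seconds).
    total_weights, num_devices = 16384, 8
    for step, secs in [(2, 0.307), (8, 0.302)]:
        print(step, round(total_weights / (num_devices * secs), 1))
    # step 2 -> 6671.0 (logged 6663.131); step 8 -> 6781.5 (logged 6783.523)
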
I0423 02:03:28.000666 133867303257920 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0423 02:03:28.014468 133867303257920 checkpointing.py:776] Waited 0.013959169387817383 seconds for step 9 to finish before starting checkpointing.
I0423 02:03:28.018759 133867303257920 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0423 02:03:54.734252 133836247664192 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 53.975275s (commit=53.474829s, array_metadata_write=0.500446s)
I0423 02:03:54.736841 133836033754688 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 228.268 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 55 seconds) (per-host)
I0423 02:03:54.736930 133836033754688 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 54.372174s.
I0423 02:03:54.993527 133836033754688 checkpoint.py:228] Read Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776909780118505966, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} from gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0423 02:03:55.200744 133836033754688 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0423 02:03:55.430428 133836226692672 checkpoint.py:247] Updated Metadata={'item_handlers': {'items': 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler'}, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776909780118505966, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} to gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0423 02:03:55.591219 133836033754688 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0423 02:03:55.592516 133836033754688 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.557426s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0/items
I0423 02:03:55.593353 133836033754688 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0/items
I0423 02:03:55.827105 133836033754688 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0
I0423 02:03:56.543728 133836033754688 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0`.
I0423 02:03:56.544523 133836033754688 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 57.267465s. directory=gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/0
I0423 02:03:56.544629 133836033754688 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 56.179879s.
I0423 02:03:56.544861 133836279121472 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0423 02:03:56.544928 133836279121472 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0423 02:03:56.545008 133836279121472 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0423 02:03:56.545077 133836279121472 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0423 02:03:56.545292 133867303257920 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0423 02:03:56.545475 133867303257920 checkpoint_manager.py:1441] Waiting for previous save to complete took 28.528701 seconds. If this number is high, consider checkpointing less frequently.
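The warning above fires because the step-0 save spent ~57 s in background commit/finalize threads, and the step-9 save request then had to block ~28.5 s for it to drain. The lifecycle visible in this log (blocking D2H copy, background commit, save_finalize sync, wait_until_finished) is the standard Orbax async pattern. A minimal sketch of that pattern, NOT MaxText's actual checkpointing code; the directory, cadence, and toy train loop are illustrative:

```python
import jax.numpy as jnp
import orbax.checkpoint as ocp

options = ocp.CheckpointManagerOptions(
    save_interval_steps=9,            # illustrative; this run saved at steps 0 and 9
    enable_async_checkpointing=True,  # save() returns after the blocking D2H copy;
)                                     # commit/finalize run on background threads
mngr = ocp.CheckpointManager("/tmp/ckpt", options=options)  # stand-in for the gs:// dir

state = {"w": jnp.zeros((4, 4)), "step": 0}  # toy pytree standing in for TrainState
for step in range(10):
    state = {**state, "step": step}          # stand-in for a real train step
    mngr.save(step, args=ocp.args.StandardSave(state))

# Drain the async_save / save_finalize threads -- the "Waiting for previous save
# to complete" warning above means a new save arrived while one was in flight.
mngr.wait_until_finished()
```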
I0423 02:03:56.549908 133867303257920 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 9
I0423 02:03:56.550410 133867303257920 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/9.
I0423 02:03:56.726683 133836033754688 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/9
I0423 02:03:56.782999 133867303257920 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
I0423 02:03:56.783235 133867303257920 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0423 02:03:57.158039 133867303257920 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.375889s
I0423 02:03:57.158720 133867303257920 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 23.819 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 518 milliseconds) (per-host)
I0423 02:03:57.158818 133867303257920 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.518245s (batch_requests_ready=0.109390s, total_serialization_initiated=0.408289s, others=0.000566s)
I0423 02:03:57.158979 133867303257920 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.519360s (all_items=0.000060s, per_item={'items': '0.00005960'}, temp_paths=0.519300)
I0423 02:03:57.159867 133836056823360 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0423 02:03:57.160004 133867303257920 async_checkpointer.py:561] Finished blocking save. Time taken: 0.610032s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/9.
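Two different rates are reported for the same 12.3 GiB payload: the blocking phase measures the device-to-host transfer (~23.8 GiB/s, after which the train loop resumes), while the background phase measures the commit to GCS (a few hundred MiB/s). Both appear to be simple size-over-time figures; the residuals come from the rounded elapsed times in the log:

```python
# Reproducing the two logged rates from size and elapsed time (values copied
# from the step-9 blocking line and the step-0 background line above).
total_gib = 12.3

blocking_secs = 0.518                      # "time elapsed: 518 milliseconds"
print(total_gib / blocking_secs)           # ~23.75 GiB/s vs logged 23.819 GiB/s

background_secs = 55                       # step-0 save: "time elapsed: 55 seconds"
print(total_gib * 1024 / background_secs)  # ~229 MiB/s  vs logged 228.268 MiB/s
```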
I0423 02:03:57.160300 133867303257920 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0423 02:03:57.160543 133836279121472 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0423 02:03:57.160751 133867303257920 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776909808.0167234, 'wait_for_prev_duration_secs': 28.528700590133667, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776909836.5499382, 'checkpointer_blocking_duration_secs': 0.6101987361907959, 'get_old_steps_start_time': 1776909837.1601636, 'get_old_steps_duration_secs': 8.7738037109375e-05, 'checkpoint_manager_blocking_start_time': 1776909808.0158195, 'checkpoint_manager_blocking_duration_secs': 29.144875526428223}
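The standard_logger line above is a structured save event emitted as a Python dict literal, which makes checkpoint overhead easy to mine out of a saved copy of this log with ast.literal_eval. A minimal sketch; the log filename is hypothetical:

```python
import ast
import re

# Matches the dict literal that standard_logger.py appends after its location tag.
EVENT_RE = re.compile(r"standard_logger\.py:\d+\] (\{.*\})$")

def save_events(log_path):
    with open(log_path) as f:
        for line in f:
            m = EVENT_RE.search(line)
            if m:
                yield ast.literal_eval(m.group(1))  # dict literal -> Python dict

for ev in save_events("train.log"):  # "train.log": hypothetical saved log file
    print(ev["step"],
          ev["checkpointer_blocking_duration_secs"],
          ev["wait_for_prev_duration_secs"])
```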
I0423 02:03:57.160979 133867303257920 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0423 02:03:57.161018 133867303257920 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0423 02:03:57.433966 133836044240448 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/9/items
I0423 02:03:58.730478 133836077794880 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_0
I0423 02:04:41.372424 133836067309120 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 43.449160s (commit=42.962443s, array_metadata_write=0.486717s)
I0423 02:04:41.379587 133836056823360 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 282.456 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 44 seconds) (per-host)
I0423 02:04:41.379663 133836056823360 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 44.219584s.
I0423 02:04:41.828627 133836056823360 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0423 02:04:42.224465 133836056823360 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0423 02:04:42.229888 133836056823360 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.570588s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/9/items
I0423 02:04:42.230834 133836056823360 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/9/items
I0423 02:04:42.460906 133836056823360 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/9
I0423 02:04:43.136366 133836056823360 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/9`.
I0423 02:04:43.137155 133836056823360 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 46.587189s. directory=gs://wanglance-maxtext/linen_ckpt_main_20260423_012248/linen_main_20260423_012248_13_scan_layers_false/checkpoints/9
I0423 02:04:43.137248 133836056823360 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 45.977176s.
I0423 02:04:43.137496 133836279121472 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0423 02:04:43.137658 133836279121472 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0423 02:04:43.137757 133836279121472 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0423 02:04:43.137811 133836279121472 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0423 02:04:43.138103 133867303257920 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0423 02:04:43.138288 133867303257920 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0423 02:04:43.139301 133867303257920 metric_logger.py:196] completed step: 9, seconds: 0.020, TFLOP/s/device: 686.080, Tokens/s/device: 103413.452, total_weights: 16384, loss: 5.930, lm_loss: 5.930, perplexity: 376.179
Per train step: Total TFLOPs: 13.59, split as 93.93% learnable-weight FLOPs and 6.07% attention FLOPs.
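A back-of-envelope reading of this closing summary: 93.93% of 13.59 TFLOPs is ~12.77 TFLOPs of learnable-weight FLOPs per device per step. If the learnable-weight figure follows the common 6 x params x tokens forward+backward estimate (an assumption, not confirmed by this log), the 2,048 tokens per device per step back-solve to roughly a 1B-parameter model:

```python
# Assumes the standard 6 * params * tokens fwd+bwd FLOP approximation --
# not necessarily the exact formula MaxText uses for "learnable weight flops".
total_tflops = 13.59                    # per device, per step (from the summary)
weight_tflops = 0.9393 * total_tflops   # ~12.77 TFLOPs of learnable-weight FLOPs
tokens_per_device = 16384 / 8           # total_weights / num_devices = 2048

params = weight_tflops * 1e12 / (6 * tokens_per_device)
print(f"{params / 1e9:.2f}B parameters")  # ~1.04B implied model size
```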