MaxView

Case: 13_scan_layers_false

Metrics: main (8a17c3d19) vs feat/nnx-set-defaults-true (73213e044)

Metric     | main (8a17c3d19) | feat/nnx-set-defaults-true (73213e044) | Diff (branch − main)
-----------|------------------|----------------------------------------|---------------------
Parameters | 1.104 billion    | 1.104 billion                          | —
Final loss | 8.1880           | 8.1880                                 | 0
TFLOP/s    | 97.247           | 99.587                                 | +2.34
Tok/s      | 14658.1          | 15010.8                                | +352.714
Avg s/step | 2.989            | 2.993                                  | +0.004
Memory %   | 2.62             | 2.62                                   | 0
JAX        | 0.9.2            | 0.8.1                                  | —

Diff = branch value − main value. A positive diff means the branch value is higher: an improvement for TFLOP/s and Tok/s, a regression for Avg s/step.

main  ·  8a17c3d19  ·  main_20260422_071422
XPK Start: Wed Apr 22 07:56:28 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
`rope_parameters`'s factor field must be a float >= 1, got 40
`rope_parameters`'s beta_fast field must be a float, got 32
`rope_parameters`'s beta_slow field must be a float, got 1
DeepseekV32Config got `key=rope_scaling` in kwargs but hasn't set it as attribute. For RoPE standardization you need to set `self.rope_parameters` in model's config. 
2026-04-22 07:56:53.539227: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0422 07:56:53.749433 136728036083520 max_utils.py:273] Attempting to initialize the jax distributed system...
I0422 07:57:02.789418 136728036083520 distributed.py:149] Starting JAX distributed service on [::]:8482
I0422 07:57:02.791807 136728036083520 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-xk6cb-slice-job-0-0.mt-13-scan-layers-false-xk6cb:8482
I0422 07:57:04.500064 136728036083520 max_utils.py:284] Jax distributed system initialized!
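
The lines above show each host connecting to a coordinator on port 8482 before the distributed system comes up. A minimal sketch of that bring-up call: the coordinator address is copied from the log, this log is emitted by process 4, and num_processes=8 is an assumption (32 devices at 4 local chips per host, per the Memstats later in the log), not a value read from the log.

```python
# Sketch of the jax.distributed initialization reported above.
import jax

jax.distributed.initialize(
    coordinator_address=(
        "mt-13-scan-layers-false-xk6cb-slice-job-0-0"
        ".mt-13-scan-layers-false-xk6cb:8482"   # service address from the log
    ),
    num_processes=8,  # assumption: 32 devices / 4 local chips per host
    process_id=4,     # this log was emitted by process=4
)
print(jax.process_count(), jax.device_count())  # global view after init
```
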
I0422 07:57:10.799138 136728036083520 max_utils.py:800] System Information: Jax Version: 0.9.2
I0422 07:57:10.799248 136728036083520 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0422 07:57:10.799288 136728036083520 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0422 07:57:10.799324 136728036083520 train_utils.py:361] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0422 07:57:11.499504 136728036083520 maxtext_utils.py:1565] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0422 07:57:11.499800 136728036083520 checkpointing.py:677] Setting up checkpoint logger...
I0422 07:57:11.499856 136728036083520 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0422 07:57:11.499901 136728036083520 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0422 07:57:11.500239 136728036083520 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7c59e8314e90>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0422 07:57:14.408658 136728036083520 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0422 07:57:14.408905 136728036083520 checkpoint_manager.py:708] [process=4][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c45342adac0>}, handler_registry=None
I0422 07:57:14.409158 136728036083520 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c45342adac0>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0422 07:57:14.409207 136728036083520 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7c45342b1100>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0422 07:57:14.409243 136728036083520 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c45342adac0>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7c45342adac0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7c45342b1100>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7c45342b1100>}).
I0422 07:57:14.409559 136728036083520 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0422 07:57:14.409636 136728036083520 async_checkpointer.py:192] [process=4][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7c4494385800> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0422 07:57:15.461984 136728036083520 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_13_scan_layers_false/checkpoints
I0422 07:57:15.561089 136728036083520 checkpoint_manager.py:929] [process=4][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7c45342ae6c0>
I0422 07:57:15.561281 136728036083520 checkpointing.py:301] Checkpoint manager created!
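
The manager created above uses async checkpointing with a fixed-interval save policy (FixedIntervalPolicy(interval=10), which matches the saves at steps 0 and 9 later in the log). A minimal Orbax sketch of an equivalent setup, with a placeholder bucket path and state pytree; this is not MaxText's actual checkpointing code.

```python
# Minimal Orbax async-checkpointing sketch; path and state are placeholders.
import orbax.checkpoint as ocp

options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,           # mirrors FixedIntervalPolicy(interval=10)
    enable_async_checkpointing=True,  # saves continue in a background thread
)
mngr = ocp.CheckpointManager("gs://my-bucket/run/checkpoints", options=options)

# state = {"params": ..., "opt_state": ...}       # the training pytree
# if mngr.should_save(step):
#     mngr.save(step, args=ocp.args.StandardSave(state))  # returns quickly
# mngr.wait_until_finished()  # join the background save before exiting
```

The "Finished blocking save" / "Background save thread done" pairs later in the log are exactly this split: a short blocking device-to-host transfer, then a background commit to GCS.
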
I0422 07:57:16.497013 136728036083520 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0422 07:57:16.497118 136728036083520 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 07:57:16.885684 136728036083520 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 07:57:16.885788 136728036083520 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 07:57:16.902005 136728036083520 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 07:57:16.902063 136728036083520 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 07:57:16.932334 136728036083520 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 07:57:16.932410 136728036083520 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 07:57:16.948814 136728036083520 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 07:57:16.948876 136728036083520 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 07:57:16.965099 136728036083520 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 07:57:16.965162 136728036083520 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 07:57:16.994581 136728036083520 attentions.py:1198] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0422 07:57:16.994650 136728036083520 attentions.py:1198] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 07:57:17.016319 136728036083520 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0422 07:57:17.016396 136728036083520 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0422 07:57:19.876745 136728036083520 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0422 07:57:19.876873 136728036083520 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0422 07:57:27.068953 136728036083520 maxtext_utils.py:1668]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.069087 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.069150 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.069207 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.069248 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.069285 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.069336 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.069389 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.069435 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.069472 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.069506 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.069537 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.069572 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.069605 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.069635 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.069667 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.069713 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.069770 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.069804 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.069836 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_10/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.069868 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_10/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.069899 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_10/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.069943 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_10/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.069991 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.070028 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_10/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070060 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_10/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.070091 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_10/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070121 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_10/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070152 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_11/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.070185 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_11/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.070216 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_11/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.070244 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_11/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.070271 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.070301 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_11/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070331 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_11/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.070360 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_11/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070389 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_11/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070423 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_12/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.070454 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_12/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.070484 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_12/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.070513 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_12/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.070541 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.070570 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_12/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070599 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_12/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.070628 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_12/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070657 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_12/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070685 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_13/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.070729 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_13/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.070759 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_13/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.070786 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_13/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.070813 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.070842 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_13/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070871 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_13/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.070899 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_13/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070928 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_13/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.070957 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_14/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.070986 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_14/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.071015 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_14/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.071043 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_14/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.071070 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.071099 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_14/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.071127 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_14/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.071156 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_14/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.071184 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_14/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.071214 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_15/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.071243 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_15/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.071271 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_15/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.071299 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_15/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.071326 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.071354 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_15/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.071382 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_15/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.071411 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_15/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.071446 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_15/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.071475 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_2/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.071506 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_2/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.071536 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_2/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.071563 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_2/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.071590 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.071619 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_2/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.071647 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_2/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.071677 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_2/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.071716 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_2/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.071747 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_3/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.071776 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_3/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.071805 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_3/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.071831 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_3/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.071858 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.071887 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_3/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.071916 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_3/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.071944 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_3/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.071973 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_3/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.072002 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_4/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.072031 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_4/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.072059 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_4/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.072086 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_4/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.072113 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.072140 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.072170 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.072200 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.072229 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.072258 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.072287 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.072315 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.072343 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.072369 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.072399 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.072432 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.072461 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.072490 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.072519 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.072548 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.072577 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.072604 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.072631 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.072659 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.072688 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.072738 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.072768 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.072796 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.072825 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.072854 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.072882 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.072910 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.072939 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.072968 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.072997 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.073025 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.073053 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.073081 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.073109 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.073136 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.073163 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.073191 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.073219 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.073249 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.073278 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.073306 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.073334 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 07:57:27.073362 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 07:57:27.073388 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.073419 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:57:27.073449 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.073477 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:57:27.073506 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.073534 136728036083520 maxtext_utils.py:1668]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 07:57:27.073582 136728036083520 maxtext_utils.py:1668]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0422 07:57:27.073626 136728036083520 maxtext_utils.py:1668]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
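
Each entry above pairs a logical PartitionSpec with the physical mesh axes it resolves to; on this 32-way 'fsdp' mesh, only embedding-like axes shard while the rest map to None. A minimal sketch of that logical-to-physical resolution using Flax's axis-rules helper; the rule list is an illustrative assumption, not MaxText's full rule set.

```python
# Sketch of logical->physical sharding resolution, as in the table above.
# Runnable on CPU with: XLA_FLAGS=--xla_force_host_platform_device_count=32
import jax
import numpy as np
import flax.linen as nn
from jax.sharding import Mesh, PartitionSpec as P

# One mesh axis of size 32, matching "fsdp: 32" in the log.
mesh = Mesh(np.array(jax.devices()).reshape(32), axis_names=("fsdp",))

# Illustrative rules: logical axis name -> mesh axis (None = replicated).
rules = (("embed", "fsdp"), ("mlp", None), ("norm", None))

sharding = nn.logical_to_mesh_sharding(P("embed", "mlp"), mesh, rules)
print(sharding.spec)  # PartitionSpec('fsdp', None), as for wi_0/kernel above
```
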

I0422 07:57:31.539814 136728036083520 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 07:57:31.539909 136728036083520 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0422 07:57:31.555388 136728036083520 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 07:57:31.555454 136728036083520 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0422 07:58:28.078902 136728036083520 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0422 07:58:28.080091 136728036083520 metric_logger.py:301] number parameters: 1.104 billion
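
The 1.104 billion figure is consistent with the shapes in the sharding dump above (16 decoder layers, embed=2048, mlp=7168, vocab=32000, 16 heads of dim 128); a quick back-of-envelope check:

```python
# Parameter-count check; every number below is read from the sharding dump.
embed, mlp, vocab, layers = 2048, 7168, 32000, 16
qkv = 16 * 128  # heads * head_dim

per_layer = (
    3 * embed * mlp    # mlp wi_0, wi_1, wo kernels
    + 4 * embed * qkv  # query, key, value, out kernels
    + 2 * embed        # pre/post self-attention norm scales
)
# Plus token embedder, logits_dense, and the final decoder_norm scale.
total = layers * per_layer + 2 * vocab * embed + embed
print(f"{total / 1e9:.3f} billion")  # 1.104 billion
```
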
I0422 07:59:28.995189 136728036083520 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0422 07:59:29.231106 136728036083520 checkpointing.py:776] Waited 0.2359004020690918 seconds for step 0 to finish before starting checkpointing.
I0422 07:59:29.237364 136728036083520 checkpoint_manager.py:2009] [process=4][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0422 07:59:29.239396 136728036083520 checkpoint_manager.py:1512] [process=4] Saving checkpoint at step 0
I0422 07:59:29.240754 136728036083520 event_tracking.py:70] [process=4] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_13_scan_layers_false/checkpoints/0.
I0422 07:59:29.571367 136728036083520 signaling_client.py:364] Using JaxDistributedSignalingClient
I0422 07:59:29.572362 136728036083520 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0422 07:59:29.572490 136728036083520 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 07:59:29.871040 136728036083520 base_pytree_checkpoint_handler.py:154] [process=4][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.300181s
I0422 07:59:29.871211 136728036083520 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/blocking_gbytes_per_sec: 4.609 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.33472323417663574 s) (per-host)
I0422 07:59:29.871260 136728036083520 base_pytree_checkpoint_handler.py:768] [process=4][thread=MainThread] Initiated Pytree async_save. Time taken: 0.334788s (batch_requests_ready=0.017615s, total_serialization_initiated=0.317101s, others=0.000072s)
I0422 07:59:29.871507 136728036083520 composite_checkpoint_handler.py:715] [process=4][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.338960s (all_items=0.000018s, per_item={'items': '0.00001764'}, temp_paths=0.338943)
I0422 07:59:29.872332 136728036083520 event_tracking.py:125] [process=4] [async] Finished blocking save in 0.63 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_13_scan_layers_false/checkpoints/0.
I0422 07:59:29.872621 136599730398976 async_checkpointer.py:76] [process=4][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 08:19:29.872589
I0422 07:59:29.914392 136728036083520 checkpoint_manager.py:1560] [process=4][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 07:59:29.914775 136599701010176 async_checkpointer.py:280] [process=4][thread=save_finalize] Waiting for background save thread=async_save.
I0422 07:59:29.914940 136728036083520 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776844769.2373457, 'wait_for_prev_duration_secs': 6.365776062011719e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776844769.2394357, 'checkpointer_blocking_duration_secs': 0.6333003044128418, 'get_old_steps_start_time': 1776844769.872762, 'get_old_steps_duration_secs': 2.9802322387695312e-05, 'checkpoint_manager_blocking_start_time': 1776844769.2332716, 'checkpoint_manager_blocking_duration_secs': 0.6816270351409912}
I0422 07:59:29.915133 136728036083520 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0422 07:59:29.915192 136728036083520 max_utils.py:750] 
Memstats: After params initialized:
I0422 07:59:29.915243 136728036083520 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_16(process=4,(0,4,0,0))
I0422 07:59:29.915276 136728036083520 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_17(process=4,(1,4,0,0))
I0422 07:59:29.915303 136728036083520 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_20(process=4,(0,5,0,0))
I0422 07:59:29.915328 136728036083520 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_21(process=4,(1,5,0,0))
I0422 07:59:30.268102 136728036083520 metric_logger.py:196] completed step: 0, seconds: 60.915, TFLOP/s/device: 0.223, Tokens/s/device: 33.621, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0422 07:59:30.434109 136728036083520 metric_logger.py:196] completed step: 1, seconds: 1.264, TFLOP/s/device: 10.748, Tokens/s/device: 1620.099, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0422 07:59:30.893530 136728036083520 metric_logger.py:196] completed step: 2, seconds: 0.039, TFLOP/s/device: 352.491, Tokens/s/device: 53131.324, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28794.662
I0422 07:59:31.032796 136728036083520 metric_logger.py:196] completed step: 3, seconds: 0.460, TFLOP/s/device: 29.535, Tokens/s/device: 4451.806, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 16999.801
I0422 07:59:31.311975 136728036083520 metric_logger.py:196] completed step: 4, seconds: 0.143, TFLOP/s/device: 95.276, Tokens/s/device: 14361.046, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10778.169
I0422 07:59:31.323734 136728036083520 metric_logger.py:196] completed step: 5, seconds: 0.139, TFLOP/s/device: 97.602, Tokens/s/device: 14711.692, total_weights: 65536, loss: 8.901, lm_loss: 8.901, perplexity: 7336.274
I0422 07:59:32.783293    2942 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0422 07:59:35.290511 136599709402880 array_metadata_store.py:203] [process=4][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_4
I0422 07:59:55.749938 136728036083520 metric_logger.py:196] completed step: 6, seconds: 0.280, TFLOP/s/device: 48.498, Tokens/s/device: 7310.161, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5441.088
I0422 07:59:55.889051 136728036083520 metric_logger.py:196] completed step: 7, seconds: 24.295, TFLOP/s/device: 0.559, Tokens/s/device: 84.297, total_weights: 65536, loss: 8.394, lm_loss: 8.394, perplexity: 4418.633
I0422 07:59:56.027459 136728036083520 metric_logger.py:196] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.732, Tokens/s/device: 14429.750, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.639
I0422 07:59:56.167772 136728036083520 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0422 07:59:56.171149 136728036083520 checkpointing.py:776] Waited 0.0033941268920898438 seconds for step 9 to finish before starting checkpointing.
I0422 07:59:56.175509 136728036083520 checkpoint_manager.py:2020] [process=4][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 08:00:03.344923 136599730398976 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/gbytes_per_sec: 46.722 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 33.80838990211487 s) (per-host)
I0422 08:00:03.345050 136599730398976 async_checkpointer.py:90] [process=4][thread=async_save] 3 Handler Commit operations completed. Time taken: 33.471626s.
I0422 08:00:13.519863 136599730398976 async_checkpointer.py:160] [process=4][thread=async_save] Background save thread done. Time taken: 43.646422s.
I0422 08:00:13.520134 136599701010176 async_checkpointer.py:288] [process=4][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 08:00:13.520250 136599701010176 async_checkpointer.py:298] [process=4][thread=save_finalize] No errors found in background save thread=async_save.
I0422 08:00:13.520301 136599701010176 checkpoint_manager.py:2137] [process=4][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0422 08:00:13.522238 136599701010176 checkpoint_manager.py:2146] [process=4][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0422 08:00:13.522425 136728036083520 checkpoint_manager.py:2032] [process=4][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0422 08:00:13.522580 136728036083520 checkpoint_manager.py:1452] Waiting for previous save to complete took 17.347071 seconds. If this number is high, consider checkpointing less frequently.
I0422 08:00:13.525124 136728036083520 checkpoint_manager.py:1512] [process=4] Saving checkpoint at step 9
I0422 08:00:13.527115 136728036083520 event_tracking.py:70] [process=4] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_13_scan_layers_false/checkpoints/9.
I0422 08:00:14.268903 136728036083520 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0422 08:00:14.269069 136728036083520 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 08:00:14.396311 136728036083520 base_pytree_checkpoint_handler.py:154] [process=4][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.128850s
I0422 08:00:14.396470 136728036083520 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/blocking_gbytes_per_sec: 9.524 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.16196393966674805 s) (per-host)
I0422 08:00:14.396522 136728036083520 base_pytree_checkpoint_handler.py:768] [process=4][thread=MainThread] Initiated Pytree async_save. Time taken: 0.162025s (batch_requests_ready=0.017436s, total_serialization_initiated=0.144521s, others=0.000069s)
I0422 08:00:14.396848 136728036083520 composite_checkpoint_handler.py:715] [process=4][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.166282s (all_items=0.000021s, per_item={'items': '0.00002122'}, temp_paths=0.166261)
I0422 08:00:14.397617 136728036083520 event_tracking.py:125] [process=4] [async] Finished blocking save in 0.87 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_13_scan_layers_false/checkpoints/9.
I0422 08:00:14.398009 136599701010176 async_checkpointer.py:76] [process=4][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 08:20:14.397970
I0422 08:00:14.410377 136728036083520 checkpoint_manager.py:1560] [process=4][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 08:00:14.410647 136599713613568 async_checkpointer.py:280] [process=4][thread=save_finalize] Waiting for background save thread=async_save.
I0422 08:00:14.410829 136728036083520 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776844796.1754787, 'wait_for_prev_duration_secs': 17.347070932388306, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776844813.5251698, 'checkpointer_blocking_duration_secs': 0.8729941844940186, 'get_old_steps_start_time': 1776844814.3981936, 'get_old_steps_duration_secs': 3.123283386230469e-05, 'checkpoint_manager_blocking_start_time': 1776844796.172037, 'checkpoint_manager_blocking_duration_secs': 18.23875856399536}
I0422 08:00:14.411036 136728036083520 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0422 08:00:14.411083 136728036083520 checkpoint_manager.py:2020] [process=4][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 08:00:19.765872 136599722006272 array_metadata_store.py:203] [process=4][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_4
I0422 08:00:56.727680 136599701010176 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/gbytes_per_sec: 37.173 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.49313950538635 s) (per-host)
I0422 08:00:56.727812 136599701010176 async_checkpointer.py:90] [process=4][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.329679s.
I0422 08:01:04.524870 136599701010176 async_checkpointer.py:160] [process=4][thread=async_save] Background save thread done. Time taken: 50.126722s.
I0422 08:01:04.525144 136599713613568 async_checkpointer.py:288] [process=4][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 08:01:04.525275 136599713613568 async_checkpointer.py:298] [process=4][thread=save_finalize] No errors found in background save thread=async_save.
I0422 08:01:04.525320 136599713613568 checkpoint_manager.py:2137] [process=4][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0422 08:01:04.527455 136599713613568 checkpoint_manager.py:2146] [process=4][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0422 08:01:04.527634 136728036083520 checkpoint_manager.py:2032] [process=4][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0422 08:01:04.527804 136728036083520 checkpoint_manager.py:2009] [process=4][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
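
The sequence above is Orbax's two-phase async save: a short blocking phase (the ~0.87 s device-to-host transfer at ~9.5 GiB/s) followed by a background thread that commits to GCS (~50 s at ~37 MiB/s per host). A minimal sketch of that pattern, assuming orbax-checkpoint's public CheckpointManager API; `state` and the directory are placeholders, not values from this run, and the 10-step cadence mirrors the FixedIntervalPolicy(interval=10) the run logs at startup:

```python
# Minimal sketch of the two-phase async save seen above, using
# orbax-checkpoint's public API. `state` and '/tmp/ckpts' are placeholders.
import jax.numpy as jnp
import orbax.checkpoint as ocp

options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,           # checkpoint every 10 steps
    enable_async_checkpointing=True,  # blocking D2H copy, then a background commit
)
with ocp.CheckpointManager('/tmp/ckpts', options=options) as mngr:
    state = {'w': jnp.zeros((2048, 7168))}
    for step in range(20):
        # ... one train step ...
        mngr.save(step, args=ocp.args.StandardSave(state))  # returns after the blocking phase
    mngr.wait_until_finished()  # join the background commit before exiting
```

`save()` returning after the blocking phase is why training continues while the ~50 s GCS commit runs; `wait_until_finished()` is the join point that the "Save Finalize" thread above implements.
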
I0422 08:01:04.528797 136728036083520 metric_logger.py:196] completed step: 9, seconds: 0.140, TFLOP/s/device: 97.247, Tokens/s/device: 14658.097, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3599.045
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
XPK End: Wed Apr 22 08:01:14 UTC 2026
EXIT_CODE=0
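
As a cross-check, the step-9 summary above is internally consistent: total_weights = 65536 tokens per global step over 32 chips, together with the (rounded) 0.140 s step time, reproduce both the logged per-device throughput and the per-step TFLOP total, which is evidently reported per device:

```python
# Back-of-envelope check of the step-9 metrics and the per-step FLOP summary
# above; assumes total_weights is the global token count per step.
tokens_per_step = 65536   # total_weights
devices = 32
step_time = 0.140         # seconds (rounded in the log)

print(tokens_per_step / devices / step_time)  # ~14629 tok/s/device (logged: 14658.097)
print(97.247 * step_time)                     # ~13.6 TFLOPs/step/device ("Total TFLOPs: 13.59")
```
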
XPK Start: Wed Apr 22 16:35:31 UTC 2026
2026-04-22 16:35:35.154545: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1776875735.167802      10 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1776875735.171572      10 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1776875735.183168      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776875735.183188      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776875735.183191      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776875735.183193      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2026-04-22 16:35:54.292152: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0422 16:35:54.821510 139095656052544 max_utils.py:273] Attempting to initialize the jax distributed system...
INFO:2026-04-22 16:36:03,862:jax._src.distributed:140: Starting JAX distributed service on [::]:8482
I0422 16:36:03.862509 139095656052544 distributed.py:140] Starting JAX distributed service on [::]:8482
INFO:2026-04-22 16:36:03,864:jax._src.distributed:157: Connecting to JAX distributed service on mt-13-scan-layers-false-qmlkw-slice-job-0-0.mt-13-scan-layers-false-qmlkw:8482
I0422 16:36:03.864990 139095656052544 distributed.py:157] Connecting to JAX distributed service on mt-13-scan-layers-false-qmlkw-slice-job-0-0.mt-13-scan-layers-false-qmlkw:8482
I0422 16:36:04.944058 139095656052544 max_utils.py:284] Jax distributed system initialized!
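
The "Starting/Connecting to JAX distributed service" lines above are the multi-host runtime coming up. A sketch of the call behind them; on Cloud TPU the coordinator address, process count, and process id are normally inferred from the environment, so no arguments are needed:

```python
import jax

# Must run on every host before other JAX calls; produces the
# "Starting/Connecting to JAX distributed service" log lines above.
jax.distributed.initialize()
print(jax.process_index(), jax.local_device_count(), jax.device_count())
```
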
I0422 16:36:10.807355 139095656052544 max_utils.py:800] System Information: Jax Version: 0.8.1
I0422 16:36:10.807462 139095656052544 max_utils.py:801] System Information: Jaxlib Version: 0.8.1
I0422 16:36:10.807504 139095656052544 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Nov 12 2025 14:16:36 (1762985796) cl/831091709
I0422 16:36:10.807539 139095656052544 train_utils.py:377] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0422 16:36:11.426604 139095656052544 maxtext_utils.py:1631] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0422 16:36:11.427402 139095656052544 checkpointing.py:688] Setting up checkpoint logger...
I0422 16:36:11.427452 139095656052544 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0422 16:36:11.427502 139095656052544 pytree_checkpoint_handler.py:589] save_device_host_concurrent_bytes=None
I0422 16:36:11.427886 139095656052544 base_pytree_checkpoint_handler.py:415] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7e81503b0e00>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0422 16:36:14.334071 139095656052544 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0422 16:36:14.334315 139095656052544 checkpoint_manager.py:709] [process=3][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e6f51639610>}, handler_registry=None
I0422 16:36:14.334554 139095656052544 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e6f51639610>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0422 16:36:14.334605 139095656052544 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7e8177dd7aa0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0422 16:36:14.334641 139095656052544 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e6f51639610>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e6f51639610>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7e8177dd7aa0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7e8177dd7aa0>}).
I0422 16:36:14.335170 139095656052544 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.33
I0422 16:36:14.335251 139095656052544 async_checkpointer.py:177] [process=3][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7e6f51024180> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0422 16:36:15.651263 139095656052544 checkpoint_manager.py:1818] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260422_154647/linen_xpk_feat_nnx_set_defaults_true_20260422_154647_13_scan_layers_false/checkpoints
I0422 16:36:16.073319 139095656052544 checkpoint_manager.py:929] [process=3][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260422_154647/linen_xpk_feat_nnx_set_defaults_true_20260422_154647_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7e6f51636a20>
I0422 16:36:16.073509 139095656052544 checkpointing.py:302] Checkpoint manager created!
I0422 16:36:16.996320 139095656052544 nnx_wrappers.py:455] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0422 16:36:16.996514 139095656052544 nnx_wrappers.py:455] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 16:36:17.785817 139095656052544 attentions.py:1084] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 16:36:17.785977 139095656052544 attentions.py:1084] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 16:36:17.941210 139095656052544 attentions.py:1085] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 16:36:17.941341 139095656052544 attentions.py:1085] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 16:36:18.105438 139095656052544 attentions.py:1150] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 16:36:18.105578 139095656052544 attentions.py:1150] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 16:36:18.259260 139095656052544 attentions.py:1151] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 16:36:18.259391 139095656052544 attentions.py:1151] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 16:36:18.419431 139095656052544 attentions.py:1152] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 16:36:18.419574 139095656052544 attentions.py:1152] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 16:36:18.586558 139095656052544 attentions.py:1193] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0422 16:36:18.586696 139095656052544 attentions.py:1193] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 16:36:18.610059 139095656052544 linears.py:541] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0422 16:36:18.610159 139095656052544 linears.py:541] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
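
The Logical/Physical pairs above are Flax's logical axis names being resolved against a rule table: activations are annotated with logical names, and only the batch-like axes end up on the 'fsdp' mesh axis. A minimal sketch; the rule list here is inferred from the printed pairs, not MaxText's actual sharding config:

```python
import flax.linen as nn

# Rules reconstructed from the pairs above: batch-like axes shard on 'fsdp',
# everything else stays replicated (None).
rules = (('activation_batch', 'fsdp'),
         ('activation_attn_length', None),
         ('activation_attn_embed', None))

class AttnInput(nn.Module):
    @nn.compact
    def __call__(self, x):
        # Under `nn.logical_axis_rules(rules)` and a Mesh, this logical spec
        # resolves to the physical spec ('fsdp', None, None) shown above.
        return nn.with_logical_constraint(
            x, ('activation_batch', 'activation_attn_length', 'activation_attn_embed'))
```
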
I0422 16:36:33.976096 139095656052544 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0422 16:36:33.976227 139095656052544 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0422 16:36:42.848244 139095656052544 maxtext_utils.py:1740]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.848380 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.848438 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.848510 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.848573 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.848631 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.848702 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.848783 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.848849 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.848905 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.848987 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.849045 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.849104 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.849152 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.849206 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.849258 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.849313 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.849366 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.849417 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.849472 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_10/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.849524 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_10/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.849576 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_10/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.849627 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_10/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.849672 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.849722 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_10/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.849777 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_10/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.849828 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_10/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.849878 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_10/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.849951 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_11/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.850012 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_11/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.850060 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_11/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.850105 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_11/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.850150 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.850197 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_11/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.850244 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_11/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.850291 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_11/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.850339 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_11/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.850387 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_12/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.850434 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_12/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.850483 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_12/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.850530 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_12/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.850575 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.850624 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_12/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.850671 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_12/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.850718 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_12/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.850772 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_12/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.850821 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_13/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.850871 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_13/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.850916 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_13/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.850976 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_13/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.851016 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.851066 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_13/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.851110 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_13/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.851160 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_13/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.851203 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_13/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.851252 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_14/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.851296 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_14/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.851347 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_14/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.851389 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_14/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.851427 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.851476 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_14/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.851520 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_14/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.851571 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_14/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.851615 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_14/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.851666 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_15/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.851716 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_15/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.851768 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_15/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.851812 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_15/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.851856 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.851899 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_15/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.851959 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_15/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.852010 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_15/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.852057 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_15/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.852107 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_2/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.852155 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_2/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.852200 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_2/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.852245 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_2/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.852285 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.852334 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_2/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.852381 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_2/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.852429 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_2/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.852475 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_2/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.852522 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_3/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.852568 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_3/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.852613 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_3/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.852655 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_3/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.852697 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.852748 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_3/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.852799 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_3/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.852848 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_3/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.852894 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_3/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.852955 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_4/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.853003 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_4/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.853053 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_4/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.853094 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_4/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.853137 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.853182 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.853229 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.853274 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.853334 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.853379 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.853430 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.853473 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.853524 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.853567 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.853613 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.853661 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.853708 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.853760 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.853810 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.853858 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.853905 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.854003 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.854053 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.854106 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.854151 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.854203 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.854248 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.854297 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.854342 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.854391 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.854432 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.854477 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.854521 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.854571 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.854616 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.854666 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.854712 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.854763 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.854811 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.854853 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.854896 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.854953 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.855007 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.855056 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.855102 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.855149 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.855195 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0422 16:36:42.855242 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0422 16:36:42.855285 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.855327 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0422 16:36:42.855375 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.855422 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 16:36:42.855470 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.855515 139095656052544 maxtext_utils.py:1740]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0422 16:36:42.855588 139095656052544 maxtext_utils.py:1740]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   PartitionSpec('embed', 'vocab')
    Physical:  ('fsdp', None)
I0422 16:36:42.855658 139095656052544 maxtext_utils.py:1740]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   PartitionSpec('vocab', 'embed')
    Physical:  (None, 'fsdp')
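
The parameter dump above lists one entry per decoder layer, presumably because scan_layers=false unrolls the stack, and each follows the same resolution: a logical spec such as ('embed', 'mlp') maps to the physical spec ('fsdp', None), so each kernel is sharded only along its embed dimension across the 32-way fsdp axis. A sketch that reproduces one row; the rule table is a plausible reconstruction from the dump, not the full MaxText config:

```python
import jax
import flax.linen as nn
from jax.sharding import Mesh, NamedSharding

# Rules inferred from the dump: only 'embed' maps to a mesh axis; MaxText's
# real rule table is larger.
rules = (('embed', 'fsdp'), ('mlp', None), ('norm', None), ('vocab', None),
         ('heads', None), ('kv', None), ('q_heads', None),
         ('kv_heads', None), ('kv_head_dim', None))

mesh = Mesh(jax.devices(), axis_names=('fsdp',))  # fsdp: 32 on this slice
spec = nn.logical_to_mesh_axes(('embed', 'mlp'), rules)
print(spec)                       # PartitionSpec('fsdp', None), as for each wi_0/kernel
print(NamedSharding(mesh, spec))  # the placement each float32[2048,7168] kernel gets
```
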

I0422 16:36:46.773552 139095656052544 train.py:158] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 16:36:46.773650 139095656052544 train.py:158] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0422 16:36:46.788609 139095656052544 train.py:165] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 16:36:46.788671 139095656052544 train.py:165] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0422 16:37:43.542782 139095656052544 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.4 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0422 16:37:43.546536 139095656052544 metric_logger.py:289] number parameters: 1.104 billion
I0422 16:38:45.344189 139095656052544 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0422 16:38:45.639389 139095656052544 checkpointing.py:798] Waited 0.2951805591583252 seconds for step 0 to finish before starting checkpointing.
I0422 16:38:45.644732 139095656052544 checkpoint_manager.py:2013] [process=3][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0422 16:38:45.646778 139095656052544 checkpoint_manager.py:1518] [process=3] Saving checkpoint at step 0
I0422 16:38:45.649901 139095656052544 async_checkpointer.py:452] [process=3] Started async saving checkpoint to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260422_154647/linen_xpk_feat_nnx_set_defaults_true_20260422_154647_13_scan_layers_false/checkpoints/0.
I0422 16:38:46.507115 139095656052544 signaling_client.py:364] Using JaxDistributedSignalingClient
I0422 16:38:46.508380 139095656052544 jax_array_handlers.py:358] Scheduling D2H of 444 prioritized jax.Array.
I0422 16:38:46.509073 139095656052544 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 16:38:46.814261 139095656052544 base_pytree_checkpoint_handler.py:153] [process=3][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.307658s
I0422 16:38:46.814426 139095656052544 base_pytree_checkpoint_handler.py:129] [process=3] /jax/checkpoint/write/blocking_gbytes_per_sec: 1.820 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.8477001190185547 s) (per-host)
I0422 16:38:46.814476 139095656052544 base_pytree_checkpoint_handler.py:737] [process=3][thread=MainThread] Initiated Pytree async_save. Time taken: 0.847760s (batch_requests_ready=0.520951s, total_serialization_initiated=0.326743s, others=0.000066s)
I0422 16:38:46.814728 139095656052544 composite_checkpoint_handler.py:715] [process=3][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.852956s (all_items=0.000019s, per_item={'items': '0.00001884'}, temp_paths=0.852937)
I0422 16:38:46.815781 138975930996480 async_checkpointer.py:79] [process=3][thread=async_save] Background save thread started.
I0422 16:38:46.815975 139095656052544 async_checkpointer.py:561] Finished blocking save. Time taken: 1.169122s. Continuing background save to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260422_154647/linen_xpk_feat_nnx_set_defaults_true_20260422_154647_13_scan_layers_false/checkpoints/0.
I0422 16:38:47.035247 139095656052544 checkpoint_manager.py:1566] [process=3][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 16:38:47.035647 138972181264128 async_checkpointer.py:265] [process=3][thread=save_finalize] Waiting for background save thread=async_save.
I0422 16:38:47.035822 139095656052544 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260422_154647/linen_xpk_feat_nnx_set_defaults_true_20260422_154647_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776875925.6447146, 'wait_for_prev_duration_secs': 6.604194641113281e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776875925.646819, 'checkpointer_blocking_duration_secs': 1.1692614555358887, 'get_old_steps_start_time': 1776875926.8161016, 'get_old_steps_duration_secs': 2.574920654296875e-05, 'checkpoint_manager_blocking_start_time': 1776875925.642624, 'checkpoint_manager_blocking_duration_secs': 1.3931562900543213}
I0422 16:38:47.036149 139095656052544 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
I0422 16:38:47.036205 139095656052544 max_utils.py:750] 
Memstats: After params initialized:
I0422 16:38:47.036263 139095656052544 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_10(process=3,(2,2,0,0))
I0422 16:38:47.036298 139095656052544 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_11(process=3,(3,2,0,0))
I0422 16:38:47.036326 139095656052544 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_14(process=3,(2,3,0,0))
I0422 16:38:47.036353 139095656052544 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_15(process=3,(3,3,0,0))
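
The Memstats block above (0.82 of 31.25 GB per chip, i.e. the 2.62% in the summary table) can be reproduced through the per-device memory-stats API; a sketch, assuming a backend such as TPU that implements it (key names vary by backend):

```python
import jax

# Rough equivalent of the Memstats lines above; memory_stats() may return
# None on backends that don't implement it.
for d in jax.local_devices():
    stats = d.memory_stats() or {}
    used = stats.get('bytes_in_use', 0) / 2**30
    limit = stats.get('bytes_limit', 0) / 2**30
    print(f'\tUsing (GB) {used:.2f} / {limit:.2f} on {d}')
```
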
I0422 16:38:47.351025 139095656052544 metric_logger.py:185] completed step: 0, seconds: 61.798, TFLOP/s/device: 0.220, Tokens/s/device: 33.140, total_weights: 65536, loss: 10.877
I0422 16:38:47.487147 139095656052544 metric_logger.py:185] completed step: 1, seconds: 1.995, TFLOP/s/device: 6.810, Tokens/s/device: 1026.406, total_weights: 65536, loss: 10.877
I0422 16:38:47.905811 139095656052544 metric_logger.py:185] completed step: 2, seconds: 0.017, TFLOP/s/device: 778.008, Tokens/s/device: 117269.812, total_weights: 65536, loss: 10.268
I0422 16:38:48.042061 139095656052544 metric_logger.py:185] completed step: 3, seconds: 0.414, TFLOP/s/device: 32.840, Tokens/s/device: 4949.945, total_weights: 65536, loss: 9.741
I0422 16:38:48.319390 139095656052544 metric_logger.py:185] completed step: 4, seconds: 0.143, TFLOP/s/device: 95.295, Tokens/s/device: 14363.866, total_weights: 65536, loss: 9.285
I0422 16:38:48.331959 139095656052544 metric_logger.py:185] completed step: 5, seconds: 0.136, TFLOP/s/device: 99.897, Tokens/s/device: 15057.495, total_weights: 65536, loss: 8.901
I0422 16:39:12.135123 139095656052544 metric_logger.py:185] completed step: 6, seconds: 0.279, TFLOP/s/device: 48.753, Tokens/s/device: 7348.588, total_weights: 65536, loss: 8.602
I0422 16:39:12.271606 139095656052544 metric_logger.py:185] completed step: 7, seconds: 23.673, TFLOP/s/device: 0.574, Tokens/s/device: 86.513, total_weights: 65536, loss: 8.393
I0422 16:39:12.407837 139095656052544 metric_logger.py:185] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.807, Tokens/s/device: 14441.146, total_weights: 65536, loss: 8.264
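
Step times above are far from uniform: step 0 (61.8 s) is dominated by tracing and XLA compilation, and outliers like step 7 (23.7 s) line up with host-side stalls rather than device compute. One way to keep compile cost out of step timings is to compile ahead of time; a sketch with a stand-in step function, not MaxText's:

```python
import time
import jax
import jax.numpy as jnp

def step_fn(w, x):  # stand-in for a real train step
    return w - 1e-3 * x.T @ (x @ w)

w = jnp.ones((2048, 2048))
x = jnp.ones((32, 2048))
compiled = jax.jit(step_fn).lower(w, x).compile()  # pay compile cost once, up front

t0 = time.perf_counter()
jax.block_until_ready(compiled(w, x))
print(f'{time.perf_counter() - t0:.4f} s')  # steady-state step time only
```
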
I0422 16:39:12.543923 139095656052544 checkpointing.py:794] Waiting for step 9 to finish before checkpoint...
I0422 16:39:12.547764 139095656052544 checkpointing.py:798] Waited 0.0038564205169677734 seconds for step 9 to finish before starting checkpointing.
I0422 16:39:12.550843 139095656052544 checkpoint_manager.py:2024] [process=3][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 16:39:13.078036    2612 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0422 16:39:15.081387 138975939389184 array_metadata_store.py:203] [process=3][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260422_154647/linen_xpk_feat_nnx_set_defaults_true_20260422_154647_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_3
I0422 16:39:43.358095 138975930996480 base_pytree_checkpoint_handler.py:129] [process=3] /jax/checkpoint/write/gbytes_per_sec: 27.523 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 57.391308307647705 s) (per-host)
I0422 16:39:43.358220 138975930996480 async_checkpointer.py:90] [process=3][thread=async_save] 3 Handler Commit operations completed. Time taken: 56.542325s.
I0422 16:39:51.123001 138975930996480 async_checkpointer.py:144] [process=3][thread=async_save] Background save thread done. Time taken: 64.307090s.
I0422 16:39:51.123322 138972181264128 async_checkpointer.py:273] [process=3][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 16:39:51.123454 138972181264128 async_checkpointer.py:283] [process=3][thread=save_finalize] No errors found in background save thread=async_save.
I0422 16:39:51.123505 138972181264128 checkpoint_manager.py:2133] [process=3][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0422 16:39:51.125434 138972181264128 checkpoint_manager.py:2142] [process=3][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0422 16:39:51.125653 139095656052544 checkpoint_manager.py:2036] [process=3][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0422 16:39:51.125833 139095656052544 checkpoint_manager.py:1458] Waiting for previous save to complete took 38.574992 seconds. If this number is high, consider checkpointing less frequently.
I0422 16:39:51.128347 139095656052544 checkpoint_manager.py:1518] [process=3] Saving checkpoint at step 9
I0422 16:39:51.131750 139095656052544 async_checkpointer.py:452] [process=3] Started async saving checkpoint to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260422_154647/linen_xpk_feat_nnx_set_defaults_true_20260422_154647_13_scan_layers_false/checkpoints/9.
I0422 16:39:51.705780 139095656052544 jax_array_handlers.py:358] Scheduling D2H of 444 prioritized jax.Array.
I0422 16:39:51.706493 139095656052544 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 16:39:51.858508 139095656052544 base_pytree_checkpoint_handler.py:153] [process=3][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.154256s
I0422 16:39:51.858672 139095656052544 base_pytree_checkpoint_handler.py:129] [process=3] /jax/checkpoint/write/blocking_gbytes_per_sec: 3.531 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.43680906295776367 s) (per-host)
I0422 16:39:51.858721 139095656052544 base_pytree_checkpoint_handler.py:737] [process=3][thread=MainThread] Initiated Pytree async_save. Time taken: 0.436868s (batch_requests_ready=0.264885s, total_serialization_initiated=0.171917s, others=0.000067s)
I0422 16:39:51.859021 139095656052544 composite_checkpoint_handler.py:715] [process=3][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.441178s (all_items=0.000015s, per_item={'items': '0.00001478'}, temp_paths=0.441163)
I0422 16:39:51.860029 138973779683072 async_checkpointer.py:79] [process=3][thread=async_save] Background save thread started.
I0422 16:39:51.860188 139095656052544 async_checkpointer.py:561] Finished blocking save. Time taken: 0.731766s. Continuing background save to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260422_154647/linen_xpk_feat_nnx_set_defaults_true_20260422_154647_13_scan_layers_false/checkpoints/9.
I0422 16:39:51.869113 139095656052544 checkpoint_manager.py:1566] [process=3][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 16:39:51.869398 138972181264128 async_checkpointer.py:265] [process=3][thread=save_finalize] Waiting for background save thread=async_save.
I0422 16:39:51.869559 139095656052544 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260422_154647/linen_xpk_feat_nnx_set_defaults_true_20260422_154647_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776875952.550813, 'wait_for_prev_duration_secs': 38.574992418289185, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776875991.1283877, 'checkpointer_blocking_duration_secs': 0.7319085597991943, 'get_old_steps_start_time': 1776875991.860318, 'get_old_steps_duration_secs': 2.6702880859375e-05, 'checkpoint_manager_blocking_start_time': 1776875952.5487378, 'checkpoint_manager_blocking_duration_secs': 39.32078742980957}
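The `standard_logger` event above is a Python-literal dict (single-quoted keys, `None`, `False`), not JSON, so `ast.literal_eval` is the right parser for it. A minimal sketch for pulling out the blocking costs; the payload below is abridged from the line above, and in practice you would first slice off everything up to `standard_logger.py:34] `:

```python
import ast

# Abridged copy of the save event logged above (Python repr, not JSON).
payload = ("{'step': 9, 'event_type': 'save', "
           "'wait_for_prev_duration_secs': 38.574992418289185, "
           "'checkpointer_blocking_duration_secs': 0.7319085597991943, "
           "'checkpoint_manager_blocking_duration_secs': 39.32078742980957}")

# literal_eval handles the single quotes and the None/False values in the
# full log line, which json.loads would reject.
event = ast.literal_eval(payload)

# Almost all of the 39.3 s the manager blocked was spent waiting for the
# previous (step 0) save to finish, not on the step 9 save itself:
print(event["wait_for_prev_duration_secs"])        # 38.57
print(event["checkpointer_blocking_duration_secs"])  # 0.73
```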
I0422 16:39:51.869757 139095656052544 checkpointing.py:409] Started an asynchronous checkpoint save for step 9
I0422 16:39:51.869810 139095656052544 checkpoint_manager.py:2024] [process=3][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 16:39:56.689564 138975939389184 array_metadata_store.py:203] [process=3][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260422_154647/linen_xpk_feat_nnx_set_defaults_true_20260422_154647_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_3
I0422 16:40:33.468759 138973779683072 base_pytree_checkpoint_handler.py:129] [process=3] /jax/checkpoint/write/gbytes_per_sec: 37.568 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.04685878753662 s) (per-host)
I0422 16:40:33.468870 138973779683072 async_checkpointer.py:90] [process=3][thread=async_save] 3 Handler Commit operations completed. Time taken: 41.608732s.
I0422 16:40:40.838405 138973779683072 async_checkpointer.py:144] [process=3][thread=async_save] Background save thread done. Time taken: 48.978251s.
I0422 16:40:40.838738 138972181264128 async_checkpointer.py:273] [process=3][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 16:40:40.838868 138972181264128 async_checkpointer.py:283] [process=3][thread=save_finalize] No errors found in background save thread=async_save.
I0422 16:40:40.838917 138972181264128 checkpoint_manager.py:2133] [process=3][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0422 16:40:40.840430 138972181264128 checkpoint_manager.py:2142] [process=3][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0422 16:40:40.840606 139095656052544 checkpoint_manager.py:2036] [process=3][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0422 16:40:40.840752 139095656052544 checkpoint_manager.py:2013] [process=3][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
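The checkpoint sequence above is Orbax's async-save pattern: a fast blocking device-to-host copy (3.531 GiB/s on the step 9 save), a background `async_save` thread that commits to GCS at tens of MiB/s, and a `save_finalize` thread that syncs all hosts before `wait_until_finished` returns. Below is a minimal sketch of the same pattern using `orbax.checkpoint`'s `CheckpointManager`; the directory, toy state, and `save_interval_steps=9` (which would save at steps 0 and 9, matching the log) are our placeholders, not this run's actual MaxText configuration:

```python
import numpy as np
import orbax.checkpoint as ocp

# Placeholder local directory and toy state; the run above checkpoints a full
# MaxText train state to a gs:// bucket through MaxText's checkpointing layer.
directory = "/tmp/orbax_async_demo"
state = {"step": np.int32(0), "params": np.zeros((4, 4), np.float32)}

options = ocp.CheckpointManagerOptions(save_interval_steps=9)
mngr = ocp.CheckpointManager(directory, options=options)

for step in range(10):
    state["step"] = np.int32(step)
    # save() returns after the blocking device-to-host copy; serialization to
    # storage continues on a background thread (the 'async_save' thread above).
    mngr.save(step, args=ocp.args.StandardSave(state))

mngr.wait_until_finished()  # join outstanding background saves before exiting
```

If a save at step N+1 starts before the background save for step N has committed, the manager blocks, which is exactly what the `Waiting for previous save to complete took 38.574992 seconds` warning above is reporting.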
I0422 16:40:40.841423 139095656052544 metric_logger.py:185] completed step: 9, seconds: 0.136, TFLOP/s/device: 99.587, Tokens/s/device: 15010.811, total_weights: 65536, loss: 8.188
Per train step:
 Total TFLOPs: 13.59
 split as 93.93% learnable weight flops and 6.07% attention flops
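This summary ties back to the per-step metrics: dividing the per-step figure by the step wall time reproduces the logged TFLOP/s/device, which also indicates that the 13.59 is a per-device number. A quick check (ours), with the small gap again down to the rounded seconds field:

```python
# Consistency check (ours): per-step TFLOPs / step seconds ~= TFLOP/s/device.
tflops_per_step = 13.59  # 'Total TFLOPs' from the summary above
step_seconds = 0.136     # step 9 wall time, as logged

print(tflops_per_step / step_seconds)  # ~99.9 vs logged 99.587
```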
XPK End: Wed Apr 22 16:40:51 UTC 2026
EXIT_CODE=0