MaxView

Case: 13_scan_layers_false

Metrics: Linen vs NNX  ·  main

Metric        Linen (574ad3fb9)    NNX (574ad3fb9)    Diff (NNX − Linen)
Parameters    1.104 billion        —                  —
Final loss    8.1880               —                  —
TFLOP/s       99.847               —                  —
Tok/s         15050.1              —                  —
Avg s/step    3.091                —                  —
Memory %      2.62                 —                  —
JAX           0.9.2                0.9.2              —

Diff = NNX value − Linen value (green = NNX improved, red = NNX regressed in the rendered view). The NNX run failed before training started (see its log below), so no NNX metrics or diffs are available.
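For reference, a minimal sketch of how the Diff column can be computed from the two runs' metric dictionaries. The values below are taken from the Linen log; the helper name metric_diff is hypothetical, not MaxView's actual code:

```python
# Hypothetical Diff-column helper; values copied from the Linen log above.
linen = {"final_loss": 8.1880, "tflops_per_device": 99.847, "tok_per_device": 15050.1}
nnx = {}  # the NNX run crashed before logging any metrics (see its log below)

def metric_diff(linen_metrics, nnx_metrics):
    """Diff = NNX value - Linen value; None when the NNX side is missing."""
    return {
        k: (nnx_metrics[k] - v) if k in nnx_metrics else None
        for k, v in linen_metrics.items()
    }

print(metric_diff(linen, nnx))  # every diff is None for this failed NNX run
```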

Linen  ·  574ad3fb9  ·  main_20260418_180002  ·  full log
XPK Start: Sat Apr 18 18:56:50 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
2026-04-18 18:57:14.752594: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0418 18:57:14.932849 136152602167104 max_utils.py:273] Attempting to initialize the jax distributed system...
I0418 18:57:23.973790 136152602167104 distributed.py:149] Starting JAX distributed service on [::]:8482
I0418 18:57:23.976205 136152602167104 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-pyte2-slice-job-0-0.mt-13-scan-layers-false-pyte2:8482
I0418 18:57:25.569263 136152602167104 max_utils.py:284] Jax distributed system initialized!
I0418 18:57:31.594362 136152602167104 max_utils.py:800] System Information: Jax Version: 0.9.2
I0418 18:57:31.594462 136152602167104 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0418 18:57:31.594502 136152602167104 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0418 18:57:31.594536 136152602167104 train_utils.py:348] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0418 18:57:32.290705 136152602167104 maxtext_utils.py:1551] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0418 18:57:32.290979 136152602167104 checkpointing.py:677] Setting up checkpoint logger...
I0418 18:57:32.291036 136152602167104 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0418 18:57:32.291102 136152602167104 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0418 18:57:32.291444 136152602167104 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7bd3f0b01a90>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0418 18:57:36.489501 136152602167104 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0418 18:57:36.489735 136152602167104 checkpoint_manager.py:708] [process=7][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7bd2bc23d640>}, handler_registry=None
I0418 18:57:36.489973 136152602167104 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7bd2bc23d640>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0418 18:57:36.490020 136152602167104 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7bbfb80c0290>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0418 18:57:36.490055 136152602167104 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7bd2bc23d640>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7bd2bc23d640>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7bbfb80c0290>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7bbfb80c0290>}).
I0418 18:57:36.490385 136152602167104 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0418 18:57:36.490456 136152602167104 async_checkpointer.py:192] [process=7][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7bbf305dbd80> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0418 18:57:38.061012 136152602167104 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_main_20260418_180002/linen_xpk_main_20260418_180002_13_scan_layers_false/checkpoints
I0418 18:57:38.500329 136152602167104 checkpoint_manager.py:929] [process=7][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260418_180002/linen_xpk_main_20260418_180002_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7bbfb821e3c0>
I0418 18:57:38.500499 136152602167104 checkpointing.py:301] Checkpoint manager created!
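The manager above is configured for asynchronous saves on a fixed interval (FixedIntervalPolicy(interval=10)). A minimal standalone Orbax sketch of that setup, under the assumption that a local directory and toy state stand in for the gs://lance-maxtext/... bucket and real train state; MaxText wraps this in checkpointing.py:

```python
from pathlib import Path
import jax.numpy as jnp
import orbax.checkpoint as ocp

directory = Path("/tmp/ckpts")  # the real run targets a gs:// bucket
state = {"params": {"w": jnp.zeros((4, 4))}, "step": jnp.int32(0)}  # toy state

options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,           # fixed-interval policy, as in the log
    enable_async_checkpointing=True,  # writes continue in a background thread
)
with ocp.CheckpointManager(directory, options=options) as mgr:
    mgr.save(0, args=ocp.args.StandardSave(state))
    mgr.wait_until_finished()  # block until the background save commits
```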
I0418 18:57:39.442837 136152602167104 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0418 18:57:39.442944 136152602167104 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0418 18:57:39.823328 136152602167104 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0418 18:57:39.823416 136152602167104 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0418 18:57:39.839395 136152602167104 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0418 18:57:39.839462 136152602167104 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0418 18:57:39.869009 136152602167104 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0418 18:57:39.869091 136152602167104 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0418 18:57:39.885274 136152602167104 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0418 18:57:39.885334 136152602167104 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0418 18:57:39.901247 136152602167104 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0418 18:57:39.901306 136152602167104 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0418 18:57:39.930583 136152602167104 attentions.py:1197] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0418 18:57:39.930654 136152602167104 attentions.py:1197] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0418 18:57:39.952971 136152602167104 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0418 18:57:39.953032 136152602167104 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0418 18:57:42.779659 136152602167104 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0418 18:57:42.779789 136152602167104 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0418 18:57:49.954679 136152602167104 maxtext_utils.py:1654]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0418 18:57:49.954814 136152602167104 maxtext_utils.py:1654]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0418 18:57:49.954869 136152602167104 maxtext_utils.py:1654]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   P('embed', 'mlp')
    Physical:  ('fsdp', None)
I0418 18:57:49.954926 136152602167104 maxtext_utils.py:1654]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   P('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0418 18:57:49.954966 136152602167104 maxtext_utils.py:1654]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0418 18:57:49.955003 136152602167104 maxtext_utils.py:1654]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0418 18:57:49.955055 136152602167104 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0418 18:57:49.955122 136152602167104 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   P('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0418 18:57:49.955164 136152602167104 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0418 18:57:49.955201 136152602167104 maxtext_utils.py:1654]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   P('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
[... entries for layers_1 through layers_15 omitted: each of the 16 decoder layers logs the same nine parameters with shapes, logical specs, and physical shardings identical to layers_0 above ...]
I0418 18:57:49.959426 136152602167104 maxtext_utils.py:1654]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0418 18:57:49.959471 136152602167104 maxtext_utils.py:1654]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
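These Logical → Physical mappings come from MaxText's logical axis rules: 'embed' (and 'embed_vocab') map onto the 32-way 'fsdp' mesh axis, while axes like 'mlp', 'norm', and 'heads' stay replicated. A minimal JAX sketch of that translation, using a toy rule set and whatever devices are locally available (the run above used a 32-chip mesh; array dims must be divisible by the device count):

```python
import jax
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# 1-D 'fsdp' mesh over the available devices ("fsdp: 32" in the log).
mesh = Mesh(np.array(jax.devices()), axis_names=("fsdp",))

# Toy logical->physical rules mirroring the log: 'embed' shards over
# 'fsdp'; 'mlp' and 'norm' stay replicated (None).
rules = {"embed": "fsdp", "mlp": None, "norm": None}

def to_physical(*logical_axes):
    return P(*(rules.get(name) for name in logical_axes))

# e.g. an mlp/wi_0 kernel: Logical P('embed', 'mlp') -> Physical ('fsdp', None)
kernel = jax.device_put(
    np.zeros((2048, 7168), np.float32),
    NamedSharding(mesh, to_physical("embed", "mlp")),
)
print(kernel.sharding.spec)  # PartitionSpec('fsdp', None)
```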
I0418 18:57:54.420623 136152602167104 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0418 18:57:54.420832 136152602167104 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0418 18:57:54.436335 136152602167104 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0418 18:57:54.436399 136152602167104 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0418 18:58:51.330861 136152602167104 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0418 18:58:51.331996 136152602167104 metric_logger.py:301] number parameters: 1.104 billion
I0418 18:59:52.267959 136152602167104 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0418 18:59:52.505463 136152602167104 checkpointing.py:776] Waited 0.23748469352722168 seconds for step 0 to finish before starting checkpointing.
I0418 18:59:52.508656 136152602167104 checkpoint_manager.py:2009] [process=7][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0418 18:59:52.510640 136152602167104 checkpoint_manager.py:1512] [process=7] Saving checkpoint at step 0
I0418 18:59:52.512056 136152602167104 event_tracking.py:70] [process=7] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260418_180002/linen_xpk_main_20260418_180002_13_scan_layers_false/checkpoints/0.
I0418 18:59:52.876470 136152602167104 signaling_client.py:364] Using JaxDistributedSignalingClient
I0418 18:59:52.877538 136152602167104 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0418 18:59:52.877661 136152602167104 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0418 18:59:53.179359 136152602167104 base_pytree_checkpoint_handler.py:154] [process=7][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.303424s
I0418 18:59:53.179529 136152602167104 base_pytree_checkpoint_handler.py:130] [process=7] /jax/orbax/write/blocking_gbytes_per_sec: 4.570 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.33757925033569336 s) (per-host)
I0418 18:59:53.179580 136152602167104 base_pytree_checkpoint_handler.py:768] [process=7][thread=MainThread] Initiated Pytree async_save. Time taken: 0.337639s (batch_requests_ready=0.017357s, total_serialization_initiated=0.320213s, others=0.000068s)
I0418 18:59:53.179841 136152602167104 composite_checkpoint_handler.py:715] [process=7][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.341888s (all_items=0.000024s, per_item={'items': '0.00002384'}, temp_paths=0.341864)
I0418 18:59:53.180628 136152602167104 event_tracking.py:125] [process=7] [async] Finished blocking save in 0.67 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260418_180002/linen_xpk_main_20260418_180002_13_scan_layers_false/checkpoints/0.
I0418 18:59:53.180979 136024759211776 async_checkpointer.py:76] [process=7][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-18 19:19:53.180939
I0418 18:59:53.231308 136152602167104 checkpoint_manager.py:1560] [process=7][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0418 18:59:53.231671 136024731928320 async_checkpointer.py:280] [process=7][thread=save_finalize] Waiting for background save thread=async_save.
I0418 18:59:53.231834 136152602167104 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260418_180002/linen_xpk_main_20260418_180002_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776538792.5086386, 'wait_for_prev_duration_secs': 5.9604644775390625e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776538792.510678, 'checkpointer_blocking_duration_secs': 0.6704204082489014, 'get_old_steps_start_time': 1776538793.1811235, 'get_old_steps_duration_secs': 3.123283386230469e-05, 'checkpoint_manager_blocking_start_time': 1776538792.5065403, 'checkpoint_manager_blocking_duration_secs': 0.725252628326416}
I0418 18:59:53.232034 136152602167104 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0418 18:59:53.232103 136152602167104 max_utils.py:750] 
Memstats: After params initialized:
I0418 18:59:53.232158 136152602167104 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_26(process=7,(2,6,0,0))
I0418 18:59:53.232192 136152602167104 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_27(process=7,(3,6,0,0))
I0418 18:59:53.232220 136152602167104 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_30(process=7,(2,7,0,0))
I0418 18:59:53.232244 136152602167104 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_31(process=7,(3,7,0,0))
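This per-device usage is where the table's "Memory %" row comes from; a one-line check:

```python
print(0.82 / 31.25 * 100)  # 2.624 -> rounded to 2.62 in the metrics table
```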
I0418 18:59:53.588800 136152602167104 metric_logger.py:196] completed step: 0, seconds: 60.936, TFLOP/s/device: 0.223, Tokens/s/device: 33.609, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0418 18:59:53.742103 136152602167104 metric_logger.py:196] completed step: 1, seconds: 1.312, TFLOP/s/device: 10.356, Tokens/s/device: 1560.926, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0418 18:59:54.157780 136152602167104 metric_logger.py:196] completed step: 2, seconds: 0.026, TFLOP/s/device: 526.347, Tokens/s/device: 79336.794, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28796.281
I0418 18:59:54.294134 136152602167104 metric_logger.py:196] completed step: 3, seconds: 0.416, TFLOP/s/device: 32.634, Tokens/s/device: 4919.021, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 16999.387
I0418 18:59:54.570574 136152602167104 metric_logger.py:196] completed step: 4, seconds: 0.143, TFLOP/s/device: 95.210, Tokens/s/device: 14351.083, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10779.387
I0418 18:59:54.582137 136152602167104 metric_logger.py:196] completed step: 5, seconds: 0.137, TFLOP/s/device: 99.288, Tokens/s/device: 14965.728, total_weights: 65536, loss: 8.901, lm_loss: 8.901, perplexity: 7336.172
I0418 18:59:57.066171    2871 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0418 18:59:59.571658 136024740321024 array_metadata_store.py:203] [process=7][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260418_180002/linen_xpk_main_20260418_180002_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_7
I0418 19:00:19.946502 136152602167104 metric_logger.py:196] completed step: 6, seconds: 0.277, TFLOP/s/device: 49.095, Tokens/s/device: 7400.127, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.530
I0418 19:00:20.082823 136152602167104 metric_logger.py:196] completed step: 7, seconds: 25.233, TFLOP/s/device: 0.538, Tokens/s/device: 81.164, total_weights: 65536, loss: 8.393, lm_loss: 8.393, perplexity: 4418.210
I0418 19:00:20.219590 136152602167104 metric_logger.py:196] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.374, Tokens/s/device: 14375.763, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.529
I0418 19:00:20.355186 136152602167104 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0418 19:00:20.358473 136152602167104 checkpointing.py:776] Waited 0.003306150436401367 seconds for step 9 to finish before starting checkpointing.
I0418 19:00:20.361484 136152602167104 checkpoint_manager.py:2020] [process=7][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0418 19:00:25.191223 136024759211776 base_pytree_checkpoint_handler.py:130] [process=7] /jax/orbax/write/gbytes_per_sec: 48.830 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 32.349228858947754 s) (per-host)
I0418 19:00:25.191350 136024759211776 async_checkpointer.py:90] [process=7][thread=async_save] 3 Handler Commit operations completed. Time taken: 32.009579s.
I0418 19:00:38.102970 136024759211776 async_checkpointer.py:160] [process=7][thread=async_save] Background save thread done. Time taken: 44.921182s.
I0418 19:00:38.103275 136024731928320 async_checkpointer.py:288] [process=7][thread=save_finalize] Done with waiting for background save thread=async_save.
I0418 19:00:38.103402 136024731928320 async_checkpointer.py:298] [process=7][thread=save_finalize] No errors found in background save thread=async_save.
I0418 19:00:38.103451 136024731928320 checkpoint_manager.py:2137] [process=7][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0418 19:00:38.105328 136024731928320 checkpoint_manager.py:2146] [process=7][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0418 19:00:38.105505 136152602167104 checkpoint_manager.py:2032] [process=7][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0418 19:00:38.105658 136152602167104 checkpoint_manager.py:1452] Waiting for previous save to complete took 17.744171 seconds. If this number is high, consider checkpointing less frequently.
I0418 19:00:38.108541 136152602167104 checkpoint_manager.py:1512] [process=7] Saving checkpoint at step 9
I0418 19:00:38.110665 136152602167104 event_tracking.py:70] [process=7] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260418_180002/linen_xpk_main_20260418_180002_13_scan_layers_false/checkpoints/9.
I0418 19:00:38.854331 136152602167104 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0418 19:00:38.854505 136152602167104 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0418 19:00:38.981114 136152602167104 base_pytree_checkpoint_handler.py:154] [process=7][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.128184s
I0418 19:00:38.981267 136152602167104 base_pytree_checkpoint_handler.py:130] [process=7] /jax/orbax/write/blocking_gbytes_per_sec: 9.588 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.1608879566192627 s) (per-host)
I0418 19:00:38.981317 136152602167104 base_pytree_checkpoint_handler.py:768] [process=7][thread=MainThread] Initiated Pytree async_save. Time taken: 0.160946s (batch_requests_ready=0.016801s, total_serialization_initiated=0.144079s, others=0.000066s)
I0418 19:00:38.981596 136152602167104 composite_checkpoint_handler.py:715] [process=7][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.165335s (all_items=0.000016s, per_item={'items': '0.00001621'}, temp_paths=0.165319)
I0418 19:00:38.982385 136152602167104 event_tracking.py:125] [process=7] [async] Finished blocking save in 0.87 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260418_180002/linen_xpk_main_20260418_180002_13_scan_layers_false/checkpoints/9.
I0418 19:00:38.982660 136024731928320 async_checkpointer.py:76] [process=7][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-18 19:20:38.982637
I0418 19:00:38.988909 136152602167104 checkpoint_manager.py:1560] [process=7][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0418 19:00:38.989223 136024223115008 async_checkpointer.py:280] [process=7][thread=save_finalize] Waiting for background save thread=async_save.
I0418 19:00:38.989390 136152602167104 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260418_180002/linen_xpk_main_20260418_180002_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776538820.361453, 'wait_for_prev_duration_secs': 17.744171142578125, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776538838.1085835, 'checkpointer_blocking_duration_secs': 0.8741734027862549, 'get_old_steps_start_time': 1776538838.9827812, 'get_old_steps_duration_secs': 3.170967102050781e-05, 'checkpoint_manager_blocking_start_time': 1776538820.3593597, 'checkpoint_manager_blocking_duration_secs': 18.629996299743652}
I0418 19:00:38.989601 136152602167104 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0418 19:00:38.989650 136152602167104 checkpoint_manager.py:2020] [process=7][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0418 19:00:46.451692 136024742426368 array_metadata_store.py:203] [process=7][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260418_180002/linen_xpk_main_20260418_180002_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_7
I0418 19:01:22.428010 136024731928320 base_pytree_checkpoint_handler.py:130] [process=7] /jax/orbax/write/gbytes_per_sec: 36.223 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 43.607598304748535 s) (per-host)
I0418 19:01:22.428148 136024731928320 async_checkpointer.py:90] [process=7][thread=async_save] 3 Handler Commit operations completed. Time taken: 43.445414s.
I0418 19:01:31.829513 136024731928320 async_checkpointer.py:160] [process=7][thread=async_save] Background save thread done. Time taken: 52.846764s.
I0418 19:01:31.829781 136024223115008 async_checkpointer.py:288] [process=7][thread=save_finalize] Done with waiting for background save thread=async_save.
I0418 19:01:31.829897 136024223115008 async_checkpointer.py:298] [process=7][thread=save_finalize] No errors found in background save thread=async_save.
I0418 19:01:31.829946 136024223115008 checkpoint_manager.py:2137] [process=7][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0418 19:01:31.832213 136024223115008 checkpoint_manager.py:2146] [process=7][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0418 19:01:31.832396 136152602167104 checkpoint_manager.py:2032] [process=7][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0418 19:01:31.832546 136152602167104 checkpoint_manager.py:2009] [process=7][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0418 19:01:31.834918 136152602167104 metric_logger.py:196] completed step: 9, seconds: 0.136, TFLOP/s/device: 99.847, Tokens/s/device: 15050.081, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3598.874
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
XPK End: Sat Apr 18 19:01:40 UTC 2026
EXIT_CODE=0
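The final-step numbers are internally consistent: per-device throughput is the per-step FLOP count divided by the step wall time. A quick check from the logged step-9 quantities (the 13.59 TFLOPs figure is per device; small rounding differences remain):

```python
# Reproduce the logged step-9 throughput from the logged quantities.
tflops_per_step_per_device = 13.59  # "Total TFLOPs" per train step
step_seconds = 0.136                # step 9 wall time
tokens_per_step = 65536             # total_weights (global batch x seq len)
num_devices = 32

print(tflops_per_step_per_device / step_seconds)     # ~99.9 TFLOP/s/device (log: 99.847)
print(tokens_per_step / num_devices / step_seconds)  # ~15059 tok/s/device (log: 15050.081)
```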
NNX  ·  574ad3fb9  ·  main_20260418_180002  ·  full log
XPK Start: Sat Apr 18 20:20:13 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
2026-04-18 20:20:38.194261: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0418 20:20:38.374964 132605319829312 max_utils.py:273] Attempting to initialize the jax distributed system...
I0418 20:20:47.417838 132605319829312 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-nh2fd-slice-job-0-0.mt-13-scan-layers-false-nh2fd:8482
I0418 20:20:47.423214 132605319829312 max_utils.py:284] Jax distributed system initialized!
I0418 20:20:52.796474 132605319829312 max_utils.py:800] System Information: Jax Version: 0.9.2
I0418 20:20:52.796584 132605319829312 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0418 20:20:52.796622 132605319829312 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0418 20:20:52.796653 132605319829312 train_utils.py:348] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 744, in <module>
    app.run(main)
  File "/usr/local/lib/python3.12/site-packages/absl/app.py", line 367, in run
    _run_main(main, args)
  File "/usr/local/lib/python3.12/site-packages/absl/app.py", line 312, in _run_main
    sys.exit(main(argv))
             ^^^^^^^^^^
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 740, in main
    train_func()
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 730, in train_func
    run(config, recorder, diagnostic_config)
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 709, in run
    train_loop(config, recorder)
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 536, in train_loop
    ) = train_utils.setup_train_loop(config, recorder)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/deps/src/maxtext/utils/train_utils.py", line 217, in setup_train_loop
    raise NotImplementedError("Pure NNX support has not been implemented yet.")
NotImplementedError: Pure NNX support has not been implemented yet.
XPK End: Sat Apr 18 20:20:59 UTC 2026
EXIT_CODE=1
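The NNX run fails fast in setup_train_loop rather than mid-training. A hypothetical reconstruction of that kind of early guard, where only the raise at train_utils.py:217 is taken from the traceback and the config flag name is invented for illustration:

```python
# Hypothetical sketch of the guard behind the traceback above; the real
# config attribute that selects pure NNX is not shown in the log.
def setup_train_loop(config, recorder):
    if getattr(config, "pure_nnx", False):  # assumed flag name
        raise NotImplementedError("Pure NNX support has not been implemented yet.")
    ...  # build mesh, model, optimizer, data iterator, checkpoint manager
```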