MaxView

Case: 13_scan_layers_false

Metrics: Linen vs NNX  ·  test/pipeline-scan-nnx

Metric        Linen (e27fc1e97)    NNX (e27fc1e97)    Diff (NNX − Linen)
Parameters    1.104 billion        —                  —
Final loss    8.1880               —                  —
TFLOP/s       99.643               —                  —
Tok/s         15019.3              —                  —
Avg s/step    3.782                —                  —
Memory %      2.62                 —                  —
JAX           0.8.1                0.8.1              —

Diff = NNX value − Linen value. Green = NNX improved. Red = NNX regressed.
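As a sketch of how the Diff column and its coloring would be derived, note that "improved" depends on metric polarity; the set of lower-is-better metrics below is an assumption for illustration, not part of the page:

    # Hypothetical helper mirroring the Diff column; metric polarity is assumed.
    LOWER_IS_BETTER = {"Final loss", "Avg s/step", "Memory %"}

    def diff_and_verdict(metric: str, linen: float, nnx: float):
        diff = nnx - linen  # Diff (NNX − Linen)
        improved = diff < 0 if metric in LOWER_IS_BETTER else diff > 0
        return diff, "green (improved)" if improved else "red (regressed)"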

Linen  ·  e27fc1e97  ·  test_pipeline_scan_nnx_20260421_005327  ·  full log
XPK Start: Tue Apr 21 01:34:39 UTC 2026
2026-04-21 01:34:43.659412: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1776735283.672276      10 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1776735283.675935      10 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1776735283.687092      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776735283.687112      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776735283.687114      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776735283.687116      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2026-04-21 01:35:02.802255: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0421 01:35:03.317375 137334409234240 max_utils.py:273] Attempting to initialize the jax distributed system...
INFO:2026-04-21 01:35:12,357:jax._src.distributed:140: Starting JAX distributed service on [::]:8482
I0421 01:35:12.357978 137334409234240 distributed.py:140] Starting JAX distributed service on [::]:8482
INFO:2026-04-21 01:35:12,360:jax._src.distributed:157: Connecting to JAX distributed service on mt-13-scan-layers-false-iu30d-slice-job-0-0.mt-13-scan-layers-false-iu30d:8482
I0421 01:35:12.360414 137334409234240 distributed.py:157] Connecting to JAX distributed service on mt-13-scan-layers-false-iu30d-slice-job-0-0.mt-13-scan-layers-false-iu30d:8482
I0421 01:35:13.509511 137334409234240 max_utils.py:284] Jax distributed system initialized!
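The lines above are JAX's multi-host setup. A minimal sketch of the equivalent call, assuming 8 host processes driving the 32 chips (this log is from process 4); on Cloud TPU all three arguments can usually be auto-detected and omitted:

    import jax

    # Coordinator address and port taken from the log lines above.
    jax.distributed.initialize(
        coordinator_address=("mt-13-scan-layers-false-iu30d-slice-job-0-0"
                             ".mt-13-scan-layers-false-iu30d:8482"),
        num_processes=8,   # assumption: 32 chips / 4 chips per host
        process_id=4,      # this host, per "[process=4]" later in the log
    )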
I0421 01:35:19.453481 137334409234240 max_utils.py:800] System Information: Jax Version: 0.8.1
I0421 01:35:19.453587 137334409234240 max_utils.py:801] System Information: Jaxlib Version: 0.8.1
I0421 01:35:19.453628 137334409234240 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Nov 12 2025 14:16:36 (1762985796) cl/831091709
I0421 01:35:19.453663 137334409234240 train_utils.py:347] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0421 01:35:20.069716 137334409234240 maxtext_utils.py:1398] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0421 01:35:20.070011 137334409234240 checkpointing.py:677] Setting up checkpoint logger...
I0421 01:35:20.070068 137334409234240 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0421 01:35:20.070112 137334409234240 pytree_checkpoint_handler.py:589] save_device_host_concurrent_bytes=None
I0421 01:35:20.070468 137334409234240 base_pytree_checkpoint_handler.py:415] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7ce7295752b0>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0421 01:35:22.853481 137334409234240 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0421 01:35:22.853843 137334409234240 checkpoint_manager.py:709] [process=4][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7cd4bdbd4140>}, handler_registry=None
I0421 01:35:22.854078 137334409234240 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7cd4bdbd4140>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0421 01:35:22.854126 137334409234240 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7cd4bf003740>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0421 01:35:22.854168 137334409234240 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7cd4bdbd4140>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7cd4bdbd4140>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7cd4bf003740>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7cd4bf003740>}).
I0421 01:35:22.856835 137334409234240 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.33
I0421 01:35:22.856916 137334409234240 async_checkpointer.py:177] [process=4][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7ce7129fb2e0> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0421 01:35:24.621212 137334409234240 checkpoint_manager.py:1818] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260421_005327/linen_xpk_test_pipeline_scan_nnx_20260421_005327_13_scan_layers_false/checkpoints
I0421 01:35:25.010654 137334409234240 checkpoint_manager.py:929] [process=4][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260421_005327/linen_xpk_test_pipeline_scan_nnx_20260421_005327_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7cd4beffb200>
I0421 01:35:25.010870 137334409234240 checkpointing.py:301] Checkpoint manager created!
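A minimal Orbax sketch of the manager configured above (bucket path shortened). Note from the logged options that the effective cadence is governed by save_decision_policy=FixedIntervalPolicy(interval=10) even though save_interval_steps=1:

    import orbax.checkpoint as ocp

    options = ocp.CheckpointManagerOptions(
        save_interval_steps=1,            # as logged; cadence actually driven by
                                          # FixedIntervalPolicy(interval=10)
        enable_async_checkpointing=True,  # async writes, 600 s barrier timeout
        create=True,
    )
    mngr = ocp.CheckpointManager("gs://.../checkpoints", options=options)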
I0421 01:35:25.938241 137334409234240 nnx_wrappers.py:453] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0421 01:35:25.938385 137334409234240 nnx_wrappers.py:453] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0421 01:35:26.739434 137334409234240 attentions.py:1084] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0421 01:35:26.739563 137334409234240 attentions.py:1084] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0421 01:35:26.755448 137334409234240 attentions.py:1085] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0421 01:35:26.755540 137334409234240 attentions.py:1085] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0421 01:35:26.782152 137334409234240 attentions.py:1150] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0421 01:35:26.782256 137334409234240 attentions.py:1150] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 01:35:26.798225 137334409234240 attentions.py:1151] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0421 01:35:26.798318 137334409234240 attentions.py:1151] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 01:35:26.814049 137334409234240 attentions.py:1152] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0421 01:35:26.814138 137334409234240 attentions.py:1152] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 01:35:26.839105 137334409234240 attentions.py:1193] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0421 01:35:26.839204 137334409234240 attentions.py:1193] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 01:35:26.860959 137334409234240 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0421 01:35:26.861052 137334409234240 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
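These Logical/Physical pairs come from Flax's logical partitioning: activations are annotated with logical axis names, and a rule table maps those to mesh axes. A sketch with an assumed rule set that reproduces the ('fsdp', None, None) placements above (it must run under the device mesh, see the mesh sketch below):

    import jax.numpy as jnp
    import flax.linen as nn

    x = jnp.zeros((32, 2048, 2048), dtype=jnp.bfloat16)  # activation shape from the log
    rules = (("activation_batch", "fsdp"),       # assumed: batch-like axes shard
             ("activation_norm_length", None),   # over 'fsdp', the rest replicate
             ("activation_embed", None))
    with nn.logical_axis_rules(rules):
        x = nn.with_logical_constraint(
            x, ("activation_batch", "activation_norm_length", "activation_embed"))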
I0421 01:35:29.895466 137334409234240 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0421 01:35:29.895593 137334409234240 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
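The entire dump that follows obeys one pattern: logical parameter axes ('embed', 'mlp', 'norm', ...) map onto a single 32-way 'fsdp' mesh axis, and every other axis is replicated. A sketch of the equivalent explicit sharding for one wi_0 kernel (variable names are illustrative):

    import numpy as np
    import jax
    from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

    mesh = Mesh(np.array(jax.devices()), axis_names=("fsdp",))  # fsdp: 32
    # 'embed' -> 'fsdp', 'mlp' -> replicated, i.e. Physical ('fsdp', None)
    wi0_sharding = NamedSharding(mesh, P("fsdp", None))         # float32[2048,7168]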
I0421 01:35:38.642095 137334409234240 maxtext_utils.py:1501]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.642236 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_0/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.642292 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_0/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.642350 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_0/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.642390 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_0/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.642426 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.642479 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_0/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.642530 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_0/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.642568 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_0/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.642602 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_0/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.642638 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_1/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.642673 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_1/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.642722 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_1/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.642759 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_1/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.642792 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.642825 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_1/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.642858 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_1/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.642891 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_1/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.642923 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_1/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.642954 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_10/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.642985 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_10/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.643015 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_10/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.643044 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_10/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.643073 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.643104 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_10/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.643136 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_10/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.643167 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_10/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.643197 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_10/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.643232 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_11/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.643262 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_11/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.643291 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_11/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.643320 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_11/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.643348 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.643378 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_11/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.643408 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_11/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.643437 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_11/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.643466 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_11/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.643496 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_12/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.643525 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_12/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.643554 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_12/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.643582 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_12/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.643610 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.643639 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_12/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.643668 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_12/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.643707 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_12/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.643739 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_12/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.643769 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_13/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.643799 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_13/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.643827 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_13/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.643872 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_13/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.643920 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.643961 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_13/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.643993 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_13/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.644039 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_13/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.644073 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_13/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.644104 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_14/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.644135 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_14/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.644165 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_14/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.644193 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_14/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.644225 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.644255 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_14/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.644285 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_14/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.644315 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_14/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.644349 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_14/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.644381 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_15/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.644412 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_15/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.644443 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_15/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.644471 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_15/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.644499 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.644528 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_15/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.644557 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_15/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.644587 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_15/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.644616 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_15/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.644646 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_2/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.644675 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_2/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.644716 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_2/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.644747 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_2/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.644775 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.644806 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_2/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.644836 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_2/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.644867 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_2/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.644897 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_2/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.644927 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_3/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.644955 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_3/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.644984 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_3/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.645013 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_3/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.645041 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.645070 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_3/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645099 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_3/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.645128 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_3/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645157 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_3/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645187 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_4/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.645220 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_4/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.645253 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_4/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.645281 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_4/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.645308 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.645337 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_4/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645367 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_4/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.645397 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_4/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645426 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_4/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645455 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_5/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.645484 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_5/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.645514 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_5/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.645541 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_5/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.645572 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.645602 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_5/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645631 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_5/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.645660 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_5/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645689 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_5/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645731 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_6/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.645760 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_6/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.645789 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_6/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.645817 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_6/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.645845 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.645874 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_6/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645904 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_6/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.645934 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_6/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645964 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_6/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.645993 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_7/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.646023 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_7/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.646052 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_7/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.646080 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_7/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.646107 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.646136 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_7/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.646165 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_7/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.646194 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_7/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.646227 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_7/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.646256 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_8/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.646285 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_8/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.646313 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_8/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.646340 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_8/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.646367 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.646399 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_8/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.646429 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_8/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.646459 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_8/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.646488 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_8/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.646517 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_9/mlp/wi_0/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.646545 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_9/mlp/wi_1/kernel
    Shape:     float32[2048,7168]
    Logical:   PartitionSpec('embed', 'mlp')
    Physical:  ('fsdp', None)
I0421 01:35:38.646574 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_9/mlp/wo/kernel
    Shape:     float32[7168,2048]
    Logical:   PartitionSpec('mlp', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:38.646601 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_9/post_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.646628 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
    Shape:     float32[2048]
    Logical:   PartitionSpec('norm',)
    Physical:  (None,)
I0421 01:35:38.646657 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_9/self_attention/key/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.646685 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_9/self_attention/out/kernel
    Shape:     float32[16,128,2048]
    Logical:   PartitionSpec('heads', 'kv', 'embed')
    Physical:  (None, None, 'fsdp')
I0421 01:35:38.646725 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_9/self_attention/query/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.646755 137334409234240 maxtext_utils.py:1501]  params/params/decoder/layers_9/self_attention/value/kernel
    Shape:     float32[2048,16,128]
    Logical:   PartitionSpec('embed', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None)
I0421 01:35:38.646802 137334409234240 maxtext_utils.py:1501]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   PartitionSpec('embed', 'vocab')
    Physical:  ('fsdp', None)
I0421 01:35:38.646846 137334409234240 maxtext_utils.py:1501]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   PartitionSpec('vocab', 'embed')
    Physical:  (None, 'fsdp')
I0421 01:35:42.382556 137334409234240 train.py:156] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0421 01:35:42.382651 137334409234240 train.py:156] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0421 01:35:42.397580 137334409234240 train.py:163] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0421 01:35:42.397639 137334409234240 train.py:163] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0421 01:36:38.684822 137334409234240 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.4 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0421 01:36:38.688319 137334409234240 metric_logger.py:289] number parameters: 1.104 billion
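The parameter count is just the sum of leaf sizes in the param pytree; a sketch (the `state` name is assumed, not from the log):

    import jax
    n = sum(x.size for x in jax.tree_util.tree_leaves(state.params))
    print(f"number parameters: {n / 1e9:.3f} billion")  # -> 1.104 billion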
I0421 01:37:40.147470 137334409234240 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0421 01:37:40.436053 137334409234240 checkpointing.py:776] Waited 0.28856515884399414 seconds for step 0 to finish before starting checkpointing.
I0421 01:37:40.441767 137334409234240 checkpoint_manager.py:2013] [process=4][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0421 01:37:40.443718 137334409234240 checkpoint_manager.py:1518] [process=4] Saving checkpoint at step 0
I0421 01:37:40.446374 137334409234240 async_checkpointer.py:452] [process=4] Started async saving checkpoint to gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260421_005327/linen_xpk_test_pipeline_scan_nnx_20260421_005327_13_scan_layers_false/checkpoints/0.
I0421 01:37:42.849986 137334409234240 signaling_client.py:364] Using JaxDistributedSignalingClient
I0421 01:37:42.850993 137334409234240 jax_array_handlers.py:358] Scheduling D2H of 444 prioritized jax.Array.
I0421 01:37:42.851110 137334409234240 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0421 01:37:43.157020 137334409234240 base_pytree_checkpoint_handler.py:153] [process=4][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.307524s
I0421 01:37:43.157195 137334409234240 base_pytree_checkpoint_handler.py:129] [process=4] /jax/checkpoint/write/blocking_gbytes_per_sec: 1.036 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 1.4887464046478271 s) (per-host)
I0421 01:37:43.157243 137334409234240 base_pytree_checkpoint_handler.py:737] [process=4][thread=MainThread] Initiated Pytree async_save. Time taken: 1.488804s (batch_requests_ready=1.161432s, total_serialization_initiated=0.327306s, others=0.000067s)
I0421 01:37:43.157516 137334409234240 composite_checkpoint_handler.py:715] [process=4][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 1.493207s (all_items=0.000019s, per_item={'items': '0.00001931'}, temp_paths=1.493188)
I0421 01:37:43.158544 137215078299392 async_checkpointer.py:79] [process=4][thread=async_save] Background save thread started.
I0421 01:37:43.158732 137334409234240 async_checkpointer.py:561] Finished blocking save. Time taken: 2.714930s. Continuing background save to gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260421_005327/linen_xpk_test_pipeline_scan_nnx_20260421_005327_13_scan_layers_false/checkpoints/0.
I0421 01:37:43.191020 137334409234240 checkpoint_manager.py:1566] [process=4][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0421 01:37:43.191340 137211311810304 async_checkpointer.py:265] [process=4][thread=save_finalize] Waiting for background save thread=async_save.
I0421 01:37:43.191505 137334409234240 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260421_005327/linen_xpk_test_pipeline_scan_nnx_20260421_005327_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776735460.4417486, 'wait_for_prev_duration_secs': 6.29425048828125e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776735460.4437566, 'checkpointer_blocking_duration_secs': 2.7150843143463135, 'get_old_steps_start_time': 1776735463.1588614, 'get_old_steps_duration_secs': 2.5033950805664062e-05, 'checkpoint_manager_blocking_start_time': 1776735460.4393735, 'checkpoint_manager_blocking_duration_secs': 2.752092123031616}
I0421 01:37:43.191725 137334409234240 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0421 01:37:43.191779 137334409234240 max_utils.py:750] 
Memstats: After params initialized:
I0421 01:37:43.191829 137334409234240 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_16(process=4,(0,4,0,0))
I0421 01:37:43.191862 137334409234240 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_17(process=4,(1,4,0,0))
I0421 01:37:43.191888 137334409234240 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_20(process=4,(0,5,0,0))
I0421 01:37:43.191914 137334409234240 max_utils.py:756] 	Using (GB) 0.82 / 31.25 (2.624000%) on TPU_21(process=4,(1,5,0,0))
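Per-device figures like these can be read from the JAX device memory stats; a sketch, assuming the TPU backend exposes the usual keys:

    for d in jax.local_devices():
        s = d.memory_stats()  # backend-dependent; assumed keys shown
        used, limit = s["bytes_in_use"], s["bytes_limit"]
        print(f"Using (GB) {used / 2**30:.2f} / {limit / 2**30:.2f} "
              f"({100 * used / limit:.6f}%) on {d}")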
I0421 01:37:43.501412 137334409234240 metric_logger.py:185] completed step: 0, seconds: 61.459, TFLOP/s/device: 0.221, Tokens/s/device: 33.323, total_weights: 65536, loss: 10.877
I0421 01:37:43.647093 137334409234240 metric_logger.py:185] completed step: 1, seconds: 3.342, TFLOP/s/device: 4.065, Tokens/s/device: 612.722, total_weights: 65536, loss: 10.877
I0421 01:37:44.491121 137334409234240 metric_logger.py:185] completed step: 2, seconds: 0.021, TFLOP/s/device: 643.696, Tokens/s/device: 97024.825, total_weights: 65536, loss: 10.268
I0421 01:37:44.627758 137334409234240 metric_logger.py:185] completed step: 3, seconds: 0.845, TFLOP/s/device: 16.076, Tokens/s/device: 2423.167, total_weights: 65536, loss: 9.741
I0421 01:37:44.905167 137334409234240 metric_logger.py:185] completed step: 4, seconds: 0.142, TFLOP/s/device: 95.401, Tokens/s/device: 14379.902, total_weights: 65536, loss: 9.285
I0421 01:37:44.917897 137334409234240 metric_logger.py:185] completed step: 5, seconds: 0.137, TFLOP/s/device: 99.412, Tokens/s/device: 14984.452, total_weights: 65536, loss: 8.900
I0421 01:38:14.043709 137334409234240 metric_logger.py:185] completed step: 6, seconds: 0.279, TFLOP/s/device: 48.754, Tokens/s/device: 7348.693, total_weights: 65536, loss: 8.602
I0421 01:38:14.180194 137334409234240 metric_logger.py:185] completed step: 7, seconds: 28.995, TFLOP/s/device: 0.469, Tokens/s/device: 70.632, total_weights: 65536, loss: 8.393
I0421 01:38:14.316470 137334409234240 metric_logger.py:185] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.705, Tokens/s/device: 14425.684, total_weights: 65536, loss: 8.264
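The throughput columns follow from the batch geometry: total_weights = 65536 target tokens per step across 32 devices, i.e. 2048 tokens per device per step. A worked check against step 5 (the small residual comes from rounding of the logged step time):

    tokens_per_device_per_step = 65536 / 32    # 2048
    print(tokens_per_device_per_step / 0.137)  # ~14949, vs 14984.452 logged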
I0421 01:38:14.452510 137334409234240 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0421 01:38:14.456255 137334409234240 checkpointing.py:776] Waited 0.0037653446197509766 seconds for step 9 to finish before starting checkpointing.
I0421 01:38:14.458996 137334409234240 checkpoint_manager.py:2024] [process=4][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0421 01:38:19.589834    2597 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0421 01:38:23.281869 137215564814080 array_metadata_store.py:203] [process=4][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260421_005327/linen_xpk_test_pipeline_scan_nnx_20260421_005327_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_4
I0421 01:38:47.509600 137215078299392 base_pytree_checkpoint_handler.py:129] [process=4] /jax/checkpoint/write/gbytes_per_sec: 23.991 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 65.84109377861023 s) (per-host)
I0421 01:38:47.509735 137215078299392 async_checkpointer.py:90] [process=4][thread=async_save] 3 Handler Commit operations completed. Time taken: 64.351079s.
I0421 01:39:04.281600 137215078299392 async_checkpointer.py:144] [process=4][thread=async_save] Background save thread done. Time taken: 81.122928s.
I0421 01:39:04.281894 137211311810304 async_checkpointer.py:273] [process=4][thread=save_finalize] Done with waiting for background save thread=async_save.
I0421 01:39:04.282029 137211311810304 async_checkpointer.py:283] [process=4][thread=save_finalize] No errors found in background save thread=async_save.
I0421 01:39:04.282093 137211311810304 checkpoint_manager.py:2133] [process=4][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0421 01:39:04.283929 137211311810304 checkpoint_manager.py:2142] [process=4][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0421 01:39:04.284109 137334409234240 checkpoint_manager.py:2036] [process=4][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0421 01:39:04.284261 137334409234240 checkpoint_manager.py:1458] Waiting for previous save to complete took 49.825270 seconds. If this number is high, consider checkpointing less frequently.
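This warning is the synchronous tail of Orbax's async save pattern: save() returns once the blocking device-to-host copy is done, and the next save must first wait for the previous background commit to finish. A sketch of that pattern (manager and state names assumed):

    mngr.save(step, args=ocp.args.StandardSave(train_state))  # returns after D2H copy
    # ... training continues while the background thread commits to GCS ...
    mngr.wait_until_finished()  # blocks, like the 49.8 s wait logged above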
I0421 01:39:04.286471 137334409234240 checkpoint_manager.py:1518] [process=4] Saving checkpoint at step 9
I0421 01:39:04.289938 137334409234240 async_checkpointer.py:452] [process=4] Started async saving checkpoint to gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260421_005327/linen_xpk_test_pipeline_scan_nnx_20260421_005327_13_scan_layers_false/checkpoints/9.
I0421 01:39:05.712441 137334409234240 jax_array_handlers.py:358] Scheduling D2H of 444 prioritized jax.Array.
I0421 01:39:05.712607 137334409234240 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0421 01:39:05.865183 137334409234240 base_pytree_checkpoint_handler.py:153] [process=4][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.154289s
I0421 01:39:05.865350 137334409234240 base_pytree_checkpoint_handler.py:129] [process=4] /jax/checkpoint/write/blocking_gbytes_per_sec: 1.796 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.8589479923248291 s) (per-host)
I0421 01:39:05.865402 137334409234240 base_pytree_checkpoint_handler.py:737] [process=4][thread=MainThread] Initiated Pytree async_save. Time taken: 0.859009s (batch_requests_ready=0.687263s, total_serialization_initiated=0.171677s, others=0.000068s)
I0421 01:39:05.865676 137334409234240 composite_checkpoint_handler.py:715] [process=4][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.863186s (all_items=0.000016s, per_item={'items': '0.00001621'}, temp_paths=0.863169)
I0421 01:39:05.866667 137212347049728 async_checkpointer.py:79] [process=4][thread=async_save] Background save thread started.
I0421 01:39:05.866846 137334409234240 async_checkpointer.py:561] Finished blocking save. Time taken: 1.580301s. Continuing background save to gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260421_005327/linen_xpk_test_pipeline_scan_nnx_20260421_005327_13_scan_layers_false/checkpoints/9.
I0421 01:39:06.295244 137334409234240 checkpoint_manager.py:1566] [process=4][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0421 01:39:06.295610 137211311810304 async_checkpointer.py:265] [process=4][thread=save_finalize] Waiting for background save thread=async_save.
I0421 01:39:06.295782 137334409234240 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260421_005327/linen_xpk_test_pipeline_scan_nnx_20260421_005327_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776735494.4589672, 'wait_for_prev_duration_secs': 49.82526969909668, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776735544.2865112, 'checkpointer_blocking_duration_secs': 1.5804436206817627, 'get_old_steps_start_time': 1776735545.8669748, 'get_old_steps_duration_secs': 2.574920654296875e-05, 'checkpoint_manager_blocking_start_time': 1776735494.4571831, 'checkpoint_manager_blocking_duration_secs': 51.83855986595154}
I0421 01:39:06.296030 137334409234240 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0421 01:39:06.296078 137334409234240 checkpoint_manager.py:2024] [process=4][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0421 01:39:12.272949 137215564814080 array_metadata_store.py:203] [process=4][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260421_005327/linen_xpk_test_pipeline_scan_nnx_20260421_005327_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_4
I0421 01:39:47.731044 137212347049728 base_pytree_checkpoint_handler.py:129] [process=4] /jax/checkpoint/write/gbytes_per_sec: 36.972 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.724599838256836 s) (per-host)
I0421 01:39:47.731178 137212347049728 async_checkpointer.py:90] [process=4][thread=async_save] 3 Handler Commit operations completed. Time taken: 41.864370s.
I0421 01:39:57.942441 137212347049728 async_checkpointer.py:144] [process=4][thread=async_save] Background save thread done. Time taken: 52.075629s.
I0421 01:39:57.942726 137211311810304 async_checkpointer.py:273] [process=4][thread=save_finalize] Done with waiting for background save thread=async_save.
I0421 01:39:57.942849 137211311810304 async_checkpointer.py:283] [process=4][thread=save_finalize] No errors found in background save thread=async_save.
I0421 01:39:57.942899 137211311810304 checkpoint_manager.py:2133] [process=4][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0421 01:39:57.945716 137211311810304 checkpoint_manager.py:2142] [process=4][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0421 01:39:57.945915 137334409234240 checkpoint_manager.py:2036] [process=4][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0421 01:39:57.946062 137334409234240 checkpoint_manager.py:2013] [process=4][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0421 01:39:57.946669 137334409234240 metric_logger.py:185] completed step: 9, seconds: 0.136, TFLOP/s/device: 99.643, Tokens/s/device: 15019.287, total_weights: 65536, loss: 8.188
Per train step:
 Total TFLOPs: 13.59
 split as 93.93% learnable weight flops and 6.07% attention flops
XPK End: Tue Apr 21 01:40:09 UTC 2026
EXIT_CODE=0
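As a quick sanity check, the throughput figures in the step-9 metrics line above are mutually consistent. A minimal sketch in Python, assuming `total_weights` counts global (all-device) target tokens per step and that the 13.59 TFLOPs figure is per device; the small discrepancies against the logged values come from the step time being rounded to three decimals in the metrics line:

```python
# Sanity-check of the step-9 metrics from the Linen log above.
# Inputs are copied from the log; treating total_weights as global
# tokens per step and TFLOPs as per-device are assumptions.

devices = 32             # "Num_devices: 32" at startup
tokens_per_step = 65536  # "total_weights: 65536" (assumed: global tokens/step)
tflops_per_step = 13.59  # "Total TFLOPs: 13.59" (assumed: per device)
step_seconds = 0.136     # "completed step: 9, seconds: 0.136" (rounded)

tflops_per_sec = tflops_per_step / step_seconds
tokens_per_sec_per_device = tokens_per_step / devices / step_seconds

print(f"{tflops_per_sec:.1f} TFLOP/s/device")           # ~99.9  vs. logged 99.643
print(f"{tokens_per_sec_per_device:.0f} tok/s/device")  # ~15059 vs. logged 15019.287
```

Separately, the warning earlier in the log ("Waiting for previous save to complete took 49.825270 seconds") indicates the checkpoint cadence is shorter than an async save takes to commit to GCS, so step 9's save had to block on step 0's. For this 10-step smoke test that is harmless, but for a real run Orbax's manager options can space saves out. A minimal sketch using the orbax.checkpoint API seen in the log; the directory and state names are placeholders:

```python
import orbax.checkpoint as ocp

# Sketch: widen the save interval so each async save can finish in the
# background before the next one starts.
options = ocp.CheckpointManagerOptions(
    save_interval_steps=100,         # save less often than the commit takes
    enable_async_checkpointing=True, # keep saves off the training critical path
)
mgr = ocp.CheckpointManager("gs://my-bucket/ckpts", options=options)

# Inside the train loop:
#   mgr.save(step, args=ocp.args.StandardSave(train_state))
# And before exiting:
#   mgr.wait_until_finished()
```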
NNX  ·  e27fc1e97  ·  test_pipeline_scan_nnx_20260421_005327  ·  full log
XPK Start: Tue Apr 21 02:51:36 UTC 2026
2026-04-21 02:51:40.753685: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1776739900.766157      10 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1776739900.769947      10 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1776739900.781168      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776739900.781187      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776739900.781189      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776739900.781191      10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2026-04-21 02:52:00.066144: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0421 02:52:00.581740 137011814606656 max_utils.py:273] Attempting to initialize the jax distributed system...
INFO:2026-04-21 02:52:09,620:jax._src.distributed:140: Starting JAX distributed service on [::]:8482
I0421 02:52:09.620571 137011814606656 distributed.py:140] Starting JAX distributed service on [::]:8482
INFO:2026-04-21 02:52:09,626:jax._src.distributed:157: Connecting to JAX distributed service on mt-13-scan-layers-false-6nog1-slice-job-0-0.mt-13-scan-layers-false-6nog1:8482
I0421 02:52:09.626142 137011814606656 distributed.py:157] Connecting to JAX distributed service on mt-13-scan-layers-false-6nog1-slice-job-0-0.mt-13-scan-layers-false-6nog1:8482
I0421 02:52:10.974238 137011814606656 max_utils.py:284] Jax distributed system initialized!
I0421 02:52:17.870186 137011814606656 max_utils.py:800] System Information: Jax Version: 0.8.1
I0421 02:52:17.870291 137011814606656 max_utils.py:801] System Information: Jaxlib Version: 0.8.1
I0421 02:52:17.870332 137011814606656 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Nov 12 2025 14:16:36 (1762985796) cl/831091709
I0421 02:52:17.870367 137011814606656 train_utils.py:347] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 727, in <module>
    app.run(main)
  File "/usr/local/lib/python3.12/site-packages/absl/app.py", line 316, in run
    _run_main(main, args)
  File "/usr/local/lib/python3.12/site-packages/absl/app.py", line 261, in _run_main
    sys.exit(main(argv))
             ^^^^^^^^^^
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 723, in main
    train_func()
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 713, in train_func
    run(config, recorder, diagnostic_config)
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 692, in run
    train_loop(config, recorder)
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 517, in train_loop
    ) = train_utils.setup_train_loop(config, recorder)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/deps/src/maxtext/utils/train_utils.py", line 217, in setup_train_loop
    raise NotImplementedError("Pure NNX support has not been implemented yet.")
NotImplementedError: Pure NNX support has not been implemented yet.
XPK End: Tue Apr 21 02:52:28 UTC 2026
EXIT_CODE=1
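The NNX run never reaches a training step: `setup_train_loop` raises `NotImplementedError` during setup and the job exits with EXIT_CODE=1, so the run produces no step metrics to compare against the Linen baseline. For reference, a hypothetical sketch (not MaxText's actual code) of the kind of early guard that produces this traceback; only the error message is taken verbatim from the log, and the `model_api` field name is an assumption:

```python
# Hypothetical shape of the guard at train_utils.py:217 in the traceback:
# reject pure-NNX configurations before any expensive setup happens.

def setup_train_loop(config, recorder):
    if getattr(config, "model_api", "linen") == "nnx":
        raise NotImplementedError("Pure NNX support has not been implemented yet.")
    # ... normal setup: mesh, data iterator, model, optimizer state ...
```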