XPK Start: Wed Apr 22 13:23:18 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
`rope_parameters`'s factor field must be a float >= 1, got 40
`rope_parameters`'s beta_fast field must be a float, got 32
`rope_parameters`'s beta_slow field must be a float, got 1
DeepseekV32Config got `key=rope_scaling` in kwargs but hasn't set it as attribute. For RoPE standardization you need to set `self.rope_parameters` in model's config.
2026-04-22 13:23:43.386096: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0422 13:23:43.593869 132378160076608 max_utils.py:273] Attempting to initialize the jax distributed system...
I0422 13:23:52.635719 132378160076608 distributed.py:149] Starting JAX distributed service on [::]:8482
I0422 13:23:52.638094 132378160076608 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-qcwdu-slice-job-0-0.mt-13-scan-layers-false-qcwdu:8482
I0422 13:23:53.601413 132378160076608 max_utils.py:284] Jax distributed system initialized!
I0422 13:23:59.899623 132378160076608 max_utils.py:800] System Information: Jax Version: 0.9.2
I0422 13:23:59.899746 132378160076608 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0422 13:23:59.899788 132378160076608 max_utils.py:802] System Information: Jax Backend: PJRT C API TFRT TPU v6 lite Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0422 13:23:59.899824 132378160076608 train_utils.py:391] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0422 13:24:00.603311 132378160076608 maxtext_utils.py:1771] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0422 13:24:00.603905 132378160076608 maxtext_utils.py:1771] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
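The two maxtext_utils.py:1771 lines above describe a device mesh that puts all 32 chips on the FSDP axis. A minimal sketch of what this initialization amounts to in plain JAX (the coordinator address, the process count, and the abbreviated four-axis mesh are illustrative assumptions, not MaxText's exact 13-axis layout):

import jax
import numpy as np
from jax.sharding import Mesh

# Hypothetical coordinator address; MaxText/XPK derive this from the job environment.
jax.distributed.initialize(coordinator_address="coordinator:8482",
                           num_processes=8, process_id=0)

# All 32 devices land on the 'fsdp' axis, mirroring the logged mesh
# shape (1, 1, 1, 32, ...); axis names here are abbreviated stand-ins.
devices = np.asarray(jax.devices()).reshape(1, 1, 32, 1)
mesh = Mesh(devices, axis_names=("data", "stage", "fsdp", "tensor"))
print(mesh.shape)  # {'data': 1, 'stage': 1, 'fsdp': 32, 'tensor': 1}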
I0422 13:24:00.604092 132378160076608 checkpointing.py:688] Setting up checkpoint logger...
I0422 13:24:00.604143 132378160076608 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0422 13:24:00.604186 132378160076608 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0422 13:24:00.604530 132378160076608 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x78651d3185f0>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0422 13:24:03.876560 132378160076608 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0422 13:24:03.876812 132378160076608 checkpoint_manager.py:708] [process=5][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7850d42eaba0>}, handler_registry=None
I0422 13:24:03.877070 132378160076608 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7850d42eaba0>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0422 13:24:03.877121 132378160076608 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7850d42ecd10>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0422 13:24:03.877159 132378160076608 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7850d42eaba0>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7850d42eaba0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7850d42ecd10>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7850d42ecd10>}).
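The manager being assembled here pairs a PyTree handler for the train state ("items") with a JSON handler for metrics and saves asynchronously on a fixed interval. A rough equivalent with the public Orbax API (the bucket path and the bare options are assumptions; MaxText configures considerably more than this):

import orbax.checkpoint as ocp

options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,          # matches FixedIntervalPolicy(interval=10) below
    enable_async_checkpointing=True,
    create=True,
)
mngr = ocp.CheckpointManager("gs://my-bucket/checkpoints", options=options)  # hypothetical path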
I0422 13:24:03.877492 132378160076608 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0422 13:24:03.877562 132378160076608 async_checkpointer.py:192] [process=5][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x785084681d00> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0422 13:24:04.567494 132378160076608 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_13_scan_layers_false/checkpoints
I0422 13:24:05.015913 132378160076608 checkpoint_manager.py:929] [process=5][thread=MainThread] CheckpointManager created, primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7863e85b5370>
I0422 13:24:05.016089 132378160076608 checkpointing.py:302] Checkpoint manager created!
I0422 13:24:05.958898 132378160076608 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0422 13:24:05.959011 132378160076608 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 13:24:06.339712 132378160076608 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 13:24:06.339808 132378160076608 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 13:24:06.355871 132378160076608 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 13:24:06.355932 132378160076608 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 13:24:06.386254 132378160076608 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 13:24:06.386321 132378160076608 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 13:24:06.402510 132378160076608 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 13:24:06.402572 132378160076608 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 13:24:06.418539 132378160076608 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 13:24:06.418603 132378160076608 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 13:24:06.446810 132378160076608 attentions.py:1198] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0422 13:24:06.446878 132378160076608 attentions.py:1198] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 13:24:06.468637 132378160076608 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0422 13:24:06.468707 132378160076608 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0422 13:24:09.364503 132378160076608 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0422 13:24:09.364629 132378160076608 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
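Every Logical/Physical pair in these lines (and in the parameter dump that follows) is one lookup through a table of sharding rules mapping logical axis names ('embed', 'mlp', 'kv_heads', ...) onto mesh axes; on this mesh only 'fsdp' is nontrivial, so 'embed' shards and the rest replicate. A minimal sketch of that resolution with Flax's logical-partitioning helper (the rule table is abbreviated; MaxText's full table is much longer):

import jax
import numpy as np
import flax.linen as nn
from jax.sharding import Mesh, PartitionSpec

mesh = Mesh(np.asarray(jax.devices()).reshape(32), axis_names=("fsdp",))

# Abbreviated rules: logical axis name -> mesh axis (None = replicated).
rules = (("embed", "fsdp"), ("mlp", None), ("kv_heads", None), ("kv_head_dim", None))

# P('embed', 'mlp') resolves to ('fsdp', None), as logged for the wi_0 kernels below.
sharding = nn.logical_to_mesh_sharding(PartitionSpec("embed", "mlp"), mesh, rules)
print(sharding.spec)  # PartitionSpec('fsdp', None)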
fsdp: 32
I0422 13:24:16.612391 132378160076608 maxtext_utils.py:1880] params/params/decoder/decoder_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.612524 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_0/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.612580 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_0/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.612638 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_0/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.612693 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_0/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.612732 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_0/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.612786 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_0/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.612838 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_0/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.612879 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_0/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.612914 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_0/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.612948 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_1/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.612988 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_1/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.613023 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_1/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.613055 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_1/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.613086 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_1/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.613119 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_1/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.613153 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_1/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.613187 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_1/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.613219 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_1/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.613250 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_10/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.613281 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_10/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.613312 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_10/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.613343 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_10/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.613372 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_10/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.613403 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_10/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.613434 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_10/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.613465 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_10/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.613495 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_10/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.613525 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_11/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.613557 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_11/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.613587 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_11/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.613615 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_11/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.613643 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_11/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.613687 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_11/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.613718 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_11/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.613765 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_11/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.613798 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_11/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.613829 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_12/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.613859 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_12/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.613892 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_12/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.613920 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_12/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.613950 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_12/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.613987 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_12/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.614018 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_12/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.614049 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_12/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.614078 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_12/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.614109 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_13/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.614138 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_13/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.614167 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_13/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.614195 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_13/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.614222 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_13/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.614252 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_13/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.614283 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_13/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.614313 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_13/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.614344 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_13/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.614374 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_14/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.614404 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_14/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.614434 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_14/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.614461 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_14/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.614489 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_14/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.614519 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_14/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.614549 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_14/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.614578 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_14/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.614607 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_14/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.614636 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_15/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.614676 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_15/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.614707 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_15/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.614735 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_15/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.614763 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_15/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.614792 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_15/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.614822 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_15/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.614852 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_15/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.614881 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_15/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.614911 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_2/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.614949 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_2/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.614986 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_2/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.615015 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_2/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.615043 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_2/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.615073 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_2/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.615104 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_2/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.615134 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_2/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.615164 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_2/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.615194 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_3/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.615224 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_3/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.615253 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_3/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.615281 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_3/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.615309 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_3/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.615339 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_3/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.615369 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_3/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.615398 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_3/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.615428 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_3/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.615458 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_4/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.615488 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_4/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.615517 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_4/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.615544 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_4/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.615572 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_4/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.615603 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_4/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.615632 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_4/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.615672 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_4/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.615701 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_4/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.615731 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_5/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.615760 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_5/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.615789 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_5/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.615816 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_5/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.615843 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_5/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.615873 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_5/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.615902 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_5/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.615931 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_5/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.615960 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_5/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.615995 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_6/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.616023 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_6/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.616052 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_6/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.616080 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_6/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.616107 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_6/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.616136 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_6/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.616165 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_6/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.616196 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_6/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.616224 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_6/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.616253 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_7/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.616282 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_7/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.616312 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_7/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.616339 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_7/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.616364 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_7/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.616393 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_7/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.616423 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_7/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.616452 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_7/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.616480 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_7/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.616509 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_8/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.616537 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_8/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.616566 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_8/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.616593 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_8/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.616621 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_8/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.616659 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_8/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.616693 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_8/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.616723 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_8/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.616753 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_8/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.616782 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_9/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.616811 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_9/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 13:24:16.616839 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_9/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 13:24:16.616867 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_9/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.616895 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_9/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 13:24:16.616925 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_9/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.616954 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_9/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 13:24:16.616988 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_9/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 13:24:16.617018 132378160076608 maxtext_utils.py:1880] params/params/decoder/layers_9/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 13:24:16.617066 132378160076608 maxtext_utils.py:1880] params/params/decoder/logits_dense/kernel Shape: float32[2048,32000] Logical: P('embed_vocab', 'vocab') Physical: ('fsdp', None)
I0422 13:24:16.617111 132378160076608 maxtext_utils.py:1880] params/params/token_embedder/embedding Shape: float32[32000,2048] Logical: P('vocab', 'embed_vocab') Physical: (None, 'fsdp')
I0422 13:24:21.177808 132378160076608 train.py:157] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 13:24:21.177904 132378160076608 train.py:157] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0422 13:24:21.193110 132378160076608 train.py:164] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 13:24:21.193171 132378160076608 train.py:164] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0422 13:25:17.790508 132378160076608 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0422 13:25:17.791639 132378160076608 metric_logger.py:301] number parameters: 1.104 billion
I0422 13:26:18.485188 132378160076608 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0422 13:26:18.738399 132378160076608 checkpointing.py:798] Waited 0.2531907558441162 seconds for step 0 to finish before starting checkpointing.
I0422 13:26:18.741797 132378160076608 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0422 13:26:18.743723 132378160076608 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 0
I0422 13:26:18.745284 132378160076608 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_13_scan_layers_false/checkpoints/0.
I0422 13:26:19.066252 132378160076608 signaling_client.py:364] Using JaxDistributedSignalingClient
I0422 13:26:19.067290 132378160076608 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0422 13:26:19.067415 132378160076608 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 13:26:19.359486 132378160076608 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.293746s
I0422 13:26:19.359671 132378160076608 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 4.697 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.3284459114074707 s) (per-host)
I0422 13:26:19.359724 132378160076608 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.328527s (batch_requests_ready=0.018149s, total_serialization_initiated=0.310289s, others=0.000088s)
I0422 13:26:19.359965 132378160076608 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.332703s (all_items=0.000018s, per_item={'items': '0.00001836'}, temp_paths=0.332684)
I0422 13:26:19.360793 132378160076608 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.62 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_13_scan_layers_false/checkpoints/0.
I0422 13:26:19.361157 132250081601280 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 13:46:19.361119
I0422 13:26:19.410829 132378160076608 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 13:26:19.411204 132250054317824 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
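This is the two-phase async save pattern: the blocking phase (0.62 s above) only copies arrays device-to-host, after which training resumes while a background thread commits the 1.5 GiB to GCS over the next ~43 s. In sketch form with the public Orbax API (the manager, path, and toy state are assumptions; MaxText saves through its registered "items"/"metrics" handlers):

import numpy as np
import orbax.checkpoint as ocp

mngr = ocp.CheckpointManager(
    "gs://my-bucket/checkpoints",  # hypothetical path
    options=ocp.CheckpointManagerOptions(enable_async_checkpointing=True),
)
state = {"w": np.zeros((2, 2))}  # stand-in for the real train state pytree

# Blocking phase: device-to-host copy only, then save() returns.
mngr.save(0, args=ocp.args.StandardSave(state))

# A background thread uploads to GCS; the next save waits on it first,
# which is the 17.76 s "Waiting for previous save" warning seen at step 9 below.
mngr.wait_until_finished()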
I0422 13:26:19.411371 132378160076608 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776864378.7417789, 'wait_for_prev_duration_secs': 5.9604644775390625e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776864378.7437637, 'checkpointer_blocking_duration_secs': 0.6175563335418701, 'get_old_steps_start_time': 1776864379.361346, 'get_old_steps_duration_secs': 3.0040740966796875e-05, 'checkpoint_manager_blocking_start_time': 1776864378.7399783, 'checkpoint_manager_blocking_duration_secs': 0.6713509559631348}
I0422 13:26:19.411585 132378160076608 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
I0422 13:26:19.411640 132378160076608 max_utils.py:750] Memstats: After params initialized:
I0422 13:26:19.411707 132378160076608 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_18(process=5,(2,4,0,0))
I0422 13:26:19.411743 132378160076608 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_19(process=5,(3,4,0,0))
I0422 13:26:19.411771 132378160076608 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_22(process=5,(2,5,0,0))
I0422 13:26:19.411797 132378160076608 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_23(process=5,(3,5,0,0))
I0422 13:26:19.769793 132378160076608 metric_logger.py:196] completed step: 0, seconds: 60.693, TFLOP/s/device: 0.224, Tokens/s/device: 33.743, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0422 13:26:19.906059 132378160076608 metric_logger.py:196] completed step: 1, seconds: 1.276, TFLOP/s/device: 10.651, Tokens/s/device: 1605.389, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0422 13:26:20.328137 132378160076608 metric_logger.py:196] completed step: 2, seconds: 0.015, TFLOP/s/device: 927.259, Tokens/s/device: 139766.601, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28794.738
I0422 13:26:20.464460 132378160076608 metric_logger.py:196] completed step: 3, seconds: 0.418, TFLOP/s/device: 32.535, Tokens/s/device: 4904.050, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 16998.576
I0422 13:26:20.744245 132378160076608 metric_logger.py:196] completed step: 4, seconds: 0.142, TFLOP/s/device: 95.657, Tokens/s/device: 14418.474, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10779.387
I0422 13:26:20.756177 132378160076608 metric_logger.py:196] completed step: 5, seconds: 0.136, TFLOP/s/device: 99.811, Tokens/s/device: 15044.553, total_weights: 65536, loss: 8.901, lm_loss: 8.901, perplexity: 7336.090
I0422 13:26:23.205962 2814 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0422 13:26:25.898695 132250062710528 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_5
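The Memstats block above is read from per-device allocator counters, roughly as follows (device.memory_stats() is a real JAX API, but the dict keys are backend-dependent; 'bytes_in_use' and 'bytes_limit' are assumed here):

import jax

for d in jax.local_devices():
    stats = d.memory_stats()  # backend-dependent dict of allocator counters
    print(f"Using (GB) {stats['bytes_in_use'] / 2**30:.2f} / "
          f"{stats['bytes_limit'] / 2**30:.2f} on {d}")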
I0422 13:26:44.644097 132378160076608 metric_logger.py:196] completed step: 6, seconds: 0.281, TFLOP/s/device: 48.360, Tokens/s/device: 7289.372, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.962
I0422 13:26:44.780430 132378160076608 metric_logger.py:196] completed step: 7, seconds: 23.757, TFLOP/s/device: 0.572, Tokens/s/device: 86.207, total_weights: 65536, loss: 8.393, lm_loss: 8.393, perplexity: 4418.005
I0422 13:26:44.916669 132378160076608 metric_logger.py:196] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.669, Tokens/s/device: 14420.200, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.388
I0422 13:26:45.052273 132378160076608 checkpointing.py:794] Waiting for step 9 to finish before checkpoint...
I0422 13:26:45.055867 132378160076608 checkpointing.py:798] Waited 0.003611326217651367 seconds for step 9 to finish before starting checkpointing.
I0422 13:26:45.058575 132378160076608 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 13:26:53.959568 132250081601280 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 45.224 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 34.928308725357056 s) (per-host)
I0422 13:26:53.959713 132250081601280 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 34.598439s.
I0422 13:27:02.817089 132250081601280 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 43.455797s.
I0422 13:27:02.817399 132250054317824 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 13:27:02.817527 132250054317824 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0422 13:27:02.817580 132250054317824 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0422 13:27:02.819132 132250054317824 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0422 13:27:02.819327 132378160076608 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0422 13:27:02.819479 132378160076608 checkpoint_manager.py:1452] Waiting for previous save to complete took 17.760902 seconds. If this number is high, consider checkpointing less frequently.
I0422 13:27:02.822167 132378160076608 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 9
I0422 13:27:02.824172 132378160076608 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_13_scan_layers_false/checkpoints/9.
I0422 13:27:03.435446 132378160076608 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0422 13:27:03.435620 132378160076608 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 13:27:03.554879 132378160076608 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.120954s
I0422 13:27:03.555048 132378160076608 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 9.999 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.15427827835083008 s) (per-host)
I0422 13:27:03.555098 132378160076608 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.154336s (batch_requests_ready=0.017656s, total_serialization_initiated=0.136616s, others=0.000065s)
I0422 13:27:03.555385 132378160076608 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.158647s (all_items=0.000017s, per_item={'items': '0.00001740'}, temp_paths=0.158630)
I0422 13:27:03.556147 132378160076608 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.73 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_13_scan_layers_false/checkpoints/9.
I0422 13:27:03.556536 132250073208576 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 13:47:03.556496
I0422 13:27:03.566057 132378160076608 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 13:27:03.566293 132250054317824 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0422 13:27:03.566431 132378160076608 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776864405.0585456, 'wait_for_prev_duration_secs': 17.76090168952942, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776864422.8222098, 'checkpointer_blocking_duration_secs': 0.7344865798950195, 'get_old_steps_start_time': 1776864423.5567229, 'get_old_steps_duration_secs': 3.123283386230469e-05, 'checkpoint_manager_blocking_start_time': 1776864405.0567644, 'checkpoint_manager_blocking_duration_secs': 18.5096333026886}
I0422 13:27:03.566629 132378160076608 checkpointing.py:409] Started an asynchronous checkpoint save for step 9
I0422 13:27:03.566695 132378160076608 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 13:27:08.692164 132250623997696 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_5
I0422 13:27:46.778996 132250073208576 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 36.415 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 43.37818455696106 s) (per-host)
I0422 13:27:46.779121 132250073208576 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 43.222470s.
I0422 13:27:54.462221 132250073208576 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 50.905549s.
I0422 13:27:54.462518 132250054317824 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 13:27:54.462644 132250054317824 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0422 13:27:54.462716 132250054317824 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0422 13:27:54.464570 132250054317824 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0422 13:27:54.464765 132378160076608 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0422 13:27:54.464916 132378160076608 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0422 13:27:54.465849 132378160076608 metric_logger.py:196] completed step: 9, seconds: 0.136, TFLOP/s/device: 99.846, Tokens/s/device: 15049.860, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3598.367
Per train step: Total TFLOPs: 13.59 split as 93.93% learnable weight flops and 6.07% attention flops
XPK End: Wed Apr 22 13:28:03 UTC 2026
EXIT_CODE=0
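Two consistency checks on the metrics above: perplexity is exp(loss), and per-device throughput is the global batch in tokens (total_weights = 65536 = 32 sequences x 2048 tokens) divided by device count and step time. Verified against the logged step 0 and step 8 lines (small deviations come from the logged values being rounded):

import math

print(math.exp(10.877))    # ~52943, vs logged perplexity 52938.617 (loss is rounded)
print(65536 / 32 / 0.142)  # ~14422 tokens/s/device, vs logged 14420.200 at step 8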