XPK Start: Tue Apr 21 12:24:04 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
2026-04-21 12:24:28.760434: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0421 12:24:28.940807 134742061287232 max_utils.py:273] Attempting to initialize the jax distributed system...
I0421 12:24:37.980867 134742061287232 distributed.py:149] Starting JAX distributed service on [::]:8482
I0421 12:24:37.983193 134742061287232 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-v9f66-slice-job-0-0.mt-13-scan-layers-false-v9f66:8482
I0421 12:24:39.348132 134742061287232 max_utils.py:284] Jax distributed system initialized!
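The entries above show the JAX distributed service coming up on port 8482 before any device work starts. Below is a minimal sketch of that bring-up, assuming the topology implied later in this log (32 chips at 4 local devices per host, hence 8 processes; this log comes from process 4). It is only meaningful inside the job's pods, and on TPU jax.distributed.initialize() can also be called with no arguments:

import jax

# Hedged sketch, not MaxText's max_utils.py. The coordinator address is the
# one logged by distributed.py:172; num_processes and process_id are
# assumptions derived from the memstats entries further down this log.
jax.distributed.initialize(
    coordinator_address="mt-13-scan-layers-false-v9f66-slice-job-0-0.mt-13-scan-layers-false-v9f66:8482",
    num_processes=8,
    process_id=4,
)
print(jax.process_index(), jax.device_count())  # expect 4 and 32 on this run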
I0421 12:24:45.362954 134742061287232 max_utils.py:800] System Information: Jax Version: 0.9.2
I0421 12:24:45.363054 134742061287232 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0421 12:24:45.363092 134742061287232 max_utils.py:802] System Information: Jax Backend: PJRT C API TFRT TPU v6 lite Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0421 12:24:45.363126 134742061287232 train_utils.py:378] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0421 12:24:46.408772 134742061287232 maxtext_utils.py:1718] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0421 12:24:46.409340 134742061287232 maxtext_utils.py:1718] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0421 12:24:46.409514 134742061287232 checkpointing.py:688] Setting up checkpoint logger...
I0421 12:24:46.409564 134742061287232 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0421 12:24:46.409610 134742061287232 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0421 12:24:46.409952 134742061287232 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7a8b76b1bd70>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0421 12:24:49.132792 134742061287232 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0421 12:24:49.133023 134742061287232 checkpoint_manager.py:708] [process=4][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7a8a50a35160>}, handler_registry=None
I0421 12:24:49.133259 134742061287232 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7a8a50a35160>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0421 12:24:49.133309 134742061287232 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7a8a50c06630>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0421 12:24:49.133347 134742061287232 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7a8a50a35160>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7a8a50a35160>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7a8a50c06630>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7a8a50c06630>}).
I0421 12:24:49.133697 134742061287232 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0421 12:24:49.133770 134742061287232 async_checkpointer.py:192] [process=4][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7a76e47e79c0> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0421 12:24:50.134267 134742061287232 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260421_114057/linen_xpk_feat_nnx_post_train_fixes_20260421_114057_13_scan_layers_false/checkpoints
I0421 12:24:50.160255 134742061287232 checkpoint_manager.py:929] [process=4][thread=MainThread] CheckpointManager created, primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260421_114057/linen_xpk_feat_nnx_post_train_fixes_20260421_114057_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7a8a50a34fe0>
I0421 12:24:50.160381 134742061287232 checkpointing.py:302] Checkpoint manager created!
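A minimal Orbax sketch, not MaxText's checkpointing.py, of the kind of async CheckpointManager the entries above describe; save_interval_steps mirrors the FixedIntervalPolicy(interval=10) in the options dump, and the local path stands in for this run's gs://lance-maxtext/... directory:

import orbax.checkpoint as ocp

options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,           # FixedIntervalPolicy(interval=10) above
    enable_async_checkpointing=True,  # writes continue on a background thread
)
# Placeholder directory; the run above targets a GCS bucket instead.
mngr = ocp.CheckpointManager("/tmp/ckpt-demo", options=options)

# Typical use (hedged): save() returns once the device-to-host copy is done,
# while the actual write (the "Background save thread" seen later in this
# log) completes asynchronously.
# mngr.save(step, args=ocp.args.StandardSave(train_state))
# mngr.wait_until_finished()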
I0421 12:24:51.096558 134742061287232 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0421 12:24:51.096684 134742061287232 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0421 12:24:51.482032 134742061287232 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0421 12:24:51.482122 134742061287232 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0421 12:24:51.498370 134742061287232 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0421 12:24:51.498427 134742061287232 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0421 12:24:51.533667 134742061287232 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0421 12:24:51.533767 134742061287232 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 12:24:51.550624 134742061287232 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0421 12:24:51.550695 134742061287232 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 12:24:51.567080 134742061287232 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0421 12:24:51.567143 134742061287232 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 12:24:51.596514 134742061287232 attentions.py:1197] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0421 12:24:51.596583 134742061287232 attentions.py:1197] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 12:24:51.618662 134742061287232 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0421 12:24:51.618725 134742061287232 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0421 12:24:54.533001 134742061287232 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0421 12:24:54.533119 134742061287232 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
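Each pair of entries above reports a tensor's logical axis names and the physical mesh axes they resolve to; on this run's mesh only the batch-like leading axis maps to 'fsdp' and everything else is replicated. A minimal JAX sketch of that logical-to-physical resolution, with a hypothetical rules table standing in for MaxText's logical_axis_rules config:

import numpy as np
import jax
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# The run above uses 32 devices on a single 'fsdp' mesh axis (the mesh shape
# (1, 1, 1, 32, ...) logged earlier). Locally we use whatever devices are
# available so the sketch stays runnable.
mesh = Mesh(np.array(jax.devices()), axis_names=("fsdp",))

# Hypothetical logical->physical rules matching the mappings logged above.
rules = {"activation_batch": "fsdp", "activation_norm_length": None,
         "activation_embed": None}

def to_physical(logical_axes):
    return P(*(rules.get(name) for name in logical_axes))

spec = to_physical(("activation_batch", "activation_norm_length", "activation_embed"))
print(spec)  # PartitionSpec('fsdp', None, None), as in the nnx_wrappers lines
x = jax.device_put(np.zeros((32, 16, 8), np.float32), NamedSharding(mesh, spec))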
fsdp: 32
I0421 12:25:01.746954 134742061287232 maxtext_utils.py:1827] params/params/decoder/decoder_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.747086 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_0/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.747140 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_0/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.747197 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_0/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.747237 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_0/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.747279 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_0/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.747329 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_0/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.747380 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_0/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.747420 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_0/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.747454 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_0/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.747489 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_1/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.747524 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_1/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.747555 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_1/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.747585 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_1/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.747614 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_1/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.747659 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_1/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.747695 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_1/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.747728 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_1/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.747760 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_1/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.747791 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_10/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.747821 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_10/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.747850 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_10/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.747879 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_10/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.747907 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_10/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.747938 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_10/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.747968 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_10/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.747999 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_10/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.748030 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_10/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.748059 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_11/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.748089 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_11/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.748119 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_11/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.748147 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_11/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.748174 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_11/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.748202 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_11/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.748249 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_11/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.748306 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_11/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.748345 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_11/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.748377 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_12/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.748408 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_12/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.748438 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_12/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.748466 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_12/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.748494 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_12/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.748524 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_12/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.748554 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_12/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.748584 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_12/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.748613 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_12/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.748653 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_13/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.748689 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_13/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.748720 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_13/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.748748 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_13/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.748775 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_13/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.748804 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_13/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.748834 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_13/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.748862 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_13/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.748891 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_13/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.748920 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_14/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.748949 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_14/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.748977 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_14/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.749004 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_14/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.749030 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_14/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.749059 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_14/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.749088 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_14/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.749117 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_14/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.749146 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_14/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.749175 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_15/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.749203 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_15/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.749232 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_15/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.749263 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_15/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.749291 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_15/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.749320 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_15/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.749348 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_15/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.749377 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_15/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.749406 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_15/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.749435 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_2/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.749464 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_2/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.749493 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_2/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.749521 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_2/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.749548 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_2/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.749578 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_2/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.749609 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_2/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.749648 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_2/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.749681 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_2/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.749711 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_3/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.749740 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_3/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.749768 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_3/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.749795 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_3/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.749822 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_3/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.749851 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_3/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.749880 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_3/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.749908 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_3/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.749936 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_3/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.749965 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_4/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.749994 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_4/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.750021 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_4/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.750048 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_4/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.750075 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_4/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.750103 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_4/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.750132 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_4/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.750160 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_4/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.750189 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_4/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.750216 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_5/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.750250 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_5/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.750279 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_5/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.750306 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_5/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.750333 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_5/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.750361 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_5/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.750390 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_5/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.750418 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_5/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.750448 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_5/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.750477 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_6/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.750506 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_6/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.750534 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_6/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.750561 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_6/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.750588 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_6/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.750617 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_6/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.750658 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_6/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.750689 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_6/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.750719 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_6/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.750748 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_7/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.750777 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_7/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.750806 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_7/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.750833 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_7/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.750860 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_7/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.750889 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_7/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.750917 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_7/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.750946 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_7/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.750974 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_7/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.751003 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_8/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.751031 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_8/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.751059 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_8/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.751086 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_8/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.751113 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_8/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.751142 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_8/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.751171 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_8/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.751200 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_8/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.751229 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_8/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.751261 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_9/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.751289 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_9/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0421 12:25:01.751318 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_9/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0421 12:25:01.751345 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_9/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.751372 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_9/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0421 12:25:01.751402 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_9/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.751431 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_9/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0421 12:25:01.751460 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_9/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0421 12:25:01.751490 134742061287232 maxtext_utils.py:1827] params/params/decoder/layers_9/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0421 12:25:01.751536 134742061287232 maxtext_utils.py:1827] params/params/decoder/logits_dense/kernel Shape: float32[2048,32000] Logical: P('embed_vocab', 'vocab') Physical: ('fsdp', None)
I0421 12:25:01.751581 134742061287232 maxtext_utils.py:1827] params/params/token_embedder/embedding Shape: float32[32000,2048] Logical: P('vocab', 'embed_vocab') Physical: (None, 'fsdp')
I0421 12:25:06.288271 134742061287232 train.py:157] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0421 12:25:06.288365 134742061287232 train.py:157] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0421 12:25:06.304075 134742061287232 train.py:164] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0421 12:25:06.304136 134742061287232 train.py:164] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0421 12:26:02.777276 134742061287232 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0421 12:26:02.778516 134742061287232 metric_logger.py:301] number parameters: 1.104 billion
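The logged parameter count can be reproduced from the shapes dumped above, assuming the dump is the complete parameter set:

# 16 decoder layers, embed=2048, mlp=7168, 16 heads of dim 128, vocab=32000,
# all read off the float32 shapes logged by maxtext_utils.py:1827.
embed, mlp, heads, head_dim, vocab, layers = 2048, 7168, 16, 128, 32000, 16
per_layer = (
    3 * embed * mlp                 # wi_0, wi_1, wo
    + 4 * embed * heads * head_dim  # query, key, value, out projections
    + 2 * embed                     # pre/post self-attention layer norms
)
total = layers * per_layer + 2 * vocab * embed + embed  # + embedder, logits, final norm
print(f"{total / 1e9:.3f} billion")  # 1.104 billion, matching metric_logger.py:301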
I0421 12:27:03.483049 134742061287232 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0421 12:27:03.727002 134742061287232 checkpointing.py:798] Waited 0.24393367767333984 seconds for step 0 to finish before starting checkpointing.
I0421 12:27:03.730866 134742061287232 checkpoint_manager.py:2009] [process=4][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0421 12:27:03.732861 134742061287232 checkpoint_manager.py:1512] [process=4] Saving checkpoint at step 0
I0421 12:27:03.734307 134742061287232 event_tracking.py:70] [process=4] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260421_114057/linen_xpk_feat_nnx_post_train_fixes_20260421_114057_13_scan_layers_false/checkpoints/0.
I0421 12:27:04.497916 134742061287232 signaling_client.py:364] Using JaxDistributedSignalingClient
I0421 12:27:04.498891 134742061287232 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0421 12:27:04.499014 134742061287232 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0421 12:27:04.795128 134742061287232 base_pytree_checkpoint_handler.py:154] [process=4][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.297748s
I0421 12:27:04.795289 134742061287232 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/blocking_gbytes_per_sec: 4.656 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.3313148021697998 s) (per-host)
I0421 12:27:04.795346 134742061287232 base_pytree_checkpoint_handler.py:768] [process=4][thread=MainThread] Initiated Pytree async_save. Time taken: 0.331380s (batch_requests_ready=0.016816s, total_serialization_initiated=0.314492s, others=0.000072s)
I0421 12:27:04.795607 134742061287232 composite_checkpoint_handler.py:715] [process=4][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.335541s (all_items=0.000018s, per_item={'items': '0.00001764'}, temp_paths=0.335523)
I0421 12:27:04.796451 134742061287232 event_tracking.py:125] [process=4] [async] Finished blocking save in 1.06 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260421_114057/linen_xpk_feat_nnx_post_train_fixes_20260421_114057_13_scan_layers_false/checkpoints/0.
I0421 12:27:04.797472 134612044404480 async_checkpointer.py:76] [process=4][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-21 12:47:04.797450
I0421 12:27:04.847507 134742061287232 checkpoint_manager.py:1560] [process=4][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0421 12:27:04.847877 134612027619072 async_checkpointer.py:280] [process=4][thread=save_finalize] Waiting for background save thread=async_save.
I0421 12:27:04.848043 134742061287232 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260421_114057/linen_xpk_feat_nnx_post_train_fixes_20260421_114057_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776774423.7308483, 'wait_for_prev_duration_secs': 5.984306335449219e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776774423.7328992, 'checkpointer_blocking_duration_secs': 1.0646858215332031, 'get_old_steps_start_time': 1776774424.7976103, 'get_old_steps_duration_secs': 4.100799560546875e-05, 'checkpoint_manager_blocking_start_time': 1776774423.7284756, 'checkpoint_manager_blocking_duration_secs': 1.1195247173309326}
I0421 12:27:04.848251 134742061287232 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
I0421 12:27:04.848306 134742061287232 max_utils.py:750] Memstats: After params initialized:
I0421 12:27:04.848360 134742061287232 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_16(process=4,(0,4,0,0))
I0421 12:27:04.848393 134742061287232 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_17(process=4,(1,4,0,0))
I0421 12:27:04.848421 134742061287232 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_20(process=4,(0,5,0,0))
I0421 12:27:04.848445 134742061287232 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_21(process=4,(1,5,0,0))
I0421 12:27:05.199149 134742061287232 metric_logger.py:196] completed step: 0, seconds: 60.704, TFLOP/s/device: 0.224, Tokens/s/device: 33.737, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0421 12:27:05.359258 134742061287232 metric_logger.py:196] completed step: 1, seconds: 1.707, TFLOP/s/device: 7.962, Tokens/s/device: 1200.107, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0421 12:27:05.783764 134742061287232 metric_logger.py:196] completed step: 2, seconds: 0.034, TFLOP/s/device: 402.355, Tokens/s/device: 60647.339, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28792.854
I0421 12:27:05.920216 134742061287232 metric_logger.py:196] completed step: 3, seconds: 0.424, TFLOP/s/device: 32.012, Tokens/s/device: 4825.227, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 17001.285
I0421 12:27:06.196597 134742061287232 metric_logger.py:196] completed step: 4, seconds: 0.143, TFLOP/s/device: 95.147, Tokens/s/device: 14341.536, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10778.796
I0421 12:27:06.207671 134742061287232 metric_logger.py:196] completed step: 5, seconds: 0.136, TFLOP/s/device: 99.743, Tokens/s/device: 15034.282, total_weights: 65536, loss: 8.900, lm_loss: 8.900, perplexity: 7335.420
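The per-step throughput lines are internally consistent: total_weights is 65536 tokens per step (global batch 32 x sequence length 2048) spread over 32 devices, so Tokens/s/device is 2048 divided by the step time. A quick check against step 1, with the small gap explained by the rounded step seconds:

tokens_per_step, devices = 65536, 32
step_seconds = 1.707                             # step 1 above, rounded in the log
print(tokens_per_step / devices / step_seconds)  # ~1199.8 vs logged 1200.107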
I0421 12:27:07.720653 2916 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0421 12:27:10.240420 134615797348096 array_metadata_store.py:203] [process=4][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260421_114057/linen_xpk_feat_nnx_post_train_fixes_20260421_114057_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_4
I0421 12:27:30.562119 134742061287232 metric_logger.py:196] completed step: 6, seconds: 0.277, TFLOP/s/device: 48.997, Tokens/s/device: 7385.370, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.121
I0421 12:27:30.698309 134742061287232 metric_logger.py:196] completed step: 7, seconds: 24.235, TFLOP/s/device: 0.561, Tokens/s/device: 84.507, total_weights: 65536, loss: 8.393, lm_loss: 8.393, perplexity: 4417.746
I0421 12:27:30.834410 134742061287232 metric_logger.py:196] completed step: 8, seconds: 0.130, TFLOP/s/device: 104.276, Tokens/s/device: 15717.575, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3881.893
I0421 12:27:30.970272 134742061287232 checkpointing.py:794] Waiting for step 9 to finish before checkpoint...
I0421 12:27:30.973840 134742061287232 checkpointing.py:798] Waited 0.003587484359741211 seconds for step 9 to finish before starting checkpointing.
I0421 12:27:30.976873 134742061287232 checkpoint_manager.py:2020] [process=4][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0421 12:27:37.450272 134612044404480 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/gbytes_per_sec: 47.887 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 32.986246824264526 s) (per-host)
I0421 12:27:37.450381 134612044404480 async_checkpointer.py:90] [process=4][thread=async_save] 3 Handler Commit operations completed. Time taken: 32.652833s.
I0421 12:27:47.197978 134612044404480 async_checkpointer.py:160] [process=4][thread=async_save] Background save thread done. Time taken: 42.400414s.
I0421 12:27:47.198294 134612027619072 async_checkpointer.py:288] [process=4][thread=save_finalize] Done with waiting for background save thread=async_save.
I0421 12:27:47.198420 134612027619072 async_checkpointer.py:298] [process=4][thread=save_finalize] No errors found in background save thread=async_save.
I0421 12:27:47.198474 134612027619072 checkpoint_manager.py:2137] [process=4][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0421 12:27:47.200707 134612027619072 checkpoint_manager.py:2146] [process=4][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0421 12:27:47.200902 134742061287232 checkpoint_manager.py:2032] [process=4][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0421 12:27:47.201065 134742061287232 checkpoint_manager.py:1452] Waiting for previous save to complete took 16.224190 seconds. If this number is high, consider checkpointing less frequently.
I0421 12:27:47.203682 134742061287232 checkpoint_manager.py:1512] [process=4] Saving checkpoint at step 9
I0421 12:27:47.205861 134742061287232 event_tracking.py:70] [process=4] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260421_114057/linen_xpk_feat_nnx_post_train_fixes_20260421_114057_13_scan_layers_false/checkpoints/9.
I0421 12:27:47.515524 134742061287232 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0421 12:27:47.515707 134742061287232 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0421 12:27:47.639835 134742061287232 base_pytree_checkpoint_handler.py:154] [process=4][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.125875s
I0421 12:27:47.640016 134742061287232 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/blocking_gbytes_per_sec: 9.773 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.15784764289855957 s) (per-host)
I0421 12:27:47.640065 134742061287232 base_pytree_checkpoint_handler.py:768] [process=4][thread=MainThread] Initiated Pytree async_save. Time taken: 0.157907s (batch_requests_ready=0.016138s, total_serialization_initiated=0.141702s, others=0.000067s)
I0421 12:27:47.640371 134742061287232 composite_checkpoint_handler.py:715] [process=4][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.162152s (all_items=0.000015s, per_item={'items': '0.00001454'}, temp_paths=0.162138)
I0421 12:27:47.641143 134742061287232 event_tracking.py:125] [process=4] [async] Finished blocking save in 0.44 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260421_114057/linen_xpk_feat_nnx_post_train_fixes_20260421_114057_13_scan_layers_false/checkpoints/9.
I0421 12:27:47.641523 134612027619072 async_checkpointer.py:76] [process=4][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-21 12:47:47.641485
I0421 12:27:47.654712 134742061287232 checkpoint_manager.py:1560] [process=4][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0421 12:27:47.655022 134611522811648 async_checkpointer.py:280] [process=4][thread=save_finalize] Waiting for background save thread=async_save.
I0421 12:27:47.655185 134742061287232 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260421_114057/linen_xpk_feat_nnx_post_train_fixes_20260421_114057_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776774450.9768426, 'wait_for_prev_duration_secs': 16.22418975830078, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776774467.2037225, 'checkpointer_blocking_duration_secs': 0.43796348571777344, 'get_old_steps_start_time': 1776774467.641709, 'get_old_steps_duration_secs': 3.123283386230469e-05, 'checkpoint_manager_blocking_start_time': 1776774450.974734, 'checkpoint_manager_blocking_duration_secs': 16.68041706085205}
I0421 12:27:47.655396 134742061287232 checkpointing.py:409] Started an asynchronous checkpoint save for step 9
I0421 12:27:47.655445 134742061287232 checkpoint_manager.py:2020] [process=4][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0421 12:27:53.613857 134615797348096 array_metadata_store.py:203] [process=4][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260421_114057/linen_xpk_feat_nnx_post_train_fixes_20260421_114057_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_4
I0421 12:28:30.519262 134612027619072 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/gbytes_per_sec: 36.703 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 43.03704571723938 s) (per-host)
I0421 12:28:30.519395 134612027619072 async_checkpointer.py:90] [process=4][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.877752s.
I0421 12:28:39.688785 134612027619072 async_checkpointer.py:160] [process=4][thread=async_save] Background save thread done. Time taken: 52.047127s.
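The step-9 background write bandwidth above is consistent with the payload size, up to rounding of the 1.5 GiB figure:

payload_mib = 1.5 * 1024        # "total gbytes: 1.5 GiB" (rounded) per host
elapsed_s = 43.03704571723938   # step-9 background write, logged above
print(payload_mib / elapsed_s)  # ~35.7 MiB/s vs logged 36.703 MiB/s
# 36.703 MiB/s * 43.037 s is ~1.54 GiB, which rounds to the logged 1.5 GiB.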
I0421 12:28:39.689085 134611522811648 async_checkpointer.py:288] [process=4][thread=save_finalize] Done with waiting for background save thread=async_save.
I0421 12:28:39.689218 134611522811648 async_checkpointer.py:298] [process=4][thread=save_finalize] No errors found in background save thread=async_save.
I0421 12:28:39.689270 134611522811648 checkpoint_manager.py:2137] [process=4][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0421 12:28:39.691879 134611522811648 checkpoint_manager.py:2146] [process=4][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0421 12:28:39.692065 134742061287232 checkpoint_manager.py:2032] [process=4][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0421 12:28:39.692228 134742061287232 checkpoint_manager.py:2009] [process=4][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0421 12:28:39.693126 134742061287232 metric_logger.py:196] completed step: 9, seconds: 0.136, TFLOP/s/device: 99.832, Tokens/s/device: 15047.759, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3598.472
Per train step: Total TFLOPs: 13.59 split as 93.93% learnable weight flops and 6.07% attention flops
XPK End: Tue Apr 21 12:28:50 UTC 2026
EXIT_CODE=0
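A closing sanity check on the flops summary: dividing the per-step TFLOPs figure by a step's wall time reproduces that step's logged TFLOP/s/device (every logged step matches, which suggests the "Total TFLOPs" figure is per device). Against step 9:

tflops_per_step = 13.59                # "Per train step: Total TFLOPs" above
step_seconds = 0.136                   # step 9, rounded in the log
print(tflops_per_step / step_seconds)  # ~99.93 vs logged 99.832 TFLOP/s/device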