XPK Start: Wed Apr 22 22:08:18 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
`rope_parameters`'s factor field must be a float >= 1, got 40
`rope_parameters`'s beta_fast field must be a float, got 32
`rope_parameters`'s beta_slow field must be a float, got 1
DeepseekV32Config got `key=rope_scaling` in kwargs but hasn't set it as attribute. For RoPE standardization you need to set `self.rope_parameters` in model's config.
2026-04-22 22:08:43.424585: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0422 22:08:43.636066 132790881523520 max_utils.py:273] Attempting to initialize the jax distributed system...
I0422 22:08:52.678119 132790881523520 distributed.py:149] Starting JAX distributed service on [::]:8482
I0422 22:08:52.683667 132790881523520 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-gm79g-slice-job-0-0.mt-13-scan-layers-false-gm79g:8482
I0422 22:08:53.779313 132790881523520 max_utils.py:284] Jax distributed system initialized!
I0422 22:08:59.202822 132790881523520 max_utils.py:800] System Information: Jax Version: 0.9.2
I0422 22:08:59.202928 132790881523520 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0422 22:08:59.202972 132790881523520 max_utils.py:802] System Information: Jax Backend: PJRT C API TFRT TPU v6 lite Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0422 22:08:59.203007 132790881523520 train_utils.py:361] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0422 22:08:59.894835 132790881523520 maxtext_utils.py:1565] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0422 22:08:59.895133 132790881523520 checkpointing.py:677] Setting up checkpoint logger...
I0422 22:08:59.895194 132790881523520 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0422 22:08:59.895239 132790881523520 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0422 22:08:59.895571 132790881523520 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x78c511a51d00>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0422 22:09:02.734861 132790881523520 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0422 22:09:02.735163 132790881523520 checkpoint_manager.py:708] [process=5][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x78b1c4223620>}, handler_registry=None
I0422 22:09:02.735401 132790881523520 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x78b1c4223620>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
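For context, the "jax distributed system" records above boil down to a single JAX call made on every host before any collectives run. A minimal sketch in Python; the process count is an assumption (the log only shows process=5 and the service address on port 8482), and on Cloud TPU initialize() can normally infer all of these arguments from the environment:

    import jax

    # Each host joins the same coordination service before compilation starts.
    # Arguments spelled out for illustration; num_processes=8 is an assumption
    # (32 chips spread across 8 hosts), not a value from this log.
    jax.distributed.initialize(
        coordinator_address="mt-13-scan-layers-false-gm79g-slice-job-0-0.mt-13-scan-layers-false-gm79g:8482",
        num_processes=8,
        process_id=5,  # this log was emitted by process=5
    )
    print(jax.process_index(), jax.device_count())  # e.g. 5, 32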
I0422 22:09:02.735450 132790881523520 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x78c3d87994f0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0422 22:09:02.735486 132790881523520 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x78b1c4223620>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x78b1c4223620>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x78c3d87994f0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x78c3d87994f0>}).
I0422 22:09:02.735867 132790881523520 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0422 22:09:02.735938 132790881523520 async_checkpointer.py:192] [process=5][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x78b1945732e0> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0422 22:09:03.943409 132790881523520 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260422_212603/linen_xpk_test_pipeline_scan_nnx_20260422_212603_13_scan_layers_false/checkpoints
I0422 22:09:04.341859 132790881523520 checkpoint_manager.py:929] [process=5][thread=MainThread] CheckpointManager created, primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260422_212603/linen_xpk_test_pipeline_scan_nnx_20260422_212603_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x78b1c4225400>
I0422 22:09:04.342025 132790881523520 checkpointing.py:301] Checkpoint manager created!
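The manager created above is orbax-checkpoint 0.11.34 configured for async saves with a fixed save interval (FixedIntervalPolicy(interval=10)). A minimal sketch of the equivalent public-API setup, with a hypothetical local directory standing in for the gs:// root; this is not MaxText's actual wiring:

    import jax.numpy as jnp
    import orbax.checkpoint as ocp

    options = ocp.CheckpointManagerOptions(
        save_interval_steps=10,           # cf. FixedIntervalPolicy(interval=10) above
        enable_async_checkpointing=True,  # async saves, as in this run
    )
    mngr = ocp.CheckpointManager("/tmp/ckpt_demo", options=options)  # hypothetical path

    state = {"w": jnp.zeros((1024, 1024))}           # stand-in for the real train state
    mngr.save(0, args=ocp.args.StandardSave(state))  # returns after the blocking D2H phase
    mngr.wait_until_finished()                       # joins the background save thread

save() returning after only the blocking device-to-host phase is what lets training steps below overlap with the background GCS write.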
I0422 22:09:05.291792 132790881523520 nnx_wrappers.py:453] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0422 22:09:05.291900 132790881523520 nnx_wrappers.py:453] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 22:09:05.672630 132790881523520 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 22:09:05.672733 132790881523520 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 22:09:05.689048 132790881523520 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 22:09:05.689112 132790881523520 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 22:09:05.719743 132790881523520 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 22:09:05.719812 132790881523520 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 22:09:05.736145 132790881523520 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 22:09:05.736209 132790881523520 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 22:09:05.752596 132790881523520 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 22:09:05.752669 132790881523520 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 22:09:05.782287 132790881523520 attentions.py:1198] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0422 22:09:05.782361 132790881523520 attentions.py:1198] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 22:09:05.804130 132790881523520 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0422 22:09:05.804195 132790881523520 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0422 22:09:08.721453 132790881523520 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0422 22:09:08.721588 132790881523520 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
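Each Logical/Physical pair above shows a named logical axis spec being resolved to mesh axes: here only the batch axis lands on the 32-way 'fsdp' mesh axis. A minimal sketch of that resolution, where the rules dict is a simplified stand-in for the run's logical-axis rules (not MaxText's actual table):

    import jax
    import numpy as np
    from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

    # Simplified stand-in for the run's logical-to-physical axis rules.
    rules = {
        "activation_batch": "fsdp",
        "activation_norm_length": None,
        "activation_embed": None,
    }

    def logical_to_physical(logical_axes):
        """E.g. ('activation_batch', ...) -> P('fsdp', None, ...)."""
        return P(*(rules.get(name) for name in logical_axes))

    mesh = Mesh(np.array(jax.devices()).reshape(-1), ("fsdp",))  # 1-D fsdp mesh
    spec = logical_to_physical(
        ("activation_batch", "activation_norm_length", "activation_embed"))
    sharding = NamedSharding(mesh, spec)  # batch dim split across all devices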
fsdp: 32
I0422 22:09:15.961362 132790881523520 maxtext_utils.py:1668] params/params/decoder/decoder_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.961494 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_0/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.961549 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_0/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.961609 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_0/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.961663 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_0/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.961705 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_0/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.961760 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_0/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.961812 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_0/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.961852 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_0/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.961887 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_0/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.961921 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_1/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.961956 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_1/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.961991 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_1/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.962028 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_1/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.962067 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_1/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.962123 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_1/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.962163 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_1/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.962198 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_1/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.962231 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_1/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.962265 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_10/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.962298 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_10/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.962327 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_10/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.962357 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_10/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.962386 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_10/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.962417 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_10/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.962449 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_10/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.962479 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_10/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.962510 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_10/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.962540 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_11/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.962570 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_11/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.962600 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_11/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.962628 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_11/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.962680 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_11/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.962719 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_11/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.962752 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_11/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.962784 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_11/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.962816 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_11/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.962847 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_12/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.962878 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_12/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.962909 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_12/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.962940 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_12/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.962969 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_12/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.963000 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_12/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.963037 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_12/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.963068 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_12/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.963098 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_12/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.963129 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_13/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.963158 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_13/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.963187 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_13/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.963216 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_13/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.963245 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_13/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.963275 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_13/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.963306 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_13/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.963337 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_13/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.963369 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_13/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.963400 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_14/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.963430 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_14/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.963461 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_14/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.963490 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_14/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.963518 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_14/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.963549 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_14/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.963579 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_14/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.963609 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_14/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.963639 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_14/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.963683 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_15/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.963716 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_15/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.963746 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_15/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.963774 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_15/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.963802 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_15/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.963832 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_15/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.963863 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_15/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.963893 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_15/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.963923 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_15/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.963953 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_2/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.963984 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_2/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.964018 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_2/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.964048 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_2/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.964076 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_2/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.964107 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_2/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.964137 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_2/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.964168 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_2/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.964199 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_2/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.964228 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_3/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.964260 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_3/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.964291 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_3/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.964321 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_3/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.964349 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_3/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.964380 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_3/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.964410 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_3/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.964439 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_3/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.964468 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_3/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.964498 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_4/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.964527 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_4/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.964557 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_4/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.964585 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_4/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.964614 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_4/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.964644 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_4/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.964685 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_4/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.964716 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_4/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.964746 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_4/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.964776 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_5/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.964805 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_5/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.964835 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_5/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.964863 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_5/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.964890 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_5/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.964920 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_5/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.964950 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_5/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.964979 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_5/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.965013 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_5/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.965043 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_6/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.965072 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_6/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.965102 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_6/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.965130 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_6/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.965158 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_6/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.965188 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_6/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.965217 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_6/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.965247 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_6/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.965277 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_6/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.965307 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_7/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.965337 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_7/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.965366 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_7/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.965394 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_7/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.965422 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_7/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.965452 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_7/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.965482 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_7/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.965513 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_7/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.965543 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_7/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.965573 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_8/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.965603 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_8/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.965633 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_8/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.965670 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_8/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.965700 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_8/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.965731 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_8/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.965762 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_8/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.965793 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_8/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.965823 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_8/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.965853 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_9/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.965883 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_9/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0422 22:09:15.965913 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_9/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0422 22:09:15.965941 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_9/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.965969 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_9/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0422 22:09:15.966000 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_9/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.966035 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_9/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0422 22:09:15.966066 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_9/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0422 22:09:15.966096 132790881523520 maxtext_utils.py:1668] params/params/decoder/layers_9/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0422 22:09:15.966145 132790881523520 maxtext_utils.py:1668] params/params/decoder/logits_dense/kernel Shape: float32[2048,32000] Logical: P('embed_vocab', 'vocab') Physical: ('fsdp', None)
I0422 22:09:15.966192 132790881523520 maxtext_utils.py:1668] params/params/token_embedder/embedding Shape: float32[32000,2048] Logical: P('vocab', 'embed_vocab') Physical: (None, 'fsdp')
I0422 22:09:20.538702 132790881523520 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 22:09:20.538794 132790881523520 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0422 22:09:20.554232 132790881523520 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 22:09:20.554292 132790881523520 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0422 22:10:17.316043 132790881523520 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0422 22:10:17.317266 132790881523520 metric_logger.py:301] number parameters: 1.104 billion
I0422 22:11:18.365842 132790881523520 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0422 22:11:18.603176 132790881523520 checkpointing.py:776] Waited 0.2373189926147461 seconds for step 0 to finish before starting checkpointing.
I0422 22:11:18.607006 132790881523520 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0422 22:11:18.609084 132790881523520 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 0
I0422 22:11:18.610846 132790881523520 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260422_212603/linen_xpk_test_pipeline_scan_nnx_20260422_212603_13_scan_layers_false/checkpoints/0.
I0422 22:11:19.057053 132790881523520 signaling_client.py:364] Using JaxDistributedSignalingClient
I0422 22:11:19.058082 132790881523520 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
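The "number parameters: 1.104 billion" record above can be checked directly against the shapes in the parameter listing (16 decoder layers, embed 2048, mlp 7168, 16 heads of dim 128, vocab 32000). A worked tally:

    # Parameter count from the logged shapes.
    embed, mlp, vocab, heads, head_dim, layers = 2048, 7168, 32000, 16, 128, 16

    per_layer = (
        3 * embed * mlp                 # mlp wi_0, wi_1, wo kernels
        + 4 * embed * heads * head_dim  # query, key, value, out projections
        + 2 * embed                     # pre/post layer-norm scales
    )
    total = (
        layers * per_layer
        + 2 * vocab * embed             # token_embedder + logits_dense
        + embed                         # decoder_norm scale
    )
    print(total)        # 1104218112
    print(total / 1e9)  # ~1.104 billion, matching metric_logger above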
I0422 22:11:19.058208 132790881523520 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 22:11:19.351408 132790881523520 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.294875s
I0422 22:11:19.351574 132790881523520 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 4.700 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.3281729221343994 s) (per-host)
I0422 22:11:19.351624 132790881523520 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.328232s (batch_requests_ready=0.016859s, total_serialization_initiated=0.311306s, others=0.000067s)
I0422 22:11:19.351974 132790881523520 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.332767s (all_items=0.000017s, per_item={'items': '0.00001693'}, temp_paths=0.332750)
I0422 22:11:19.352873 132790881523520 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.74 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260422_212603/linen_xpk_test_pipeline_scan_nnx_20260422_212603_13_scan_layers_false/checkpoints/0.
I0422 22:11:19.353180 132666402096896 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 22:31:19.353149
I0422 22:11:19.390913 132790881523520 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 22:11:19.391252 132665902626560 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0422 22:11:19.391405 132790881523520 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260422_212603/linen_xpk_test_pipeline_scan_nnx_20260422_212603_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776895878.6069882, 'wait_for_prev_duration_secs': 6.031990051269531e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776895878.609167, 'checkpointer_blocking_duration_secs': 0.7441713809967041, 'get_old_steps_start_time': 1776895879.3533638, 'get_old_steps_duration_secs': 2.7179718017578125e-05, 'checkpoint_manager_blocking_start_time': 1776895878.6046364, 'checkpoint_manager_blocking_duration_secs': 0.7867317199707031}
I0422 22:11:19.391603 132790881523520 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0422 22:11:19.391669 132790881523520 max_utils.py:750] Memstats: After params initialized:
I0422 22:11:19.391722 132790881523520 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_18(process=5,(2,4,0,0))
I0422 22:11:19.391755 132790881523520 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_19(process=5,(3,4,0,0))
I0422 22:11:19.391783 132790881523520 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_22(process=5,(2,5,0,0))
I0422 22:11:19.391817 132790881523520 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_23(process=5,(3,5,0,0))
I0422 22:11:19.778739 132790881523520 metric_logger.py:196] completed step: 0, seconds: 61.048, TFLOP/s/device: 0.223, Tokens/s/device: 33.547, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0422 22:11:19.903951 132790881523520 metric_logger.py:196] completed step: 1, seconds: 1.404, TFLOP/s/device: 9.676, Tokens/s/device: 1458.457, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0422 22:11:20.524675 132790881523520 metric_logger.py:196] completed step: 2, seconds: 0.014, TFLOP/s/device: 962.739, Tokens/s/device: 145114.434, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28794.738
I0422 22:11:20.660753 132790881523520 metric_logger.py:196] completed step: 3, seconds: 0.605, TFLOP/s/device: 22.441, Tokens/s/device: 3382.535, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 16998.947
I0422 22:11:20.937418 132790881523520 metric_logger.py:196] completed step: 4, seconds: 0.142, TFLOP/s/device: 95.832, Tokens/s/device: 14444.813, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10779.095
I0422 22:11:20.948811 132790881523520 metric_logger.py:196] completed step: 5, seconds: 0.136, TFLOP/s/device: 99.804, Tokens/s/device: 15043.559, total_weights: 65536, loss: 8.901, lm_loss: 8.901, perplexity: 7336.565
I0422 22:11:22.597553 2750 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0422 22:11:24.697883 132666382157568 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260422_212603/linen_xpk_test_pipeline_scan_nnx_20260422_212603_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_5
I0422 22:11:45.198433 132790881523520 metric_logger.py:196] completed step: 6, seconds: 0.278, TFLOP/s/device: 48.944, Tokens/s/device: 7377.389, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.800
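The perplexity column in these metric lines is just the exponential of the loss; for step 0:

    import math

    loss = 10.877          # step-0 loss as printed above (rounded)
    print(math.exp(loss))  # ~52946; matches the logged 52938.617
                           # up to rounding of the printed loss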
I0422 22:11:45.334501 132790881523520 metric_logger.py:196] completed step: 7, seconds: 24.118, TFLOP/s/device: 0.563, Tokens/s/device: 84.915, total_weights: 65536, loss: 8.393, lm_loss: 8.393, perplexity: 4418.222
I0422 22:11:45.470712 132790881523520 metric_logger.py:196] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.826, Tokens/s/device: 14443.998, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.698
I0422 22:11:45.606165 132790881523520 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0422 22:11:45.609585 132790881523520 checkpointing.py:776] Waited 0.0034394264221191406 seconds for step 9 to finish before starting checkpointing.
I0422 22:11:45.612488 132790881523520 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 22:11:52.621788 132666402096896 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 47.014 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 33.5983350276947 s) (per-host)
I0422 22:11:52.621900 132666402096896 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 33.268594s.
I0422 22:12:03.225643 132666402096896 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 43.872320s.
I0422 22:12:03.225979 132665902626560 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 22:12:03.226094 132665902626560 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0422 22:12:03.226144 132665902626560 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0422 22:12:03.227557 132665902626560 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0422 22:12:03.227721 132790881523520 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0422 22:12:03.227861 132790881523520 checkpoint_manager.py:1452] Waiting for previous save to complete took 17.615377 seconds. If this number is high, consider checkpointing less frequently.
I0422 22:12:03.230584 132790881523520 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 9
I0422 22:12:03.232700 132790881523520 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260422_212603/linen_xpk_test_pipeline_scan_nnx_20260422_212603_13_scan_layers_false/checkpoints/9.
I0422 22:12:03.994447 132790881523520 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0422 22:12:03.994616 132790881523520 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 22:12:04.118481 132790881523520 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.125352s
I0422 22:12:04.118640 132790881523520 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 9.788 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.1576061248779297 s) (per-host)
I0422 22:12:04.118714 132790881523520 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.157687s (batch_requests_ready=0.016431s, total_serialization_initiated=0.141168s, others=0.000089s)
I0422 22:12:04.119011 132790881523520 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.161924s (all_items=0.000015s, per_item={'items': '0.00001502'}, temp_paths=0.161909)
I0422 22:12:04.119760 132790881523520 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.89 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260422_212603/linen_xpk_test_pipeline_scan_nnx_20260422_212603_13_scan_layers_false/checkpoints/9.
I0422 22:12:04.120131 132666382157568 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 22:32:04.120090
I0422 22:12:04.130845 132790881523520 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 22:12:04.131124 132665902626560 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0422 22:12:04.131282 132790881523520 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260422_212603/linen_xpk_test_pipeline_scan_nnx_20260422_212603_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776895905.612456, 'wait_for_prev_duration_secs': 17.615376949310303, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776895923.2306232, 'checkpointer_blocking_duration_secs': 0.889655351638794, 'get_old_steps_start_time': 1776895924.1203017, 'get_old_steps_duration_secs': 2.8371810913085938e-05, 'checkpoint_manager_blocking_start_time': 1776895905.6104734, 'checkpoint_manager_blocking_duration_secs': 18.52077293395996}
I0422 22:12:04.131486 132790881523520 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0422 22:12:04.131532 132790881523520 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 22:12:11.063121 132666393704192 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_test_pipeline_scan_nnx_20260422_212603/linen_xpk_test_pipeline_scan_nnx_20260422_212603_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_5
I0422 22:12:46.506761 132666382157568 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 37.127 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.54568600654602 s) (per-host)
I0422 22:12:46.506888 132666382157568 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.386644s.
I0422 22:12:57.107393 132666382157568 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 52.987133s.
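The ~53 s wall time of the step-9 save is dominated by the background GCS write, and is roughly consistent with the figures logged above: about 1.5 GiB per host at the sustained ~37 MiB/s write rate. A quick check (both inputs are the rounded values from the log):

    payload_mib = 1.5 * 1024         # per-host checkpoint bytes, as logged (rounded)
    rate_mib_s = 37.127              # /jax/orbax/write/gbytes_per_sec line above
    print(payload_mib / rate_mib_s)  # ~41.4 s, consistent with the 42.5 s elapsed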
I0422 22:12:57.107678 132665902626560 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 22:12:57.107811 132665902626560 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0422 22:12:57.107868 132665902626560 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0422 22:12:57.109489 132665902626560 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0422 22:12:57.109688 132790881523520 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0422 22:12:57.109839 132790881523520 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0422 22:12:57.110809 132790881523520 metric_logger.py:196] completed step: 9, seconds: 0.136, TFLOP/s/device: 99.917, Tokens/s/device: 15060.595, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3598.762
Per train step: Total TFLOPs: 13.59 split as 93.93% learnable weight flops and 6.07% attention flops
XPK End: Wed Apr 22 22:13:09 UTC 2026
EXIT_CODE=0
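The closing throughput summary ties back to the per-step metrics: 13.59 TFLOPs per train step (evidently per device, since 13.59 / 0.136 s reproduces the logged TFLOP/s/device) and 65536 tokens per step over 32 devices give the step-9 rates:

    step_seconds = 0.136                # step 9, as logged
    tflops_per_step_per_device = 13.59  # "Per train step: Total TFLOPs" above
    tokens_per_device = 65536 / 32      # total_weights spread over 32 devices

    print(tflops_per_step_per_device / step_seconds)  # ~99.9 TFLOP/s/device
    print(tokens_per_device / step_seconds)           # ~15059 tokens/s/device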