`feat/nnx-post-train-fixes`

| Metric | Linen d8cde296b | NNX d8cde296b | Diff (NNX − Linen) |
|---|---|---|---|
| Parameters | 1.104 billion | 1.104 billion | — |
| Final loss | 8.7100 | 8.6680 | -0.042 |
| TFLOP/s | 115.919 | 114.962 | -0.957 |
| Tok/s | 17472.6 | 17328.3 | -144.29 |
| Avg s/step | 2.851 | 2.590 | -0.261 |
| Memory % | 5.09 | 5.09 | 0 |
| JAX | 0.9.2 | 0.9.2 | — |
Diff = NNX value − Linen value. Lower is better for final loss and avg s/step (NNX improved on both); higher is better for TFLOP/s and tok/s (NNX regressed slightly on both).
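As a sanity check, the Diff column can be recomputed from the two value columns. A minimal sketch (values copied from the table above; the dictionary keys are my own naming). Note that recomputing tok/s from the rounded table values gives −144.3 rather than the table's −144.29, which presumably comes from unrounded measurements.

```python
# Recompute the Diff column (NNX − Linen) for the benchmark table.
# Key names are illustrative; values are copied from the table.
linen = {"final_loss": 8.7100, "tflop_s": 115.919, "tok_s": 17472.6, "avg_s_step": 2.851}
nnx   = {"final_loss": 8.6680, "tflop_s": 114.962, "tok_s": 17328.3, "avg_s_step": 2.590}

diff = {k: round(nnx[k] - linen[k], 3) for k in linen}
print(diff)  # diff["final_loss"] == -0.042
```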
2026-04-16 18:32:32.696565: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303) I0416 18:32:32.813620 128615849782400 max_utils.py:238] Skipping jax distributed system due to skip_jax_distributed_system=True flag. I0416 18:33:29.287592 128615849782400 max_utils.py:800] System Information: Jax Version: 0.9.2 I0416 18:33:29.287696 128615849782400 max_utils.py:801] System Information: Jaxlib Version: 0.9.2 I0416 18:33:29.287730 128615849782400 max_utils.py:802] System Information: Jax Backend: PJRT C API TFRT TPU v6 lite Built on Apr 6 2026 20:48:10 (1775533690) cl/895581894 I0416 18:33:29.287754 128615849782400 train_utils.py:364] WARNING: 'dataset_path' might be pointing your local file system I0416 18:33:29.287832 128615849782400 train.py:811] [DECOUPLED NO-OP] skipping cloud diagnostics wrapper. W0416 18:33:29.379647 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. I0416 18:33:29.783739 128615849782400 maxtext_utils.py:1687] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1) I0416 18:33:29.783910 128615849782400 maxtext_utils.py:1687] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1) I0416 18:33:29.784107 128615849782400 checkpointing.py:688] Setting up checkpoint logger... 
I0416 18:33:29.784149 128615849782400 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True I0416 18:33:29.784189 128615849782400 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None I0416 18:33:29.784634 128615849782400 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x74f91f5f5010>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB) I0416 18:33:32.078688 128615849782400 checkpointing.py:266] Enabling policy for fixed interval checkpointing. I0416 18:33:32.078862 128615849782400 checkpoint_manager.py:702] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x74f2a073d1c0>}, handler_registry=None I0416 18:33:32.079120 128615849782400 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x74f2a073d1c0>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`. I0416 18:33:32.079162 128615849782400 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". 
Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x74f2a073e030>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`. I0416 18:33:32.079190 128615849782400 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x74f2a073d1c0>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x74f2a073d1c0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x74f2a073e030>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x74f2a073e030>}). 
I0416 18:33:32.079471 128615849782400 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28 I0416 18:33:32.079518 128615849782400 async_checkpointer.py:177] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>.<lambda> at 0x74f2a072dee0> timeout: 600 secs and primary_host=0 for async checkpoint writes I0416 18:33:32.203527 128615849782400 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints I0416 18:33:32.203750 128615849782400 checkpoint_manager.py:921] [process=0][thread=MainThread] CheckpointManager created, primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 
0x74f2a073c4d0> I0416 18:33:32.203827 128615849782400 checkpointing.py:302] Checkpoint manager created! I0416 18:33:32.282640 128615849782400 dataset_info.py:707] Load dataset info from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0 I0416 18:33:32.285649 128615849782400 reader.py:262] Creating a tf.data.Dataset reading 8 files located in folders: tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0. I0416 18:33:32.338658 128615849782400 logging_logger.py:49] Constructing tf.data.Dataset __local_c4_builder for split train, from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0 I0416 18:33:32.366619 128615849782400 tokenizer.py:245] Tokenizer path: src/maxtext/assets/tokenizers/tokenizer.llama2 I0416 18:33:32.366682 128615849782400 tokenizer.py:187] Loading sentencepiece tokenizer: src/maxtext/assets/tokenizers/tokenizer.llama2 I0416 18:33:32.905754 128615849782400 dataset_info.py:707] Load dataset info from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0 I0416 18:33:32.907323 128615849782400 reader.py:262] Creating a tf.data.Dataset reading 2 files located in folders: tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0. I0416 18:33:32.923310 128615849782400 logging_logger.py:49] Constructing tf.data.Dataset __local_c4_builder for split validation, from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0 I0416 18:33:32.925868 128615849782400 tokenizer.py:245] Tokenizer path: src/maxtext/assets/tokenizers/tokenizer.llama2 I0416 18:33:32.925917 128615849782400 tokenizer.py:187] Loading sentencepiece tokenizer: src/maxtext/assets/tokenizers/tokenizer.llama2 I0416 18:33:33.625771 128615849782400 nnx_wrappers.py:437] Unknown Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_norm_length', 'activation_embed'). I0416 18:33:33.625883 128615849782400 nnx_wrappers.py:437] Unknown Physical: bfloat16[8,2048,2048]....................................... 
('fsdp', None, None). I0416 18:33:33.727611 128615849782400 attentions.py:1088] attentions/inputs_q Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed'). I0416 18:33:33.727688 128615849782400 attentions.py:1088] attentions/inputs_q Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None). I0416 18:33:33.742826 128615849782400 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed'). I0416 18:33:33.742873 128615849782400 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None). I0416 18:33:33.764641 128615849782400 attentions.py:1154] attentions/query Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim'). I0416 18:33:33.764696 128615849782400 attentions.py:1154] attentions/query Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None). I0416 18:33:33.779818 128615849782400 attentions.py:1155] attentions/key Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim'). I0416 18:33:33.779868 128615849782400 attentions.py:1155] attentions/key Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None). I0416 18:33:33.794999 128615849782400 attentions.py:1156] attentions/value Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim'). 
I0416 18:33:33.795063 128615849782400 attentions.py:1156] attentions/value Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None). I0416 18:33:33.817439 128615849782400 attentions.py:1197] attentions/out Logical: bfloat16[8,2048,16,128]..................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv'). I0416 18:33:33.817495 128615849782400 attentions.py:1197] attentions/out Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None). I0416 18:33:33.837367 128615849782400 linears.py:525] linears/x Logical: bfloat16[8,2048,7168]....................................... ('activation_batch', 'activation_length', 'activation_mlp'). I0416 18:33:33.837418 128615849782400 linears.py:525] linears/x Physical: bfloat16[8,2048,7168]....................................... ('fsdp', None, None). I0416 18:33:34.027925 128615849782400 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint I0416 18:33:34.028017 128615849782400 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint. W0416 18:33:34.079176 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. [DECOUPLED NO-OP] gcs_storage: using stubs. [DECOUPLED NO-OP] mldiagnostics: using stub. [DECOUPLED NO-OP] mldiagnostics: using stub. [DECOUPLED NO-OP] mldiagnostics: using stub. [DECOUPLED NO-OP] workload_monitor: using stub. [DECOUPLED NO-OP] vertex_tensorboard: using stub. 
fsdp: 8 I0416 18:33:34.471915 128615849782400 maxtext_utils.py:1796] params/params/decoder/decoder_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,) I0416 18:33:34.472023 128615849782400 maxtext_utils.py:1796] params/params/decoder/layers/mlp/wi_0/kernel Shape: float32[2048,16,7168] Logical: P('embed', 'layers', 'mlp') Physical: ('fsdp', None, None) I0416 18:33:34.472084 128615849782400 maxtext_utils.py:1796] params/params/decoder/layers/mlp/wi_1/kernel Shape: float32[2048,16,7168] Logical: P('embed', 'layers', 'mlp') Physical: ('fsdp', None, None) I0416 18:33:34.472131 128615849782400 maxtext_utils.py:1796] params/params/decoder/layers/mlp/wo/kernel Shape: float32[7168,16,2048] Logical: P('mlp', 'layers', 'embed') Physical: (None, None, 'fsdp') I0416 18:33:34.472172 128615849782400 maxtext_utils.py:1796] params/params/decoder/layers/post_self_attention_layer_norm/scale Shape: float32[2048,16] Logical: P('norm', 'layers') Physical: (None, None) I0416 18:33:34.472200 128615849782400 maxtext_utils.py:1796] params/params/decoder/layers/pre_self_attention_layer_norm/scale Shape: float32[2048,16] Logical: P('norm', 'layers') Physical: (None, None) I0416 18:33:34.472239 128615849782400 maxtext_utils.py:1796] params/params/decoder/layers/self_attention/key/kernel Shape: float32[2048,16,16,128] Logical: P('embed', 'layers', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None, None) I0416 18:33:34.472282 128615849782400 maxtext_utils.py:1796] params/params/decoder/layers/self_attention/out/kernel Shape: float32[16,16,128,2048] Logical: P('heads', 'layers', 'kv', 'embed') Physical: (None, None, None, 'fsdp') I0416 18:33:34.472311 128615849782400 maxtext_utils.py:1796] params/params/decoder/layers/self_attention/query/kernel Shape: float32[2048,16,16,128] Logical: P('embed', 'layers', 'q_heads', 'kv') Physical: ('fsdp', None, None, None) I0416 18:33:34.472337 128615849782400 maxtext_utils.py:1796] 
params/params/decoder/layers/self_attention/value/kernel Shape: float32[2048,16,16,128] Logical: P('embed', 'layers', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None, None) I0416 18:33:34.472371 128615849782400 maxtext_utils.py:1796] params/params/decoder/logits_dense/kernel Shape: float32[2048,32000] Logical: P('embed_vocab', 'vocab') Physical: ('fsdp', None) I0416 18:33:34.472404 128615849782400 maxtext_utils.py:1796] params/params/token_embedder/embedding Shape: float32[32000,2048] Logical: P('vocab', 'embed_vocab') Physical: (None, 'fsdp') I0416 18:33:34.904930 128615849782400 train.py:157] train/xent Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length'). I0416 18:33:34.905010 128615849782400 train.py:157] train/xent Physical: float32[8,2048]............................................. ('fsdp', None). I0416 18:33:34.919519 128615849782400 train.py:164] train/z_loss Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length'). I0416 18:33:34.919571 128615849782400 train.py:164] train/z_loss Physical: float32[8,2048]............................................. ('fsdp', None). W0416 18:33:35.214278 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. I0416 18:33:35.336103 128615849782400 max_utils.py:791] Total memory size: 3.6 GB, Output size: 1.5 GB, Temp size: 2.0 GB, Argument size: 1.5 GB, Host temp size: 0.0 GB. I0416 18:33:35.336566 128615849782400 max_utils.py:194] tensorboardX not available; using no-op SummaryWriter. I0416 18:33:35.336706 128615849782400 metric_logger.py:289] number parameters: 1.104 billion W0416 18:33:38.398984 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. 
I0416 18:33:38.535609 128615849782400 checkpointing.py:794] Waiting for step 0 to finish before checkpoint... I0416 18:33:38.656076 128615849782400 checkpointing.py:798] Waited 0.12045049667358398 seconds for step 0 to finish before starting checkpointing. I0416 18:33:38.656486 128615849782400 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning. I0416 18:33:38.656642 128615849782400 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 0 I0416 18:33:38.657015 128615849782400 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0. I0416 18:33:38.741978 128615849782400 signaling_client.py:373] Using ThreadSafeKeyValueSignalingClient I0416 18:33:38.828549 128500732266048 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0 I0416 18:33:38.835808 128615849782400 jax_array_handlers.py:347] Scheduling D2H of 39 prioritized jax.Array. I0416 18:33:38.835895 128615849782400 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False W0416 18:33:39.051300 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 18:33:39.062976 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 18:33:39.083621 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 18:33:39.097491 3987254 pjrt_executable.cc:642] Assume version compatibility. 
PjRt-IFRT does not track XLA executable versions. W0416 18:33:39.103607 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 18:33:39.108774 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 18:33:39.113593 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 18:33:39.118281 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. I0416 18:33:39.597229 128500721780288 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/items I0416 18:33:39.642990 128615849782400 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.807572s I0416 18:33:39.643526 128615849782400 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 13.705 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 900 milliseconds) (per-host) I0416 18:33:39.643668 128615849782400 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.900754s (batch_requests_ready=0.089181s, total_serialization_initiated=0.811135s, others=0.000438s) I0416 18:33:39.643784 128615849782400 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.901335s (all_items=0.000019s, per_item={'items': '0.00001907'}, temp_paths=0.901316) I0416 18:33:39.644594 128500656768576 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started. I0416 18:33:39.644686 128615849782400 async_checkpointer.py:561] Finished blocking save. Time taken: 0.988005s. 
Continuing background save to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0. I0416 18:33:39.644885 128615849782400 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize I0416 18:33:39.645132 128615849782400 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776364418.6564736, 'wait_for_prev_duration_secs': 4.076957702636719e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776364418.6566632, 'checkpointer_blocking_duration_secs': 0.9881381988525391, 'get_old_steps_start_time': 1776364419.64482, 'get_old_steps_duration_secs': 2.86102294921875e-05, 'checkpoint_manager_blocking_start_time': 1776364418.656393, 'checkpoint_manager_blocking_duration_secs': 0.9887127876281738} I0416 18:33:39.645206 128615849782400 checkpointing.py:409] Started an asynchronous checkpoint save for step 0 I0416 18:33:39.645247 128615849782400 max_utils.py:750] Memstats: After params initialized: I0416 18:33:39.645355 128615849782400 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_0(process=0,(0,0,0,0)) I0416 18:33:39.645385 128615849782400 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_1(process=0,(1,0,0,0)) I0416 18:33:39.645408 128615849782400 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_2(process=0,(0,1,0,0)) I0416 18:33:39.645427 128615849782400 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_3(process=0,(1,1,0,0)) I0416 18:33:39.645445 128615849782400 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_4(process=0,(0,2,0,0)) I0416 
18:33:39.645464 128615849782400 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_5(process=0,(1,2,0,0)) I0416 18:33:39.645482 128615849782400 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_6(process=0,(0,3,0,0)) I0416 18:33:39.645499 128615849782400 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_7(process=0,(1,3,0,0)) W0416 18:33:39.651670 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 18:33:39.657185 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 18:33:39.662041 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. I0416 18:33:39.745607 128500677740096 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776364419510409791, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776364419510409791, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/_CHECKPOINT_METADATA I0416 18:33:39.745847 128500742751808 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save. 
I0416 18:33:39.974287 128615849782400 metric_logger.py:185] completed step: 0, seconds: 3.198, TFLOP/s/device: 4.248, Tokens/s/device: 640.316, total_weights: 13328, loss: 10.828 I0416 18:33:39.975205 128615849782400 metric_logger.py:269] To see full metrics 'tensorboard --logdir=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/tensorboard/' I0416 18:33:40.090405 128615849782400 metric_logger.py:185] completed step: 1, seconds: 1.437, TFLOP/s/device: 9.457, Tokens/s/device: 1425.432, total_weights: 12332, loss: 10.828 I0416 18:33:40.207598 128615849782400 metric_logger.py:185] completed step: 2, seconds: 0.009, TFLOP/s/device: 1487.696, Tokens/s/device: 224241.761, total_weights: 15161, loss: 9.823 I0416 18:33:40.265828 3990246 google_auth_provider.cc:149] Using credentials at ~/.config/gcloud/application_default_credentials.json I0416 18:33:40.265892 3990246 google_auth_provider.cc:156] Using OAuth2 AuthProvider I0416 18:33:41.190558 128500711294528 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 39 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/items/array_metadatas/process_0 W0416 18:33:44.433996 3987254 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. 
I0416 18:33:44.515646 128615849782400 train.py:728] Completed eval step 0 I0416 18:33:44.557520 128615849782400 train.py:728] Completed eval step 1 I0416 18:33:44.561998 128615849782400 metric_logger.py:204] eval metrics after step: 4, loss=8.979, total_weights=26703.0 I0416 18:33:44.562373 128615849782400 metric_logger.py:185] completed step: 3, seconds: 0.117, TFLOP/s/device: 115.990, Tokens/s/device: 17483.204, total_weights: 13327, loss: 9.405 I0416 18:34:03.948472 128615849782400 metric_logger.py:185] completed step: 4, seconds: 0.117, TFLOP/s/device: 115.776, Tokens/s/device: 17451.026, total_weights: 11939, loss: 9.045 I0416 18:34:03.956564 128615849782400 metric_logger.py:185] completed step: 5, seconds: 4.354, TFLOP/s/device: 3.120, Tokens/s/device: 470.337, total_weights: 15502, loss: 8.953 I0416 18:34:04.073166 128615849782400 metric_logger.py:185] completed step: 6, seconds: 19.385, TFLOP/s/device: 0.701, Tokens/s/device: 105.648, total_weights: 13864, loss: 8.775 I0416 18:34:04.190637 128615849782400 metric_logger.py:185] completed step: 7, seconds: 0.006, TFLOP/s/device: 2353.157, Tokens/s/device: 354693.453, total_weights: 12988, loss: 8.688 I0416 18:34:04.845851 128615849782400 train.py:728] Completed eval step 0 I0416 18:34:04.886163 128615849782400 train.py:728] Completed eval step 1 I0416 18:34:04.888761 128615849782400 metric_logger.py:204] eval metrics after step: 9, loss=8.653, total_weights=26703.0 I0416 18:34:04.889026 128615849782400 metric_logger.py:185] completed step: 8, seconds: 0.118, TFLOP/s/device: 115.457, Tokens/s/device: 17402.979, total_weights: 13820, loss: 8.740 I0416 18:34:04.890788 128615849782400 checkpointing.py:794] Waiting for step 9 to finish before checkpoint... I0416 18:34:04.891373 128615849782400 checkpointing.py:798] Waited 0.0005877017974853516 seconds for step 9 to finish before starting checkpointing. 
I0416 18:34:04.891625 128615849782400 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete. I0416 18:34:15.798077 128500732266048 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 35.538332s (commit=35.061142s, array_metadata_write=0.477189s) I0416 18:34:15.799249 128500656768576 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 341.016 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 37 seconds) (per-host) I0416 18:34:15.799371 128500656768576 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 36.154636s. I0416 18:34:16.061167 128500656768576 checkpoint.py:228] Read Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776364419510409791, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} from gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/_CHECKPOINT_METADATA I0416 18:34:16.260198 128500656768576 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0. 
I0416 18:34:16.432683 128500677740096 checkpoint.py:247] Updated Metadata={'item_handlers': {'items': 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler'}, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776364419510409791, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/_CHECKPOINT_METADATA I0416 18:34:16.674590 128500656768576 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389). I0416 18:34:16.674768 128500656768576 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.567773s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/items I0416 18:34:16.675945 128500656768576 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/items I0416 18:34:16.911489 128500656768576 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0 I0416 18:34:17.589146 128500656768576 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0`. I0416 18:34:17.589756 128500656768576 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 38.933081s. 
directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0 I0416 18:34:17.589832 128500656768576 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 37.945101s. I0416 18:34:17.590008 128500742751808 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save. I0416 18:34:17.590126 128500742751808 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save. I0416 18:34:17.590183 128500742751808 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts... I0416 18:34:17.590228 128500742751808 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts. I0416 18:34:17.590350 128615849782400 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0. W0416 18:34:17.590474 128615849782400 checkpoint_manager.py:1441] Waiting for previous save to complete took 12.698858 seconds. If this number is high, consider checkpointing less frequently. I0416 18:34:17.591185 128615849782400 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 9 I0416 18:34:17.591479 128615849782400 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9. 
I0416 18:34:17.776051 128500742751808 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9
I0416 18:34:17.784487 128615849782400 jax_array_handlers.py:347] Scheduling D2H of 39 prioritized jax.Array.
I0416 18:34:17.784578 128615849782400 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0416 18:34:18.490861 128500700808768 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9/items
I0416 18:34:28.589692 128615849782400 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 10.805614s
I0416 18:34:28.590250 128615849782400 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 1.132 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 10 seconds) (per-host)
I0416 18:34:28.590305 128615849782400 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 10.899607s (batch_requests_ready=0.090601s, total_serialization_initiated=10.808572s, others=0.000434s)
I0416 18:34:28.590374 128615849782400 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 10.900103s (all_items=0.000017s, per_item={'items': '0.00001693'}, temp_paths=10.900086)
I0416 18:34:28.596877 128500742751808 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0416 18:34:28.596977 128615849782400 async_checkpointer.py:561] Finished blocking save. Time taken: 11.005740s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9.
I0416 18:34:28.597231 128615849782400 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0416 18:34:28.597381 128500656768576 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0416 18:34:28.597456 128615849782400 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776364444.8915818, 'wait_for_prev_duration_secs': 12.69885802268982, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776364457.5912116, 'checkpointer_blocking_duration_secs': 11.005919456481934, 'get_old_steps_start_time': 1776364468.5971525, 'get_old_steps_duration_secs': 3.147125244140625e-05, 'checkpoint_manager_blocking_start_time': 1776364444.891544, 'checkpoint_manager_blocking_duration_secs': 23.70588493347168}
I0416 18:34:28.597599 128615849782400 checkpointing.py:409] Started an asynchronous checkpoint save for step 9
I0416 18:34:28.597634 128615849782400 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0416 18:34:29.330972 128500732266048 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 39 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9/items/array_metadatas/process_0 I0416 18:35:07.643529 128498658182720 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 39.047475s (commit=38.631435s, array_metadata_write=0.416040s) I0416 18:35:07.644544 128500742751808 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 252.969 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 49 seconds) (per-host) I0416 18:35:07.644667 128500742751808 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 39.047636s. I0416 18:35:08.062304 128500742751808 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0. I0416 18:35:08.498570 128500742751808 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389). I0416 18:35:08.498739 128500742751808 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.577456s. 
use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9/items I0416 18:35:08.500063 128500742751808 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9/items I0416 18:35:08.736160 128500742751808 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9 I0416 18:35:09.421313 128500742751808 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9`. I0416 18:35:09.421983 128500742751808 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 51.830759s. directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_post_train_fixes_20260416_181521/linen_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9 I0416 18:35:09.422072 128500742751808 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 40.825042s. I0416 18:35:09.422184 128500656768576 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save. I0416 18:35:09.422233 128500656768576 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save. I0416 18:35:09.422280 128500656768576 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts... 
I0416 18:35:09.422322 128500656768576 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0416 18:35:09.422650 128615849782400 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0416 18:35:09.423079 128615849782400 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0416 18:35:09.423650 128615849782400 metric_logger.py:185] completed step: 9, seconds: 0.117, TFLOP/s/device: 115.919, Tokens/s/device: 17472.614, total_weights: 12300, loss: 8.710
Per train step: Total TFLOPs: 13.59 split as 93.93% learnable weight flops and 6.07% attention flops
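The throughput figures on the metric line above are mutually consistent. On 8 devices with a global batch of 8 and sequence length 2048 (the activation shapes `bfloat16[8,2048,...]` logged in the NNX run below), each device processes 2048 tokens per step, so Tokens/s/device is roughly `2048 / seconds` and TFLOP/s/device is roughly `13.59 / seconds`. A quick sanity check, where batch, sequence length, and device count are assumptions taken from elsewhere in the log rather than stated on this line:

```python
# Sanity-check the step-9 metric line: seconds: 0.117,
# TFLOP/s/device: 115.919, Tokens/s/device: 17472.614,
# with "Total TFLOPs: 13.59" per device per train step.
# Assumed from elsewhere in the log: global batch 8, seq len 2048, 8 devices.
GLOBAL_BATCH, SEQ_LEN, NUM_DEVICES = 8, 2048, 8
PER_DEVICE_TFLOPS = 13.59
step_seconds = 0.117

tokens_per_sec_per_device = GLOBAL_BATCH * SEQ_LEN / NUM_DEVICES / step_seconds
tflops_per_sec_per_device = PER_DEVICE_TFLOPS / step_seconds

print(round(tokens_per_sec_per_device, 1))  # ~17504, vs. logged 17472.614
print(round(tflops_per_sec_per_device, 1))  # ~116.2, vs. logged 115.919
```

The small residual gaps come from `seconds` and `Total TFLOPs` being rounded in the log output.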
2026-04-16 19:55:47.077370: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303) I0416 19:55:47.195406 139133427473536 max_utils.py:238] Skipping jax distributed system due to skip_jax_distributed_system=True flag. ~/maxtext_venv/lib/python3.12/site-packages/jax/_src/xla_bridge.py:219: UserWarning: TPU backend initialization is taking more than 60.0 seconds. Did you run your code on all TPU hosts? See https://docs.jax.dev/en/latest/multi_process.html for more information. warnings.warn( I0416 19:56:47.354380 139133427473536 max_utils.py:800] System Information: Jax Version: 0.9.2 I0416 19:56:47.354493 139133427473536 max_utils.py:801] System Information: Jaxlib Version: 0.9.2 I0416 19:56:47.354529 139133427473536 max_utils.py:802] System Information: Jax Backend: PJRT C API TFRT TPU v6 lite Built on Apr 6 2026 20:48:10 (1775533690) cl/895581894 I0416 19:56:47.354555 139133427473536 train_utils.py:364] WARNING: 'dataset_path' might be pointing your local file system I0416 19:56:47.354647 139133427473536 train.py:811] [DECOUPLED NO-OP] skipping cloud diagnostics wrapper. W0416 19:56:47.451508 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. I0416 19:56:47.849983 139133427473536 maxtext_utils.py:1687] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1) I0416 19:56:47.948780 139133427473536 checkpointing.py:688] Setting up checkpoint logger... 
I0416 19:56:47.948878 139133427473536 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True I0416 19:56:47.948944 139133427473536 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None I0416 19:56:47.949175 139133427473536 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7e89ef9d3290>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB) I0416 19:56:50.282847 139133427473536 checkpointing.py:266] Enabling policy for fixed interval checkpointing. I0416 19:56:50.283268 139133427473536 checkpoint_manager.py:702] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e836ff2f920>}, handler_registry=None I0416 19:56:50.283784 139133427473536 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e836ff2f920>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`. I0416 19:56:50.283839 139133427473536 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". 
Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7e8371388350>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`. I0416 19:56:50.283873 139133427473536 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e836ff2f920>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e836ff2f920>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7e8371388350>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7e8371388350>}). 
I0416 19:56:50.284574 139133427473536 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28 I0416 19:56:50.284651 139133427473536 async_checkpointer.py:177] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>.<lambda> at 0x7e8370f91800> timeout: 600 secs and primary_host=0 for async checkpoint writes I0416 19:56:50.397505 139133427473536 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints I0416 19:56:50.397779 139133427473536 checkpoint_manager.py:921] [process=0][thread=MainThread] CheckpointManager created, primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7e836ff6d730> 
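The `CheckpointManagerOptions` above include `save_decision_policy=FixedIntervalPolicy(interval=10)`, yet both runs save at steps 0 and 9. That is consistent with an every-N-steps rule plus a forced save on the final step of the 10-step run. A minimal sketch of that decision logic, assuming the final-step save is forced; the `should_save` helper below is illustrative, not Orbax's API:

```python
def should_save(step: int, interval: int, last_step: int) -> bool:
    """Hypothetical fixed-interval save rule that also forces a save at
    the end of the run. For a 10-step run with interval=10 this yields
    saves at steps 0 and 9, matching the saves seen in these logs."""
    return step % interval == 0 or step == last_step

saved = [s for s in range(10) if should_save(s, interval=10, last_step=9)]
print(saved)  # [0, 9]
```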
I0416 19:56:50.397879 139133427473536 checkpointing.py:302] Checkpoint manager created! I0416 19:56:50.481748 139133427473536 dataset_info.py:707] Load dataset info from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0 I0416 19:56:50.484956 139133427473536 reader.py:262] Creating a tf.data.Dataset reading 8 files located in folders: tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0. I0416 19:56:50.542630 139133427473536 logging_logger.py:49] Constructing tf.data.Dataset __local_c4_builder for split train, from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0 I0416 19:56:50.575898 139133427473536 tokenizer.py:245] Tokenizer path: src/maxtext/assets/tokenizers/tokenizer.llama2 I0416 19:56:50.575973 139133427473536 tokenizer.py:187] Loading sentencepiece tokenizer: src/maxtext/assets/tokenizers/tokenizer.llama2 I0416 19:56:51.555468 139133427473536 dataset_info.py:707] Load dataset info from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0 I0416 19:56:51.557025 139133427473536 reader.py:262] Creating a tf.data.Dataset reading 2 files located in folders: tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0. I0416 19:56:51.572969 139133427473536 logging_logger.py:49] Constructing tf.data.Dataset __local_c4_builder for split validation, from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0 I0416 19:56:51.575560 139133427473536 tokenizer.py:245] Tokenizer path: src/maxtext/assets/tokenizers/tokenizer.llama2 I0416 19:56:51.575608 139133427473536 tokenizer.py:187] Loading sentencepiece tokenizer: src/maxtext/assets/tokenizers/tokenizer.llama2 I0416 19:56:51.914500 139133427473536 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint I0416 19:56:51.914605 139133427473536 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint. W0416 19:56:52.025068 4069074 pjrt_executable.cc:642] Assume version compatibility. 
PjRt-IFRT does not track XLA executable versions. [DECOUPLED NO-OP] gcs_storage: using stubs. [DECOUPLED NO-OP] mldiagnostics: using stub. [DECOUPLED NO-OP] mldiagnostics: using stub. [DECOUPLED NO-OP] mldiagnostics: using stub. [DECOUPLED NO-OP] workload_monitor: using stub. [DECOUPLED NO-OP] vertex_tensorboard: using stub. fsdp: 8 I0416 19:56:52.120379 139133427473536 maxtext_utils.py:1805] decoder/decoder_norm/scale/value Shape: float32[2048] Physical: (None,) I0416 19:56:52.120477 139133427473536 maxtext_utils.py:1805] decoder/layers/mlp/wi_0/kernel/value Shape: float32[2048,16,7168] Physical: ('fsdp', None, None) I0416 19:56:52.120516 139133427473536 maxtext_utils.py:1805] decoder/layers/mlp/wi_1/kernel/value Shape: float32[2048,16,7168] Physical: ('fsdp', None, None) I0416 19:56:52.120548 139133427473536 maxtext_utils.py:1805] decoder/layers/mlp/wo/kernel/value Shape: float32[7168,16,2048] Physical: (None, None, 'fsdp') I0416 19:56:52.120574 139133427473536 maxtext_utils.py:1805] decoder/layers/post_self_attention_layer_norm/scale/value Shape: float32[2048,16] Physical: (None, None) I0416 19:56:52.120599 139133427473536 maxtext_utils.py:1805] decoder/layers/pre_self_attention_layer_norm/scale/value Shape: float32[2048,16] Physical: (None, None) I0416 19:56:52.120625 139133427473536 maxtext_utils.py:1805] decoder/layers/self_attention/key/kernel/value Shape: float32[2048,16,16,128] Physical: ('fsdp', None, None, None) I0416 19:56:52.120650 139133427473536 maxtext_utils.py:1805] decoder/layers/self_attention/out/kernel/value Shape: float32[16,16,128,2048] Physical: (None, None, None, 'fsdp') I0416 19:56:52.120674 139133427473536 maxtext_utils.py:1805] decoder/layers/self_attention/query/kernel/value Shape: float32[2048,16,16,128] Physical: ('fsdp', None, None, None) I0416 19:56:52.120705 139133427473536 maxtext_utils.py:1805] decoder/layers/self_attention/value/kernel/value Shape: float32[2048,16,16,128] Physical: ('fsdp', None, None, None) I0416 
19:56:52.120728 139133427473536 maxtext_utils.py:1805] decoder/logits_dense/kernel/value Shape: float32[2048,32000] Physical: ('fsdp', None) I0416 19:56:52.120751 139133427473536 maxtext_utils.py:1805] token_embedder/embedding/value Shape: float32[32000,2048] Physical: (None, 'fsdp') I0416 19:56:52.238851 139133427473536 nnx_decoders.py:465] nnx_decoders/carry Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_norm_length', 'activation_embed'). I0416 19:56:52.238931 139133427473536 nnx_decoders.py:465] nnx_decoders/carry Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None). I0416 19:56:52.243872 139133427473536 nnx_decoders.py:465] Unknown Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_norm_length', 'activation_embed'). I0416 19:56:52.243919 139133427473536 nnx_decoders.py:465] Unknown Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None). I0416 19:56:52.259095 139133427473536 attentions.py:1088] attentions/inputs_q Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed'). I0416 19:56:52.259144 139133427473536 attentions.py:1088] attentions/inputs_q Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None). I0416 19:56:52.273744 139133427473536 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed'). I0416 19:56:52.273792 139133427473536 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None). I0416 19:56:52.295888 139133427473536 attentions.py:1154] attentions/query Logical: bfloat16[8,2048,16,128]..................................... 
('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim'). I0416 19:56:52.295946 139133427473536 attentions.py:1154] attentions/query Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None). I0416 19:56:52.310658 139133427473536 attentions.py:1155] attentions/key Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim'). I0416 19:56:52.310712 139133427473536 attentions.py:1155] attentions/key Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None). I0416 19:56:52.325320 139133427473536 attentions.py:1156] attentions/value Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim'). I0416 19:56:52.325381 139133427473536 attentions.py:1156] attentions/value Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None). I0416 19:56:52.353111 139133427473536 attentions.py:1197] attentions/out Logical: bfloat16[8,2048,16,128]..................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv'). I0416 19:56:52.353173 139133427473536 attentions.py:1197] attentions/out Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None). I0416 19:56:52.371429 139133427473536 linears.py:525] linears/x Logical: bfloat16[8,2048,7168]....................................... ('activation_batch', 'activation_length', 'activation_mlp'). I0416 19:56:52.371481 139133427473536 linears.py:525] linears/x Physical: bfloat16[8,2048,7168]....................................... ('fsdp', None, None). W0416 19:56:52.728822 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. 
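The Logical/Physical pairs above follow a fixed set of logical-axis rules: the batch-like logical axes (`activation_batch`, `activation_kv_batch`) map to the physical `fsdp` mesh axis, while length, head, embed, and mlp axes stay unsharded (`None`). A small sketch of that translation, with the rules table inferred from these log lines; the helper is illustrative, not MaxText's implementation:

```python
# Logical-to-physical axis rules inferred from the sharding log lines above:
# batch-like logical axes shard over 'fsdp'; everything else is replicated.
RULES = {
    "activation_batch": "fsdp",
    "activation_kv_batch": "fsdp",
}

def logical_to_physical(logical_axes):
    """Translate logical axis names to physical mesh axes (None = replicated)."""
    return tuple(RULES.get(name) for name in logical_axes)

q_logical = ("activation_kv_batch", "activation_attn_length",
             "activation_kv_heads", "activation_kv_head_dim")
print(logical_to_physical(q_logical))  # ('fsdp', None, None, None)
```

This matches, e.g., `attentions/query` above: logical `('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim')` becoming physical `('fsdp', None, None, None)`.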
I0416 19:56:52.850400 139133427473536 max_utils.py:791] Total memory size: 3.6 GB, Output size: 1.5 GB, Temp size: 2.0 GB, Argument size: 1.5 GB, Host temp size: 0.0 GB.
I0416 19:56:52.850927 139133427473536 max_utils.py:194] tensorboardX not available; using no-op SummaryWriter.
I0416 19:56:52.852269 139133427473536 metric_logger.py:289] number parameters: 1.104 billion
W0416 19:56:55.715787 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 19:56:55.852177 139133427473536 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0416 19:56:55.971518 139133427473536 checkpointing.py:798] Waited 0.11933302879333496 seconds for step 0 to finish before starting checkpointing.
I0416 19:56:55.971958 139133427473536 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0416 19:56:55.972125 139133427473536 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 0
I0416 19:56:55.972492 139133427473536 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0.
I0416 19:56:56.061988 139133427473536 signaling_client.py:373] Using ThreadSafeKeyValueSignalingClient
I0416 19:56:56.166023 139018398336576 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0
I0416 19:56:56.166524 139133427473536 jax_array_handlers.py:347] Scheduling D2H of 69 prioritized jax.Array.
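The "1.104 billion" parameter count can be reproduced from the `float32[...]` parameter shapes in the `maxtext_utils.py:1805` lines above, assuming (as those shapes suggest) that the middle `16` in the layer kernels is the scanned num-layers dimension, so each logged kernel already covers all 16 layers:

```python
from math import prod

# Parameter shapes copied from the maxtext_utils.py:1805 log lines above.
# The 16 in the layer kernels is the scan (num_layers) dimension.
shapes = {
    "decoder_norm/scale": (2048,),
    "mlp/wi_0/kernel": (2048, 16, 7168),
    "mlp/wi_1/kernel": (2048, 16, 7168),
    "mlp/wo/kernel": (7168, 16, 2048),
    "post_self_attention_layer_norm/scale": (2048, 16),
    "pre_self_attention_layer_norm/scale": (2048, 16),
    "self_attention/key/kernel": (2048, 16, 16, 128),
    "self_attention/out/kernel": (16, 16, 128, 2048),
    "self_attention/query/kernel": (2048, 16, 16, 128),
    "self_attention/value/kernel": (2048, 16, 16, 128),
    "logits_dense/kernel": (2048, 32000),
    "token_embedder/embedding": (32000, 2048),
}
total = sum(prod(s) for s in shapes.values())
print(total, f"{total / 1e9:.3f} billion")  # 1104218112 -> "1.104 billion"
```

The per-device memstats lines below also check out against this: 1.59 GB / 31.25 GB per TPU is the logged 5.088%.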
I0416 19:56:56.166696 139133427473536 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False W0416 19:56:56.811835 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 19:56:56.822717 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. I0416 19:56:56.834664 139018387850816 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/items W0416 19:56:56.837001 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 19:56:56.851222 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 19:56:56.861062 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 19:56:56.866461 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 19:56:56.870724 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. W0416 19:56:56.874702 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions. 
I0416 19:56:57.012280 139018343810624 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776369416742399600, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776369416742399600, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/_CHECKPOINT_METADATA I0416 19:56:57.367612 4071958 google_auth_provider.cc:149] Using credentials at ~/.config/gcloud/application_default_credentials.json I0416 19:56:57.367666 4071958 google_auth_provider.cc:156] Using OAuth2 AuthProvider I0416 19:56:58.184375 139133427473536 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 2.019416s I0416 19:56:58.191298 139133427473536 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 5.799 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 2 seconds) (per-host) I0416 19:56:58.191361 139133427473536 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 2.128168s (batch_requests_ready=0.093814s, total_serialization_initiated=2.027525s, others=0.006828s) I0416 19:56:58.191440 139133427473536 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 2.128733s (all_items=0.000020s, per_item={'items': '0.00002003'}, temp_paths=2.128713) I0416 19:56:58.192708 139018408822336 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started. I0416 19:56:58.192787 139133427473536 async_checkpointer.py:561] Finished blocking save. Time taken: 2.220622s. 
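The `_CHECKPOINT_METADATA` write above logs both the Python dict and the JSON text it serializes to; the mapping is a plain `json.dumps`, with `None` rendered as `null`. A minimal reproduction of that serialization (the dict is copied verbatim from the log line; the GCS write itself is omitted):

```python
import json

# Metadata dict as logged by checkpoint.py:188 above.
metadata = {
    "item_handlers": None,
    "metrics": {},
    "performance_metrics": {},
    "init_timestamp_nsecs": 1776369416742399600,
    "commit_timestamp_nsecs": None,
    "custom_metadata": {},
}
text = json.dumps(metadata)
print(text)  # matches the json={...} half of the log line (None -> null)
```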
Continuing background save to gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0. I0416 19:56:58.192981 139133427473536 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize I0416 19:56:58.193176 139018280896064 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save. I0416 19:56:58.193278 139133427473536 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776369415.9719448, 'wait_for_prev_duration_secs': 4.076957702636719e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776369415.9721477, 'checkpointer_blocking_duration_secs': 2.2207627296447754, 'get_old_steps_start_time': 1776369418.1929288, 'get_old_steps_duration_secs': 1.9788742065429688e-05, 'checkpoint_manager_blocking_start_time': 1776369415.9718573, 'checkpoint_manager_blocking_duration_secs': 2.22139310836792} I0416 19:56:58.193391 139133427473536 checkpointing.py:409] Started an asynchronous checkpoint save for step 0 I0416 19:56:58.193445 139133427473536 max_utils.py:750] Memstats: After params initialized: I0416 19:56:58.193488 139133427473536 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_0(process=0,(0,0,0,0)) I0416 19:56:58.193511 139133427473536 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_1(process=0,(1,0,0,0)) I0416 19:56:58.193530 139133427473536 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_2(process=0,(0,1,0,0)) I0416 19:56:58.193549 139133427473536 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_3(process=0,(1,1,0,0)) 
I0416 19:56:58.193566 139133427473536 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_4(process=0,(0,2,0,0))
I0416 19:56:58.193583 139133427473536 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_5(process=0,(1,2,0,0))
I0416 19:56:58.193599 139133427473536 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_6(process=0,(0,3,0,0))
I0416 19:56:58.193615 139133427473536 max_utils.py:756] Using (GB) 1.59 / 31.25 (5.088000%) on TPU_7(process=0,(1,3,0,0))
W0416 19:56:58.199349 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0416 19:56:58.205010 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0416 19:56:58.209301 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 19:56:58.534966 139133427473536 metric_logger.py:185] completed step: 0, seconds: 2.999, TFLOP/s/device: 4.531, Tokens/s/device: 682.983, total_weights: 13328, loss: 10.880
I0416 19:56:58.535842 139133427473536 metric_logger.py:269] To see full metrics 'tensorboard --logdir=gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/tensorboard/'
I0416 19:56:58.647737 139133427473536 metric_logger.py:185] completed step: 1, seconds: 2.679, TFLOP/s/device: 5.071, Tokens/s/device: 764.417, total_weights: 12332, loss: 10.862
I0416 19:56:58.764749 139133427473536 metric_logger.py:185] completed step: 2, seconds: 0.020, TFLOP/s/device: 668.230, Tokens/s/device: 100722.963, total_weights: 15161, loss: 9.926
I0416 19:56:59.417756 139018366879296 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 69 array_metadata.ArrayMetadata to gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/items/array_metadatas/process_0
W0416 19:57:01.203667 4069074 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0416 19:57:01.297525 139133427473536 train.py:728] Completed eval step 0
I0416 19:57:01.341310 139133427473536 train.py:728] Completed eval step 1
I0416 19:57:01.346410 139133427473536 metric_logger.py:204] eval metrics after step: 4, loss=8.926, total_weights=26703.0
I0416 19:57:01.346784 139133427473536 metric_logger.py:185] completed step: 3, seconds: 0.109, TFLOP/s/device: 125.076, Tokens/s/device: 18852.814, total_weights: 13327, loss: 9.396
I0416 19:57:18.914357 139133427473536 metric_logger.py:185] completed step: 4, seconds: 0.114, TFLOP/s/device: 118.956, Tokens/s/device: 17930.310, total_weights: 11939, loss: 9.005
I0416 19:57:18.924470 139133427473536 metric_logger.py:185] completed step: 5, seconds: 2.583, TFLOP/s/device: 5.261, Tokens/s/device: 792.923, total_weights: 15502, loss: 8.861
I0416 19:57:19.040016 139133427473536 metric_logger.py:185] completed step: 6, seconds: 17.566, TFLOP/s/device: 0.773, Tokens/s/device: 116.587, total_weights: 13864, loss: 8.738
I0416 19:57:19.157607 139133427473536 metric_logger.py:185] completed step: 7, seconds: 0.008, TFLOP/s/device: 1669.180, Tokens/s/device: 251597.052, total_weights: 12988, loss: 8.651
I0416 19:57:19.821313 139133427473536 train.py:728] Completed eval step 0
I0416 19:57:19.859614 139133427473536 train.py:728] Completed eval step 1
I0416 19:57:19.863302 139133427473536 metric_logger.py:204] eval metrics after step: 9, loss=8.623, total_weights=26703.0
I0416 19:57:19.863664 139133427473536 metric_logger.py:185] completed step: 8, seconds: 0.115, TFLOP/s/device: 118.013, Tokens/s/device: 17788.278, total_weights: 13820, loss: 8.671
I0416 19:57:19.866727 139133427473536 checkpointing.py:794] Waiting for step 9 to finish before checkpoint...
I0416 19:57:19.867744 139133427473536 checkpointing.py:798] Waited 0.001016855239868164 seconds for step 9 to finish before starting checkpointing. I0416 19:57:19.867983 139133427473536 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete. I0416 19:57:36.162470 139018301867584 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 37.970455s (commit=37.488491s, array_metadata_write=0.481965s) I0416 19:57:36.163528 139018408822336 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 315.129 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 40 seconds) (per-host) I0416 19:57:36.163583 139018408822336 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 37.970738s. I0416 19:57:36.387968 139018408822336 checkpoint.py:228] Read Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776369416742399600, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} from gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/_CHECKPOINT_METADATA I0416 19:57:36.580711 139018408822336 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0. 
I0416 19:57:36.783366 139018343810624 checkpoint.py:247] Updated Metadata={'item_handlers': {'items': 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler'}, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776369416742399600, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} to gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/_CHECKPOINT_METADATA
I0416 19:57:36.982501 139018408822336 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0416 19:57:36.982684 139018408822336 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.555198s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/items
I0416 19:57:36.983322 139018408822336 atomicity.py:608] Finalizing gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0/items
I0416 19:57:37.220961 139018408822336 atomicity.py:608] Finalizing gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0
I0416 19:57:37.877753 139018408822336 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0`.
I0416 19:57:37.878403 139018408822336 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 41.906244s. directory=gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/0
I0416 19:57:37.878475 139018408822336 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 39.685630s.
I0416 19:57:37.878653 139018280896064 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0416 19:57:37.878765 139018280896064 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0416 19:57:37.878821 139018280896064 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0416 19:57:37.878867 139018280896064 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0416 19:57:37.878989 139133427473536 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0416 19:57:37.879122 139133427473536 checkpoint_manager.py:1441] Waiting for previous save to complete took 18.011137 seconds. If this number is high, consider checkpointing less frequently.
I0416 19:57:37.879912 139133427473536 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 9
I0416 19:57:37.880207 139133427473536 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9.
I0416 19:57:38.064038 139133427473536 jax_array_handlers.py:347] Scheduling D2H of 69 prioritized jax.Array.
I0416 19:57:38.064184 139018280896064 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9
I0416 19:57:38.064264 139133427473536 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0416 19:57:38.818607 139018270410304 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9/items
I0416 19:57:41.411375 139133427473536 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 3.348619s
I0416 19:57:41.417551 139133427473536 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 3.577 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 3 seconds) (per-host)
I0416 19:57:41.417633 139133427473536 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 3.449856s (batch_requests_ready=0.087799s, total_serialization_initiated=3.355963s, others=0.006094s)
I0416 19:57:41.417714 139133427473536 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 3.450388s (all_items=0.000020s, per_item={'items': '0.00002027'}, temp_paths=3.450368)
I0416 19:57:41.419142 139018322839104 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0416 19:57:41.419237 139133427473536 async_checkpointer.py:561] Finished blocking save. Time taken: 3.539281s. Continuing background save to gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9.
I0416 19:57:41.419429 139133427473536 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0416 19:57:41.419607 139018291381824 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0416 19:57:41.419705 139133427473536 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776369439.8679554, 'wait_for_prev_duration_secs': 18.011136770248413, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776369457.879936, 'checkpointer_blocking_duration_secs': 3.5394070148468018, 'get_old_steps_start_time': 1776369461.4193647, 'get_old_steps_duration_secs': 2.6464462280273438e-05, 'checkpoint_manager_blocking_start_time': 1776369439.8679204, 'checkpoint_manager_blocking_duration_secs': 21.551758527755737}
I0416 19:57:41.419827 139133427473536 checkpointing.py:409] Started an asynchronous checkpoint save for step 9
I0416 19:57:41.419861 139133427473536 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
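The `standard_logger.py` save event in this run makes the checkpoint cost breakdown explicit: of the roughly 21.55 s the manager blocked the main thread at step 9, almost all was spent waiting for the previous (step 0) save to finish rather than on the step 9 save itself, which is exactly what the "consider checkpointing less frequently" warning flags. A small sketch with the logged values:

```python
# Durations copied from the standard_logger save event for step 9.
event = {
    "wait_for_prev_duration_secs": 18.011136770248413,        # waiting on step-0 save
    "checkpointer_blocking_duration_secs": 3.5394070148468018,  # blocking D2H/serialize
    "checkpoint_manager_blocking_duration_secs": 21.551758527755737,
}

# The manager's total blocking time is approximately the sum of the
# wait-for-previous phase and the blocking save phase.
approx_total = (event["wait_for_prev_duration_secs"]
                + event["checkpointer_blocking_duration_secs"])
print(round(approx_total, 2))  # -> 21.55, matching the logged manager total
```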
I0416 19:57:42.193437 139018366879296 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 69 array_metadata.ArrayMetadata to gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9/items/array_metadatas/process_0
I0416 19:58:16.502358 139018280896064 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 35.084016s (commit=34.610010s, array_metadata_write=0.474006s)
I0416 19:58:16.503417 139018322839104 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 327.925 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 38 seconds) (per-host)
I0416 19:58:16.503472 139018322839104 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 35.084192s.
I0416 19:58:16.949188 139018322839104 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0416 19:58:17.365663 139018322839104 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0416 19:58:17.365828 139018322839104 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.561953s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9/items
I0416 19:58:17.366521 139018322839104 atomicity.py:608] Finalizing gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9/items
I0416 19:58:17.619911 139018322839104 atomicity.py:608] Finalizing gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9
I0416 19:58:18.327630 139018322839104 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9`.
I0416 19:58:18.328285 139018322839104 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 40.448336s. directory=gs://wanglance-maxtext/nnx_ckpt_feat_nnx_post_train_fixes_20260416_181521/nnx_feat_nnx_post_train_fixes_20260416_181521_07_eval/checkpoints/9
I0416 19:58:18.328359 139018322839104 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 36.909079s.
I0416 19:58:18.328528 139018291381824 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0416 19:58:18.328624 139018291381824 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0416 19:58:18.328685 139018291381824 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0416 19:58:18.328732 139018291381824 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0416 19:58:18.329053 139133427473536 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0416 19:58:18.329219 139133427473536 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0416 19:58:18.329799 139133427473536 metric_logger.py:185] completed step: 9, seconds: 0.118, TFLOP/s/device: 114.962, Tokens/s/device: 17328.324, total_weights: 12300, loss: 8.668
Per train step: Total TFLOPs: 13.59 split as 93.93% learnable weight flops and 6.07% attention flops
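The closing numbers are mutually consistent: dividing the 13.59 TFLOPs per train step (taken here as a per-device figure, which is an assumption that the arithmetic supports) by the 0.118 s step time reproduces the logged 114.962 TFLOP/s/device to within a fraction of a percent. A quick check, including the stated weight/attention flop split:

```python
# Values from the final "completed step: 9" record and the per-step summary.
tflops_per_step = 13.59   # total TFLOPs per train step (assumed per-device)
step_seconds = 0.118      # seconds for step 9

# Implied throughput vs. the logged 114.962 TFLOP/s/device.
implied = tflops_per_step / step_seconds
print(round(implied, 1))  # -> 115.2, within ~0.2% of the logged rate

# The stated split: 93.93% learnable-weight flops, 6.07% attention flops.
weight_flops = tflops_per_step * 0.9393
attn_flops = tflops_per_step * 0.0607
```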