MaxView

Case: 05_fp8

Metrics: main (7f7822822) vs feat/nnx-trainstate-and-training-loop (8bb919bd5)

Metric      | main (7f7822822) | feat/nnx-trainstate-and-training-loop (8bb919bd5) | Diff (branch − main)
Parameters  | 1.105 billion    | 1.105 billion                                     |
Final loss  | 10.8440          | 10.8510                                           | +0.007
TFLOP/s     | 0.828            | 123.159                                           | +122.331
Tok/s       | 124.8            | 18563.8                                           | +18439.046
Avg s/step  | 1.854            |                                                   |
Memory %    | 5.15             | 5.15                                              | 0
JAX         | 0.9.2            | 0.9.2                                             |

Diff = branch value − main value. Green = branch improved. Red = branch regressed.
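
In code form, the Diff column is just an element-wise subtraction of the two metric sets. A minimal sketch with the values copied from the table above (the dict layout and key names are purely illustrative):

    # Diff column = branch value - main value, per the legend above.
    main_metrics   = {"final_loss": 10.8440, "tflop_s": 0.828,   "tok_s": 124.8,   "memory_pct": 5.15}
    branch_metrics = {"final_loss": 10.8510, "tflop_s": 123.159, "tok_s": 18563.8, "memory_pct": 5.15}
    diff = {k: round(branch_metrics[k] - main_metrics[k], 3) for k in main_metrics}
    print(diff)  # {'final_loss': 0.007, 'tflop_s': 122.331, 'tok_s': 18439.0, 'memory_pct': 0.0}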

main  ·  7f7822822  ·  main_20260417_125240  ·  full log
2026-04-17 13:04:01.724324: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0417 13:04:01.844229 137949689728128 max_utils.py:238] Skipping jax distributed system due to skip_jax_distributed_system=True flag.
I0417 13:04:44.590944 137949689728128 max_utils.py:800] System Information: Jax Version: 0.9.2
I0417 13:04:44.591077 137949689728128 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0417 13:04:44.591114 137949689728128 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Apr 6 2026 20:48:10 (1775533690) cl/895581894
I0417 13:04:44.591145 137949689728128 train_utils.py:335] WARNING: 'dataset_path' might be pointing your local file system
I0417 13:04:44.591236 137949689728128 train.py:671] [DECOUPLED NO-OP] skipping cloud diagnostics wrapper.
W0417 13:04:44.685891  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0417 13:04:45.077914 137949689728128 maxtext_utils.py:1548] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0417 13:04:45.078303 137949689728128 checkpointing.py:677] Setting up checkpoint logger...
I0417 13:04:45.078351 137949689728128 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0417 13:04:45.078415 137949689728128 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0417 13:04:45.079079 137949689728128 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7d76535f8bc0>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0417 13:04:47.504977 137949689728128 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0417 13:04:47.505208 137949689728128 checkpoint_manager.py:702] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d6fd7bfd220>}, handler_registry=None
I0417 13:04:47.505518 137949689728128 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d6fd7bfd220>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0417 13:04:47.505561 137949689728128 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d6fd6f2e030>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0417 13:04:47.505591 137949689728128 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d6fd7bfd220>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d6fd7bfd220>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d6fd6f2e030>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d6fd6f2e030>}).
I0417 13:04:47.505875 137949689728128 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28
I0417 13:04:47.505936 137949689728128 async_checkpointer.py:177] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>.<lambda> at 0x7d6fd6f31e40> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0417 13:04:47.635559 137949689728128 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_05_fp8/checkpoints
I0417 13:04:47.635798 137949689728128 checkpoint_manager.py:921] [process=0][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_05_fp8/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7d6fd6f2df70>
I0417 13:04:47.635883 137949689728128 checkpointing.py:301] Checkpoint manager created!
I0417 13:04:48.087684 137949689728128 dataset_info.py:707] Load dataset info from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0
I0417 13:04:48.090544 137949689728128 reader.py:262] Creating a tf.data.Dataset reading 8 files located in folders: tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0.
I0417 13:04:48.143513 137949689728128 logging_logger.py:49] Constructing tf.data.Dataset __local_c4_builder for split train, from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0
I0417 13:04:48.166559 137949689728128 tokenizer.py:245] Tokenizer path: src/maxtext/assets/tokenizers/tokenizer.llama2
I0417 13:04:48.166619 137949689728128 tokenizer.py:187] Loading sentencepiece tokenizer: src/maxtext/assets/tokenizers/tokenizer.llama2
I0417 13:04:48.912539 137949689728128 nnx_wrappers.py:437] Unknown Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0417 13:04:48.912657 137949689728128 nnx_wrappers.py:437] Unknown Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0417 13:04:49.017779 137949689728128 attentions.py:1088] attentions/inputs_q Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0417 13:04:49.017857 137949689728128 attentions.py:1088] attentions/inputs_q Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0417 13:04:49.033075 137949689728128 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0417 13:04:49.033124 137949689728128 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0417 13:04:49.079120 137949689728128 attentions.py:1154] attentions/query Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0417 13:04:49.079186 137949689728128 attentions.py:1154] attentions/query Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0417 13:04:49.094353 137949689728128 attentions.py:1155] attentions/key Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0417 13:04:49.094403 137949689728128 attentions.py:1155] attentions/key Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0417 13:04:49.109549 137949689728128 attentions.py:1156] attentions/value Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0417 13:04:49.109599 137949689728128 attentions.py:1156] attentions/value Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0417 13:04:49.132033 137949689728128 attentions.py:1197] attentions/out Logical: bfloat16[8,2048,16,128]..................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0417 13:04:49.132096 137949689728128 attentions.py:1197] attentions/out Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0417 13:04:49.177238 137949689728128 linears.py:525] linears/x Logical: bfloat16[8,2048,7168]....................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0417 13:04:49.177301 137949689728128 linears.py:525] linears/x Physical: bfloat16[8,2048,7168]....................................... ('fsdp', None, None).
I0417 13:04:49.578009 137949689728128 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0417 13:04:49.578174 137949689728128 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
W0417 13:04:49.644319  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
[DECOUPLED NO-OP] gcs_storage: using stubs.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] workload_monitor: using stub.
[DECOUPLED NO-OP] vertex_tensorboard: using stub.
fsdp: 8
I0417 13:04:50.773173 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773273 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773311 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773340 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773365 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773390 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773413 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773435 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773456 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773477 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773500 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773522 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773542 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773563 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773585 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773607 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773626 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773646 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773666 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773686 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773705 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773726 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773747 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773767 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773788 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773808 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773827 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773847 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773866 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773890 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773912 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773931 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773951 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773970 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.773990 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.774010 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.774030 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.774062 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.774082 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.774101 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.774121 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.774140 137949689728128 maxtext_utils.py:1651]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0417 13:04:50.774180 137949689728128 maxtext_utils.py:1651]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0417 13:04:50.774220 137949689728128 maxtext_utils.py:1651]  params/params/decoder/layers/mlp/wi_0/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0417 13:04:50.774247 137949689728128 maxtext_utils.py:1651]  params/params/decoder/layers/mlp/wi_1/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0417 13:04:50.774280 137949689728128 maxtext_utils.py:1651]  params/params/decoder/layers/mlp/wo/kernel
    Shape:     float32[7168,16,2048]
    Logical:   P('mlp', 'layers', 'embed')
    Physical:  (None, None, 'fsdp')
I0417 13:04:50.774312 137949689728128 maxtext_utils.py:1651]  params/params/decoder/layers/post_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0417 13:04:50.774335 137949689728128 maxtext_utils.py:1651]  params/params/decoder/layers/pre_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0417 13:04:50.774370 137949689728128 maxtext_utils.py:1651]  params/params/decoder/layers/self_attention/key/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0417 13:04:50.774405 137949689728128 maxtext_utils.py:1651]  params/params/decoder/layers/self_attention/out/kernel
    Shape:     float32[16,16,128,2048]
    Logical:   P('heads', 'layers', 'kv', 'embed')
    Physical:  (None, None, None, 'fsdp')
I0417 13:04:50.774430 137949689728128 maxtext_utils.py:1651]  params/params/decoder/layers/self_attention/query/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None, None)
I0417 13:04:50.774453 137949689728128 maxtext_utils.py:1651]  params/params/decoder/layers/self_attention/value/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0417 13:04:50.774486 137949689728128 maxtext_utils.py:1651]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0417 13:04:50.774518 137949689728128 maxtext_utils.py:1651]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')

I0417 13:04:51.772411 137949689728128 train.py:155] train/xent Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0417 13:04:51.772491 137949689728128 train.py:155] train/xent Physical: float32[8,2048]............................................. ('fsdp', None).
I0417 13:04:51.786832 137949689728128 train.py:162] train/z_loss Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0417 13:04:51.786886 137949689728128 train.py:162] train/z_loss Physical: float32[8,2048]............................................. ('fsdp', None).
I0417 13:05:03.954467 137949689728128 max_utils.py:791] Total memory size: 3.5 GB, Output size: 1.5 GB, Temp size: 1.9 GB, Argument size: 1.5 GB, Host temp size: 0.0 GB.
I0417 13:05:03.955191 137949689728128 max_utils.py:194] tensorboardX not available; using no-op SummaryWriter.
I0417 13:05:03.955764 137949689728128 metric_logger.py:301] number parameters: 1.105 billion
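
As a rough cross-check of that count, summing the kernel shapes listed above reproduces it. A minimal sketch; whether the fp8 amax/scale statistics under _overwrite_with_gradient are included in the reported total is an assumption (including them gives ~1.105B, excluding them ~1.104B):

    # Parameter count reconstructed from the logged kernel shapes (16 scanned layers).
    embed  = 32000 * 2048                      # token_embedder/embedding
    logits = 2048 * 32000                      # decoder/logits_dense/kernel
    mlp    = 3 * (2048 * 16 * 7168)            # wi_0, wi_1, wo across the 16 layers
    attn   = 4 * (2048 * 16 * 16 * 128)        # query, key, value, out across the 16 layers
    norms  = 2 * (2048 * 16) + 2048            # pre/post self-attention layer norms + decoder_norm
    fp8_stats = 7 * (3 * 16 * 1024 + 3 * 16)   # amax histories and scales for the 7 fp8 dot ops
    total = embed + logits + mlp + attn + norms + fp8_stats
    print(total / 1e9)  # ~1.1046, i.e. the "1.105 billion" reported above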
I0417 13:05:20.367801 137949689728128 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0417 13:05:20.506394 137949689728128 checkpointing.py:776] Waited 0.13858437538146973 seconds for step 0 to finish before starting checkpointing.
I0417 13:05:20.506918 137949689728128 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0417 13:05:20.507102 137949689728128 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 0
I0417 13:05:20.507460 137949689728128 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_05_fp8/checkpoints/0.
I0417 13:05:20.595613 137949689728128 signaling_client.py:373] Using ThreadSafeKeyValueSignalingClient
I0417 13:05:20.699116 137949689728128 jax_array_handlers.py:347] Scheduling D2H of 81 prioritized jax.Array.
I0417 13:05:20.699252 137834669278784 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_05_fp8/checkpoints/0
I0417 13:05:20.699634 137949689728128 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0417 13:05:21.403807 137834658793024 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_05_fp8/checkpoints/0/items
I0417 13:05:21.997745 137834618947136 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776431121312285375, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776431121312285375, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_05_fp8/checkpoints/0/_CHECKPOINT_METADATA
W0417 13:05:22.001000  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0417 13:05:22.021917  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0417 13:05:22.029246  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0417 13:05:22.042993  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0417 13:05:22.050273  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0417 13:05:22.055378  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0417 13:05:22.060374  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0417 13:05:22.065256  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0417 13:05:22.461725  428716 google_auth_provider.cc:149] Using credentials at ~/.config/gcloud/application_default_credentials.json
I0417 13:05:22.461779  428716 google_auth_provider.cc:156] Using OAuth2 AuthProvider
I0417 13:05:22.745181 137949689728128 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 2.046805s
I0417 13:05:22.752203 137949689728128 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 5.725 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 2 seconds) (per-host)
I0417 13:05:22.752270 137949689728128 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 2.155728s (batch_requests_ready=0.094702s, total_serialization_initiated=2.054100s, others=0.006927s)
I0417 13:05:22.752359 137949689728128 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 2.156271s (all_items=0.000019s, per_item={'items': '0.00001931'}, temp_paths=2.156252)
I0417 13:05:22.753630 137834545546816 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0417 13:05:22.753715 137949689728128 async_checkpointer.py:561] Finished blocking save. Time taken: 2.246571s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_05_fp8/checkpoints/0.
I0417 13:05:22.753892 137949689728128 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0417 13:05:22.754059 137834524575296 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0417 13:05:22.754137 137949689728128 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776431120.5069034, 'wait_for_prev_duration_secs': 4.3392181396484375e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776431120.507125, 'checkpointer_blocking_duration_secs': 2.2466959953308105, 'get_old_steps_start_time': 1776431122.7538388, 'get_old_steps_duration_secs': 2.0503997802734375e-05, 'checkpoint_manager_blocking_start_time': 1776431120.506816, 'checkpoint_manager_blocking_duration_secs': 2.2472939491271973}
I0417 13:05:22.754229 137949689728128 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0417 13:05:22.754267 137949689728128 max_utils.py:750] 
Memstats: After params initialized:
I0417 13:05:22.754308 137949689728128 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_0(process=0,(0,0,0,0))
I0417 13:05:22.754332 137949689728128 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_1(process=0,(1,0,0,0))
I0417 13:05:22.754352 137949689728128 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_2(process=0,(0,1,0,0))
I0417 13:05:22.754371 137949689728128 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_3(process=0,(1,1,0,0))
I0417 13:05:22.754390 137949689728128 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_4(process=0,(0,2,0,0))
I0417 13:05:22.754407 137949689728128 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_5(process=0,(1,2,0,0))
I0417 13:05:22.754425 137949689728128 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_6(process=0,(0,3,0,0))
I0417 13:05:22.754442 137949689728128 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_7(process=0,(1,3,0,0))
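
The Memory % row in the table above follows directly from these memstats; a trivial check using the per-core figures just logged:

    # 1.61 GB in use out of 31.25 GB per TPU core, as reported in the memstats above.
    used_gb, total_gb = 1.61, 31.25
    print(round(100 * used_gb / total_gb, 3))  # 5.152, matching the "5.152000%" per-core figure and the table's Memory % of 5.15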
W0417 13:05:22.760585  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0417 13:05:22.766670  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0417 13:05:22.772020  425448 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0417 13:05:23.108998 137949689728128 metric_logger.py:196] completed step: 0, seconds: 16.411, TFLOP/s/device: 0.828, Tokens/s/device: 124.794, total_weights: 13328, loss: 10.844, lm_loss: 10.844, perplexity: 51212.484
I0417 13:05:23.109886 137949689728128 metric_logger.py:281] To see full metrics 'tensorboard --logdir=gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_05_fp8/tensorboard/'
I0417 13:05:23.215590 137949689728128 metric_logger.py:196] completed step: 1, seconds: 2.738, TFLOP/s/device: 4.963, Tokens/s/device: 748.119, total_weights: 12332, loss: nan, lm_loss: nan, perplexity: nan
I0417 13:05:23.216675 137949689728128 metric_logger.py:237] Aborting training due to NaN loss.
I0417 13:05:23.216784 137949689728128 metric_logger.py:196] completed step: 1, seconds: 2.738, TFLOP/s/device: 4.963, Tokens/s/device: 748.119, total_weights: 12332, loss: nan, lm_loss: nan, perplexity: nan
I0417 13:05:23.216897 137949689728128 metric_logger.py:237] Aborting training due to NaN loss.
E0417 13:05:23.401748 137834577004096 future.py:315] [process=0][thread=array_type_handler][operation_id=1] _SignalingThread.run() raised an exception: cannot schedule new futures after shutdown
Traceback (most recent call last):
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 312, in run
    super().run()
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/threading.py", line 1012, in run
    self._target(*self._args, **self._kwargs)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 257, in _target_setting_result
    self._result = target()
                   ^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 391, in <lambda>
    target=lambda: asyncio_utils.run_sync(coro),
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/asyncio_utils.py", line 36, in run_sync
    return asyncio.run(coro)
           ^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 195, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/base_events.py", line 691, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/serialization/jax_array_handlers.py", line 363, in _serialize_without_dispatcher
    await async_serialize_replica_slices_batch(
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/serialization/jax_array_handlers.py", line 608, in _async_serialize_replica_slices
    await asyncio.gather(*write_coros)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/metadata/array_metadata_store.py", line 195, in write
    if await async_path.exists(file_path):
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/path/async_path.py", line 67, in exists
    return await asyncio.to_thread(path.exists)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/threads.py", line 25, in to_thread
    return await loop.run_in_executor(None, func_call)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/base_events.py", line 867, in run_in_executor
    executor.submit(func, *args), loop=self)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/concurrent/futures/thread.py", line 171, in submit
    raise RuntimeError('cannot schedule new futures after shutdown')
RuntimeError: cannot schedule new futures after shutdown
E0417 13:05:23.404865 137834566518336 future.py:315] [process=0][thread=write_metadata_after_commits][operation_id=1] _SignalingThread.run() raised an exception: cannot schedule new futures after shutdown
Traceback (most recent call last):
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 312, in run
    super().run()
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/threading.py", line 1012, in run
    self._target(*self._args, **self._kwargs)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 257, in _target_setting_result
    self._result = target()
                   ^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 391, in <lambda>
    target=lambda: asyncio_utils.run_sync(coro),
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/asyncio_utils.py", line 36, in run_sync
    return asyncio.run(coro)
           ^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 195, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/base_events.py", line 691, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/handlers/base_pytree_checkpoint_handler.py", line 1192, in _write_metadata_after_commits
    await asyncio.to_thread(commit_future.result)
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/threads.py", line 25, in to_thread
    return await loop.run_in_executor(None, func_call)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/concurrent/futures/thread.py", line 59, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 449, in result
    return self._f.result(timeout=timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 398, in result
    return self._t.result(timeout=timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 347, in result
    self.join(timeout=timeout)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 344, in join
    raise self._exception
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 312, in run
    super().run()
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/threading.py", line 1012, in run
    self._target(*self._args, **self._kwargs)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 257, in _target_setting_result
    self._result = target()
                   ^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 391, in <lambda>
    target=lambda: asyncio_utils.run_sync(coro),
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/asyncio_utils.py", line 36, in run_sync
    return asyncio.run(coro)
           ^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 195, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/base_events.py", line 691, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/serialization/jax_array_handlers.py", line 363, in _serialize_without_dispatcher
    await async_serialize_replica_slices_batch(
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/serialization/jax_array_handlers.py", line 608, in _async_serialize_replica_slices
    await asyncio.gather(*write_coros)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/metadata/array_metadata_store.py", line 195, in write
    if await async_path.exists(file_path):
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/path/async_path.py", line 67, in exists
    return await asyncio.to_thread(path.exists)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/threads.py", line 25, in to_thread
    return await loop.run_in_executor(None, func_call)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/base_events.py", line 867, in run_in_executor
    executor.submit(func, *args), loop=self)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/concurrent/futures/thread.py", line 171, in submit
    raise RuntimeError('cannot schedule new futures after shutdown')
RuntimeError: cannot schedule new futures after shutdown
E0417 13:05:23.406694 137834545546816 async_checkpointer.py:229] [process=0] Failed to run 3 Handler Commit operations or the Commit callback in background save thread, directory: gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_05_fp8/checkpoints/0
Traceback (most recent call last):
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/checkpointers/async_checkpointer.py", line 215, in _thread_func
    _background_wait_for_commit_futures(
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/checkpointers/async_checkpointer.py", line 88, in _background_wait_for_commit_futures
    commit_future.result()
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 196, in result
    f.result(timeout=time_remaining)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 449, in result
    return self._f.result(timeout=timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 398, in result
    return self._t.result(timeout=timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 347, in result
    self.join(timeout=timeout)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 344, in join
    raise self._exception
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 312, in run
    super().run()
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/threading.py", line 1012, in run
    self._target(*self._args, **self._kwargs)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 257, in _target_setting_result
    self._result = target()
                   ^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 391, in <lambda>
    target=lambda: asyncio_utils.run_sync(coro),
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/asyncio_utils.py", line 36, in run_sync
    return asyncio.run(coro)
           ^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 195, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/base_events.py", line 691, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/handlers/base_pytree_checkpoint_handler.py", line 1192, in _write_metadata_after_commits
    await asyncio.to_thread(commit_future.result)
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/threads.py", line 25, in to_thread
    return await loop.run_in_executor(None, func_call)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/concurrent/futures/thread.py", line 59, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 449, in result
    return self._f.result(timeout=timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 398, in result
    return self._t.result(timeout=timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 347, in result
    self.join(timeout=timeout)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 344, in join
    raise self._exception
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 312, in run
    super().run()
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/threading.py", line 1012, in run
    self._target(*self._args, **self._kwargs)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 257, in _target_setting_result
    self._result = target()
                   ^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/futures/future.py", line 391, in <lambda>
    target=lambda: asyncio_utils.run_sync(coro),
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/asyncio_utils.py", line 36, in run_sync
    return asyncio.run(coro)
           ^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 195, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/base_events.py", line 691, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/serialization/jax_array_handlers.py", line 363, in _serialize_without_dispatcher
    await async_serialize_replica_slices_batch(
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/serialization/jax_array_handlers.py", line 608, in _async_serialize_replica_slices
    await asyncio.gather(*write_coros)
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/metadata/array_metadata_store.py", line 195, in write
    if await async_path.exists(file_path):
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/maxtext_venv/lib/python3.12/site-packages/orbax/checkpoint/_src/path/async_path.py", line 67, in exists
    return await asyncio.to_thread(path.exists)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/threads.py", line 25, in to_thread
    return await loop.run_in_executor(None, func_call)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/asyncio/base_events.py", line 867, in run_in_executor
    executor.submit(func, *args), loop=self)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/.local/share/uv/python/cpython-3.12.12-linux-x86_64-gnu/lib/python3.12/concurrent/futures/thread.py", line 171, in submit
    raise RuntimeError('cannot schedule new futures after shutdown')
RuntimeError: cannot schedule new futures after shutdown
I0417 13:05:23.408425 137834524575296 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0417 13:05:23.408502 137834524575296 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0417 13:05:23.408585 137834524575296 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
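
For reference, the step-0 throughput logged above (seconds: 16.411, TFLOP/s/device: 0.828, Tokens/s/device: 124.794) is consistent with this per-step FLOP count. A small sketch; treating the 13.59 TFLOPs as a per-device figure and assuming 2048 tokens per device per step (sequence length 2048, per-device batch 1 under fsdp=8) are inferences from the log, not values it states directly:

    # Cross-check of the step-0 throughput numbers against the per-step FLOP count above.
    tflops_per_device_per_step = 13.59   # "Per train step: Total TFLOPs: 13.59" (taken here as per device)
    tokens_per_device_per_step = 2048    # seq_len 2048 * per-device batch 1 (assumed)
    step_seconds = 16.411                # "completed step: 0, seconds: 16.411"

    print(round(tflops_per_device_per_step / step_seconds, 3))  # 0.828   -> matches TFLOP/s/device
    print(round(tokens_per_device_per_step / step_seconds, 3))  # 124.794 -> matches Tokens/s/device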

feat/nnx-trainstate-and-training-loop  ·  8bb919bd5  ·  feat_nnx_trainstate_and_training_loop_20260420_142345  ·  full log
2026-04-20 14:34:27.924995: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0420 14:34:28.047421 135337422638208 max_utils.py:238] Skipping jax distributed system due to skip_jax_distributed_system=True flag.
I0420 14:34:52.405161 135337422638208 max_utils.py:800] System Information: Jax Version: 0.9.2
I0420 14:34:52.405278 135337422638208 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0420 14:34:52.405313 135337422638208 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Apr 6 2026 20:48:10 (1775533690) cl/895581894
I0420 14:34:52.405339 135337422638208 train_utils.py:365] WARNING: 'dataset_path' might be pointing your local file system
I0420 14:34:52.405441 135337422638208 train.py:806] [DECOUPLED NO-OP] skipping cloud diagnostics wrapper.
I0420 14:34:52.920324 135337422638208 maxtext_utils.py:1718] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0420 14:34:52.920500 135337422638208 maxtext_utils.py:1718] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0420 14:34:52.920692 135337422638208 checkpointing.py:688] Setting up checkpoint logger...
I0420 14:34:52.920730 135337422638208 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0420 14:34:52.920769 135337422638208 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0420 14:34:52.921371 135337422638208 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7b161c1f8c80>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0420 14:34:55.275435 135337422638208 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0420 14:34:55.275734 135337422638208 checkpoint_manager.py:702] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7b0f9f7600e0>}, handler_registry=None
I0420 14:34:55.276077 135337422638208 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7b0f9f7600e0>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0420 14:34:55.276123 135337422638208 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7b0f9fb0a330>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0420 14:34:55.276153 135337422638208 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7b0f9f7600e0>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7b0f9f7600e0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7b0f9fb0a330>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7b0f9fb0a330>}).
I0420 14:34:55.276516 135337422638208 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28
I0420 14:34:55.276582 135337422638208 async_checkpointer.py:177] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>.<lambda> at 0x7b0f9f74efc0> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0420 14:34:55.403360 135337422638208 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints
I0420 14:34:55.403616 135337422638208 checkpoint_manager.py:921] [process=0][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7b0f9f737710>
I0420 14:34:55.403705 135337422638208 checkpointing.py:302] Checkpoint manager created!
I0420 14:34:55.858330 135337422638208 dataset_info.py:707] Load dataset info from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0
I0420 14:34:55.861494 135337422638208 reader.py:262] Creating a tf.data.Dataset reading 8 files located in folders: tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0.
I0420 14:34:55.914095 135337422638208 logging_logger.py:49] Constructing tf.data.Dataset __local_c4_builder for split train, from tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0
I0420 14:34:55.938750 135337422638208 tokenizer.py:245] Tokenizer path: src/maxtext/assets/tokenizers/tokenizer.llama2
I0420 14:34:55.938817 135337422638208 tokenizer.py:187] Loading sentencepiece tokenizer: src/maxtext/assets/tokenizers/tokenizer.llama2
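For reference, the two loads above (a minimal local C4 copy via TFDS and the Llama-2 SentencePiece model) can be reproduced with the public tensorflow_datasets and sentencepiece APIs; this is a sketch, not MaxText's own input-pipeline code.

```python
import tensorflow_datasets as tfds
import sentencepiece as spm

# Paths as logged above, relative to the repository checkout.
data_dir = "tests/assets/local_datasets/c4_en_dataset_minimal/c4/en/3.1.0"
tok_path = "src/maxtext/assets/tokenizers/tokenizer.llama2"

builder = tfds.builder_from_directory(data_dir)    # reads the dataset_info.json already in place
ds = builder.as_dataset(split="train")             # tf.data.Dataset over the 8 local shards

sp = spm.SentencePieceProcessor(model_file=tok_path)
ids = sp.encode("hello world", out_type=int)       # token ids later packed into 2048-token sequences
```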
I0420 14:34:56.696400 135337422638208 nnx_wrappers.py:437] Unknown Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0420 14:34:56.696597 135337422638208 nnx_wrappers.py:437] Unknown Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0420 14:34:56.805407 135337422638208 attentions.py:1088] attentions/inputs_q Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0420 14:34:56.805492 135337422638208 attentions.py:1088] attentions/inputs_q Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0420 14:34:56.820486 135337422638208 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0420 14:34:56.820538 135337422638208 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0420 14:34:56.866960 135337422638208 attentions.py:1154] attentions/query Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0420 14:34:56.867039 135337422638208 attentions.py:1154] attentions/query Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0420 14:34:56.882139 135337422638208 attentions.py:1155] attentions/key Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0420 14:34:56.882191 135337422638208 attentions.py:1155] attentions/key Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0420 14:34:56.897175 135337422638208 attentions.py:1156] attentions/value Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0420 14:34:56.897229 135337422638208 attentions.py:1156] attentions/value Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0420 14:34:56.920273 135337422638208 attentions.py:1197] attentions/out Logical: bfloat16[8,2048,16,128]..................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0420 14:34:56.920344 135337422638208 attentions.py:1197] attentions/out Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0420 14:34:56.989315 135337422638208 linears.py:525] linears/x Logical: bfloat16[8,2048,7168]....................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0420 14:34:56.989479 135337422638208 linears.py:525] linears/x Physical: bfloat16[8,2048,7168]....................................... ('fsdp', None, None).
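Each Logical/Physical pair above records a sharding constraint on an activation: a tuple of logical axis names that a rule set maps onto mesh axes, and since the logged mesh only has a non-trivial 'fsdp' axis of size 8, only the batch dimension ends up split. A hedged sketch of that translation using Flax's logical-sharding helpers; the rule list here is illustrative, not MaxText's full logical_axis_rules config.

```python
import os
os.environ["XLA_FLAGS"] = "--xla_force_host_platform_device_count=8"  # emulate 8 devices on CPU

import jax
import numpy as np
import flax.linen as nn
from jax.sharding import Mesh, PartitionSpec as P

mesh = Mesh(np.asarray(jax.devices()).reshape(8), ("fsdp",))  # only axis with size > 1, as logged

# Illustrative rules: batch-like logical axes shard over 'fsdp', the rest stay replicated.
rules = (("activation_batch", "fsdp"),
         ("activation_norm_length", None),
         ("activation_embed", None))

logical = {"inputs": P("activation_batch", "activation_norm_length", "activation_embed")}
print(nn.logical_to_mesh_sharding(logical, mesh, rules))
# partitions only the batch dimension over the 8-way 'fsdp' axis, i.e. ('fsdp', None, None)
```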
I0420 14:34:57.378207 135337422638208 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0420 14:34:57.378325 135337422638208 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
[DECOUPLED NO-OP] gcs_storage: using stubs.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] workload_monitor: using stub.
[DECOUPLED NO-OP] vertex_tensorboard: using stub.
fsdp: 8
I0420 14:34:59.665766 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.665878 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.665915 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.665945 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.665970 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.665994 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666017 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666039 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666153 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666179 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666201 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666223 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666244 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666266 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666286 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666307 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666327 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666346 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666367 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666386 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666405 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666424 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666444 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666463 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666482 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666501 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666520 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666539 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666558 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666577 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666597 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666617 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666636 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666655 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666674 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666694 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666713 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666732 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666751 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666770 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666789 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0420 14:34:59.666808 135337422638208 maxtext_utils.py:1821]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
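The `_overwrite_with_gradient` subtree above is the bookkeeping state for FP8 delayed scaling: each of the seven quantized matmuls (wi_0, wi_1, wo, query, key, value, out) carries an amax history of length 1024 and a current scale for the input, the kernel, and the output gradient, stacked over the 16 scanned layers (hence float32[16,1024] and float32[16,1]), and kept apart from the trainable weights. Below is a hedged per-tensor sketch of what those buffers implement, assuming E4M3 for the forward tensors; this is the generic delayed-scaling recipe, not the exact Flax Fp8DirectDotGeneralOp code.

```python
import jax.numpy as jnp

E4M3_MAX = 448.0  # max finite value of float8_e4m3fn (assumed forward format)

def delayed_scaling_step(x, amax_history, scale):
    """Quantize x with a scale derived from the rolling amax history, then
    record this step's amax into the history (a 1024-entry ring buffer above)."""
    amax = jnp.max(amax_history)
    new_scale = jnp.where(amax > 0, E4M3_MAX / amax, scale)   # map historical amax onto the FP8 max
    x_q = jnp.clip(x * new_scale, -E4M3_MAX, E4M3_MAX).astype(jnp.float8_e4m3fn)
    new_history = jnp.roll(amax_history, 1).at[0].set(jnp.max(jnp.abs(x)))
    return x_q, new_history, new_scale

x_q, hist, scale = delayed_scaling_step(
    jnp.ones((8, 2048)), jnp.zeros(1024), jnp.ones(()))
```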
I0420 14:34:59.666852 135337422638208 maxtext_utils.py:1821]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0420 14:34:59.666895 135337422638208 maxtext_utils.py:1821]  params/params/decoder/layers/mlp/wi_0/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0420 14:34:59.666923 135337422638208 maxtext_utils.py:1821]  params/params/decoder/layers/mlp/wi_1/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0420 14:34:59.666957 135337422638208 maxtext_utils.py:1821]  params/params/decoder/layers/mlp/wo/kernel
    Shape:     float32[7168,16,2048]
    Logical:   P('mlp', 'layers', 'embed')
    Physical:  (None, None, 'fsdp')
I0420 14:34:59.666990 135337422638208 maxtext_utils.py:1821]  params/params/decoder/layers/post_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0420 14:34:59.667013 135337422638208 maxtext_utils.py:1821]  params/params/decoder/layers/pre_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0420 14:34:59.667063 135337422638208 maxtext_utils.py:1821]  params/params/decoder/layers/self_attention/key/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0420 14:34:59.667102 135337422638208 maxtext_utils.py:1821]  params/params/decoder/layers/self_attention/out/kernel
    Shape:     float32[16,16,128,2048]
    Logical:   P('heads', 'layers', 'kv', 'embed')
    Physical:  (None, None, None, 'fsdp')
I0420 14:34:59.667128 135337422638208 maxtext_utils.py:1821]  params/params/decoder/layers/self_attention/query/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None, None)
I0420 14:34:59.667152 135337422638208 maxtext_utils.py:1821]  params/params/decoder/layers/self_attention/value/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0420 14:34:59.667185 135337422638208 maxtext_utils.py:1821]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0420 14:34:59.667216 135337422638208 maxtext_utils.py:1821]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
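The kernel shapes listed above are enough to reproduce the parameter count reported below; a rough tally with 16 scanned layers, d_model 2048, MLP width 7168, 16 heads of 128, and a 32000-token vocabulary:

```python
d_model, n_layers, d_mlp, n_heads, d_head, vocab = 2048, 16, 7168, 16, 128, 32000

per_layer = (3 * d_model * d_mlp                 # wi_0, wi_1, wo
             + 4 * d_model * n_heads * d_head    # query, key, value, out projections
             + 2 * d_model)                      # pre/post self-attention norm scales
total = n_layers * per_layer + 2 * vocab * d_model + d_model  # + token_embedder, logits_dense, decoder_norm
print(total / 1e9)  # ≈ 1.104e9; the small FP8 amax/scale tensors above bring it to the reported 1.105 billion
```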

I0420 14:35:00.700753 135337422638208 train.py:157] train/xent Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0420 14:35:00.700845 135337422638208 train.py:157] train/xent Physical: float32[8,2048]............................................. ('fsdp', None).
I0420 14:35:00.715340 135337422638208 train.py:164] train/z_loss Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0420 14:35:00.715393 135337422638208 train.py:164] train/z_loss Physical: float32[8,2048]............................................. ('fsdp', None).
I0420 14:35:13.263944 135337422638208 max_utils.py:791] Total memory size: 3.5 GB, Output size: 1.5 GB, Temp size: 1.9 GB, Argument size: 1.5 GB, Host temp size: 0.0 GB.
I0420 14:35:13.264810 135337422638208 max_utils.py:194] tensorboardX not available; using no-op SummaryWriter.
I0420 14:35:13.265286 135337422638208 metric_logger.py:301] number parameters: 1.105 billion
I0420 14:35:30.024318 135337422638208 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0420 14:35:30.152127 135337422638208 checkpointing.py:798] Waited 0.12779593467712402 seconds for step 0 to finish before starting checkpointing.
I0420 14:35:30.152799 135337422638208 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0420 14:35:30.152983 135337422638208 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 0
I0420 14:35:30.153524 135337422638208 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0.
I0420 14:35:30.249921 135337422638208 signaling_client.py:373] Using ThreadSafeKeyValueSignalingClient
I0420 14:35:30.339092 135222255420992 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0
I0420 14:35:30.351279 135337422638208 jax_array_handlers.py:347] Scheduling D2H of 81 prioritized jax.Array.
I0420 14:35:30.351385 135337422638208 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0420 14:35:31.050977 135222244935232 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0/items
I0420 14:35:31.211580 135222205089344 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776695730967079123, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776695730967079123, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0/_CHECKPOINT_METADATA
I0420 14:35:31.966053 1050442 google_auth_provider.cc:149] Using credentials at ~/.config/gcloud/application_default_credentials.json
I0420 14:35:31.966113 1050442 google_auth_provider.cc:156] Using OAuth2 AuthProvider
I0420 14:35:32.394730 135337422638208 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 2.043992s
I0420 14:35:32.402410 135337422638208 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 5.737 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 2 seconds) (per-host)
I0420 14:35:32.402529 135337422638208 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 2.151431s (batch_requests_ready=0.092714s, total_serialization_initiated=2.051108s, others=0.007609s)
I0420 14:35:32.402614 135337422638208 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 2.152100s (all_items=0.000024s, per_item={'items': '0.00002408'}, temp_paths=2.152076)
I0420 14:35:32.404140 135222123300416 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0420 14:35:32.404249 135337422638208 async_checkpointer.py:561] Finished blocking save. Time taken: 2.251221s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0.
I0420 14:35:32.404449 135337422638208 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0420 14:35:32.404623 135222265906752 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0420 14:35:32.404721 135337422638208 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776695730.1527843, 'wait_for_prev_duration_secs': 4.506111145019531e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776695730.1530056, 'checkpointer_blocking_duration_secs': 2.2513580322265625, 'get_old_steps_start_time': 1776695732.4043832, 'get_old_steps_duration_secs': 2.8371810913085938e-05, 'checkpoint_manager_blocking_start_time': 1776695730.1526763, 'checkpoint_manager_blocking_duration_secs': 2.252018451690674}
I0420 14:35:32.404844 135337422638208 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
I0420 14:35:32.404892 135337422638208 max_utils.py:750] 
Memstats: After params initialized:
I0420 14:35:32.404943 135337422638208 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_0(process=0,(0,0,0,0))
I0420 14:35:32.404968 135337422638208 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_1(process=0,(1,0,0,0))
I0420 14:35:32.404987 135337422638208 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_2(process=0,(0,1,0,0))
I0420 14:35:32.405006 135337422638208 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_3(process=0,(1,1,0,0))
I0420 14:35:32.405024 135337422638208 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_4(process=0,(0,2,0,0))
I0420 14:35:32.405041 135337422638208 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_5(process=0,(1,2,0,0))
I0420 14:35:32.405073 135337422638208 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_6(process=0,(0,3,0,0))
I0420 14:35:32.405090 135337422638208 max_utils.py:756] 	Using (GB) 1.61 / 31.25 (5.152000%) on TPU_7(process=0,(1,3,0,0))
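A back-of-envelope check on the ~1.61 GB per device reported above, assuming fp32 parameters plus an AdamW-style optimizer with two fp32 moment buffers (three full copies of the weights), all sharded over the 8-way fsdp axis; the same three copies are what the checkpoint serializes (the "total gbytes: 12.3 GiB" lines).

```python
n_params, fp32_bytes, copies, n_devices = 1.105e9, 4, 3, 8
total_bytes = n_params * fp32_bytes * copies   # ≈ 13.3e9 bytes ≈ 12.3 GiB, matching the checkpoint payload
print(total_bytes / n_devices / 2**30)         # ≈ 1.54 GiB/device, in the ballpark of the 1.61 GB logged
```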
I0420 14:35:32.865178 135337422638208 metric_logger.py:196] completed step: 0, seconds: 16.758, TFLOP/s/device: 0.811, Tokens/s/device: 122.208, total_weights: 13328, loss: 10.844, lm_loss: 10.844, perplexity: 51212.484
I0420 14:35:32.866191 135337422638208 metric_logger.py:281] To see full metrics 'tensorboard --logdir=gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/tensorboard/'
I0420 14:35:32.971439 135337422638208 metric_logger.py:196] completed step: 1, seconds: 2.837, TFLOP/s/device: 4.790, Tokens/s/device: 721.929, total_weights: 12332, loss: 10.871, lm_loss: 10.871, perplexity: 52634.344
I0420 14:35:33.081948 135337422638208 metric_logger.py:196] completed step: 2, seconds: 0.012, TFLOP/s/device: 1092.740, Tokens/s/device: 164709.667, total_weights: 15161, loss: 10.871, lm_loss: 10.871, perplexity: 52617.816
I0420 14:35:33.173391 135222223963712 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0/items/array_metadatas/process_0
I0420 14:35:33.192338 135337422638208 metric_logger.py:196] completed step: 3, seconds: 0.108, TFLOP/s/device: 125.976, Tokens/s/device: 18988.457, total_weights: 13327, loss: 10.857, lm_loss: 10.857, perplexity: 51888.457
I0420 14:35:56.385498 135222144271936 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 23.982216s (commit=23.539574s, array_metadata_write=0.442642s)
I0420 14:35:56.386926 135222123300416 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 483.556 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 26 seconds) (per-host)
I0420 14:35:56.387066 135222123300416 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 23.982769s.
I0420 14:35:56.618220 135222123300416 checkpoint.py:228] Read Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776695730967079123, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} from gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0/_CHECKPOINT_METADATA
I0420 14:35:56.851248 135222123300416 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0420 14:35:57.032083 135222205089344 checkpoint.py:247] Updated Metadata={'item_handlers': {'items': 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler'}, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776695730967079123, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} to gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0/_CHECKPOINT_METADATA
I0420 14:35:57.105611 135337422638208 metric_logger.py:196] completed step: 4, seconds: 0.110, TFLOP/s/device: 123.062, Tokens/s/device: 18549.212, total_weights: 11939, loss: 10.861, lm_loss: 10.861, perplexity: 52094.562
I0420 14:35:57.115503 135337422638208 metric_logger.py:196] completed step: 5, seconds: 0.111, TFLOP/s/device: 122.755, Tokens/s/device: 18502.959, total_weights: 15502, loss: 10.882, lm_loss: 10.882, perplexity: 53230.324
I0420 14:35:57.224097 135337422638208 metric_logger.py:196] completed step: 6, seconds: 23.911, TFLOP/s/device: 0.568, Tokens/s/device: 85.651, total_weights: 13864, loss: 10.886, lm_loss: 10.886, perplexity: 53423.605
I0420 14:35:57.240285 135222123300416 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0420 14:35:57.240835 135222123300416 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.581857s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0/items
I0420 14:35:57.241528 135222123300416 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0/items
I0420 14:35:57.334648 135337422638208 metric_logger.py:196] completed step: 7, seconds: 0.008, TFLOP/s/device: 1782.620, Tokens/s/device: 268695.880, total_weights: 12988, loss: 10.829, lm_loss: 10.829, perplexity: 50459.859
I0420 14:35:57.444977 135337422638208 metric_logger.py:196] completed step: 8, seconds: 0.110, TFLOP/s/device: 124.039, Tokens/s/device: 18696.537, total_weights: 13820, loss: 10.848, lm_loss: 10.848, perplexity: 51456.266
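The per-step timings above swing from 0.008 s to 23.9 s largely because JAX dispatches steps asynchronously: a step that merely enqueues work appears nearly free (hence the implausible 1000+ TFLOP/s readings), while the step whose result is actually waited on absorbs the accumulated device and checkpoint-commit time (step 6 at 23.911 s). The averaged figures in the summary table are the meaningful ones; when timing a jitted step in isolation, blocking on the result avoids this artifact, as in the sketch below.

```python
import time
import jax
import jax.numpy as jnp

@jax.jit
def step(x):
    return jnp.tanh(x @ x)            # stand-in for a real train step

x = jnp.ones((2048, 2048))
step(x).block_until_ready()           # warm-up so compilation is excluded

t0 = time.perf_counter()
step(x).block_until_ready()           # block until the device has produced the result
print(time.perf_counter() - t0)       # without block_until_ready this would only time the dispatch
```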
I0420 14:35:57.452083 135337422638208 checkpointing.py:794] Waiting for step 10 to finish before checkpoint...
I0420 14:35:57.482949 135222123300416 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0
I0420 14:35:57.665590 135337422638208 checkpointing.py:798] Waited 0.21346688270568848 seconds for step 10 to finish before starting checkpointing.
I0420 14:35:57.666108 135337422638208 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0420 14:35:58.156497 135222123300416 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0`.
I0420 14:35:58.157244 135222123300416 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 28.004224s. directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/0
I0420 14:35:58.157329 135222123300416 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 25.753034s.
I0420 14:35:58.157565 135222265906752 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0420 14:35:58.157699 135222265906752 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0420 14:35:58.157756 135222265906752 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0420 14:35:58.157810 135222265906752 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0420 14:35:58.157926 135337422638208 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
I0420 14:35:58.158166 135337422638208 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 10
I0420 14:35:58.158426 135337422638208 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/10.
I0420 14:35:58.335930 135222123300416 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/10
I0420 14:35:58.350982 135337422638208 jax_array_handlers.py:347] Scheduling D2H of 81 prioritized jax.Array.
I0420 14:35:58.351103 135337422638208 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0420 14:35:58.983905 135222112814656 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/10/items
I0420 14:36:04.461129 135337422638208 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 6.110634s
I0420 14:36:04.468345 135337422638208 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 1.985 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 6 seconds) (per-host)
I0420 14:36:04.468418 135337422638208 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 6.218646s (batch_requests_ready=0.094511s, total_serialization_initiated=6.117026s, others=0.007108s)
I0420 14:36:04.468496 135337422638208 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 6.219233s (all_items=0.000019s, per_item={'items': '0.00001884'}, temp_paths=6.219214)
I0420 14:36:04.470879 135222123300416 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0420 14:36:04.470963 135337422638208 async_checkpointer.py:561] Finished blocking save. Time taken: 6.312751s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/10.
I0420 14:36:04.471172 135337422638208 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=10] Starting CheckpointManager Save Finalize thread=save_finalize
I0420 14:36:04.471409 135222223963712 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0420 14:36:04.471530 135337422638208 standard_logger.py:34] {'step': 10, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776695757.6660762, 'wait_for_prev_duration_secs': 0.49190545082092285, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776695758.1581929, 'checkpointer_blocking_duration_secs': 6.312890529632568, 'get_old_steps_start_time': 1776695764.4711018, 'get_old_steps_duration_secs': 3.123283386230469e-05, 'checkpoint_manager_blocking_start_time': 1776695757.6659858, 'checkpoint_manager_blocking_duration_secs': 6.805518865585327}
I0420 14:36:04.471754 135337422638208 checkpointing.py:409] Started an asynchronous checkpoint save for step 10
I0420 14:36:04.472429 135337422638208 metric_logger.py:196] completed step: 9, seconds: 0.111, TFLOP/s/device: 122.879, Tokens/s/device: 18521.701, total_weights: 12300, loss: 10.857, lm_loss: 10.857, perplexity: 51900.086
I0420 14:36:04.482439 135337422638208 metric_logger.py:196] completed step: 10, seconds: 0.110, TFLOP/s/device: 123.148, Tokens/s/device: 18562.158, total_weights: 13436, loss: 10.876, lm_loss: 10.876, perplexity: 52905.383
I0420 14:36:04.589745 135337422638208 metric_logger.py:196] completed step: 11, seconds: 7.028, TFLOP/s/device: 1.933, Tokens/s/device: 291.421, total_weights: 13817, loss: 10.853, lm_loss: 10.853, perplexity: 51714.605
I0420 14:36:04.700125 135337422638208 metric_logger.py:196] completed step: 12, seconds: 0.008, TFLOP/s/device: 1604.337, Tokens/s/device: 241823.120, total_weights: 12076, loss: 10.852, lm_loss: 10.852, perplexity: 51622.902
I0420 14:36:04.810446 135337422638208 metric_logger.py:196] completed step: 13, seconds: 0.109, TFLOP/s/device: 125.003, Tokens/s/device: 18841.886, total_weights: 11048, loss: 10.864, lm_loss: 10.864, perplexity: 52261.953
I0420 14:36:04.920795 135337422638208 metric_logger.py:196] completed step: 14, seconds: 0.110, TFLOP/s/device: 123.104, Tokens/s/device: 18555.599, total_weights: 14267, loss: 10.834, lm_loss: 10.834, perplexity: 50738.680
I0420 14:36:05.031036 135337422638208 metric_logger.py:196] completed step: 15, seconds: 0.110, TFLOP/s/device: 123.024, Tokens/s/device: 18543.502, total_weights: 13004, loss: 10.832, lm_loss: 10.832, perplexity: 50634.074
I0420 14:36:05.141405 135337422638208 metric_logger.py:196] completed step: 16, seconds: 0.110, TFLOP/s/device: 123.087, Tokens/s/device: 18553.077, total_weights: 13811, loss: 10.867, lm_loss: 10.867, perplexity: 52422.980
I0420 14:36:05.226071 135222133786176 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/10/items/array_metadatas/process_0
I0420 14:36:05.251710 135337422638208 metric_logger.py:196] completed step: 17, seconds: 0.110, TFLOP/s/device: 123.243, Tokens/s/device: 18576.469, total_weights: 13709, loss: 10.850, lm_loss: 10.850, perplexity: 51525.906
I0420 14:36:05.361966 135337422638208 metric_logger.py:196] completed step: 18, seconds: 0.110, TFLOP/s/device: 123.178, Tokens/s/device: 18566.701, total_weights: 12831, loss: 10.839, lm_loss: 10.839, perplexity: 50989.945
I0420 14:36:05.472038 135337422638208 checkpointing.py:794] Waiting for step 19 to finish before checkpoint...
I0420 14:36:05.473764 135337422638208 checkpointing.py:798] Waited 0.0017545223236083984 seconds for step 19 to finish before starting checkpointing.
I0420 14:36:05.474088 135337422638208 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=10][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0420 14:36:30.922815 135222112814656 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 26.453689s (commit=26.027347s, array_metadata_write=0.426342s)
I0420 14:36:30.923967 135222123300416 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 386.792 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 32 seconds) (per-host)
I0420 14:36:30.924026 135222123300416 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 26.453018s.
I0420 14:36:31.475257 135222123300416 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0420 14:36:31.889145 135222123300416 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0420 14:36:31.889873 135222123300416 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.554931s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/10/items
I0420 14:36:31.890681 135222123300416 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/10/items
I0420 14:36:32.122090 135222123300416 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/10
I0420 14:36:32.776033 135222123300416 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/10`.
I0420 14:36:32.776754 135222123300416 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 34.618547s. directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/10
I0420 14:36:32.776844 135222123300416 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 28.305838s.
I0420 14:36:32.777010 135222223963712 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0420 14:36:32.777132 135222223963712 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0420 14:36:32.777185 135222223963712 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=10] CheckpointManager Save Finalize is syncing with other hosts...
I0420 14:36:32.777233 135222223963712 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=10] CheckpointManager Save Finalize is done on all hosts.
I0420 14:36:32.777401 135337422638208 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=10][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=10.
W0420 14:36:32.777531 135337422638208 checkpoint_manager.py:1441] Waiting for previous save to complete took 27.303449 seconds. If this number is high, consider checkpointing less frequently.
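The warning above is the cost of checkpointing every 10 steps in a run this short: each save moves ~12.3 GiB per host to GCS at the ~400 MiB/s background rate logged under /jax/checkpoint/write/gbytes_per_sec, so a commit takes roughly half a minute while the 10 intervening training steps take about a second, and the main thread ends up waiting ~27 s for the previous save before the next one can start.

```python
ckpt_gib, write_mib_per_s = 12.3, 400            # per-host payload and observed background write rate
steps_per_ckpt, sec_per_step = 10, 0.11          # save interval and steady-state step time
commit_sec = ckpt_gib * 1024 / write_mib_per_s   # ≈ 31 s per background commit
compute_sec = steps_per_ckpt * sec_per_step      # ≈ 1.1 s of training between saves
print(commit_sec, compute_sec)                   # commits cannot keep up, hence the 27 s wait reported above
```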
I0420 14:36:32.778247 135337422638208 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 19
I0420 14:36:32.778573 135337422638208 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/19.
I0420 14:36:32.938914 135222223963712 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/19
I0420 14:36:32.952609 135337422638208 jax_array_handlers.py:347] Scheduling D2H of 81 prioritized jax.Array.
I0420 14:36:32.952709 135337422638208 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0420 14:36:33.646913 135222265906752 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/19/items
I0420 14:36:38.788471 135337422638208 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 5.836351s
I0420 14:36:38.796175 135337422638208 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 2.079 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 5 seconds) (per-host)
I0420 14:36:38.796236 135337422638208 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 5.936918s (batch_requests_ready=0.086805s, total_serialization_initiated=5.842538s, others=0.007575s)
I0420 14:36:38.796317 135337422638208 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 5.937482s (all_items=0.000012s, per_item={'items': '0.00001216'}, temp_paths=5.937470)
I0420 14:36:38.798013 135223509517888 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0420 14:36:38.798119 135337422638208 async_checkpointer.py:561] Finished blocking save. Time taken: 6.019830s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/19.
I0420 14:36:38.798292 135337422638208 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=19] Starting CheckpointManager Save Finalize thread=save_finalize
I0420 14:36:38.798471 135222123300416 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0420 14:36:38.798588 135337422638208 standard_logger.py:34] {'step': 19, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776695765.4740608, 'wait_for_prev_duration_secs': 27.303448915481567, 'time_between_consecutive_saves_sec': 7.316235303878784, 'checkpointer_blocking_start_time': 1776695792.7782705, 'checkpointer_blocking_duration_secs': 6.019958019256592, 'get_old_steps_start_time': 1776695798.7982416, 'get_old_steps_duration_secs': 1.7642974853515625e-05, 'checkpoint_manager_blocking_start_time': 1776695765.4740114, 'checkpoint_manager_blocking_duration_secs': 33.32455253601074}
I0420 14:36:38.798792 135337422638208 checkpointing.py:409] Started an asynchronous checkpoint save for step 19
I0420 14:36:38.798827 135337422638208 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=19][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0420 14:36:39.564364 135231445141056 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/19/items/array_metadatas/process_0
I0420 14:37:04.093059 135222112814656 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 25.295938s (commit=24.830667s, array_metadata_write=0.465270s)
I0420 14:37:04.094201 135223509517888 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 404.615 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 31 seconds) (per-host)
I0420 14:37:04.094253 135223509517888 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 25.296089s.
I0420 14:37:04.488839 135223509517888 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0420 14:37:04.941448 135223509517888 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0420 14:37:04.942291 135223509517888 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.594600s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/19/items
I0420 14:37:04.943115 135223509517888 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/19/items
I0420 14:37:05.183754 135223509517888 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/19
I0420 14:37:05.879443 135223509517888 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/19`.
I0420 14:37:05.880194 135223509517888 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 33.101912s. directory=gs://wanglance-maxtext/linen_ckpt_feat_nnx_trainstate_and_training_loop_20260420_142345/linen_feat_nnx_trainstate_and_training_loop_20260420_142345_05_fp8/checkpoints/19
I0420 14:37:05.880259 135223509517888 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 27.082096s.
I0420 14:37:05.880462 135222123300416 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0420 14:37:05.880584 135222123300416 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0420 14:37:05.880641 135222123300416 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=19] CheckpointManager Save Finalize is syncing with other hosts...
I0420 14:37:05.880686 135222123300416 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=19] CheckpointManager Save Finalize is done on all hosts.
I0420 14:37:05.880853 135337422638208 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=19][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=19.
I0420 14:37:05.881000 135337422638208 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0420 14:37:05.881902 135337422638208 metric_logger.py:196] completed step: 19, seconds: 0.110, TFLOP/s/device: 123.159, Tokens/s/device: 18563.840, total_weights: 13717, loss: 10.851, lm_loss: 10.851, perplexity: 51591.992
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
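The per-step FLOP count is consistent with the usual 6·N·T estimate for decoder-only training (about 6 FLOPs per parameter per token for forward plus backward): with N ≈ 1.105e9 parameters and a global batch of 8 × 2048 tokens, that gives roughly 109 TFLOP per step, i.e. ≈ 13.6 TFLOP per device across 8 devices, matching the 13.59 reported above. The 93.93% / 6.07% split separates those weight matmuls from the quadratic attention-score and attention-context matmuls.

```python
n_params = 1.105e9                    # reported by metric_logger above
tokens_per_step = 8 * 2048            # global batch: 8 sequences of 2048 tokens
total_tflop = 6 * n_params * tokens_per_step / 1e12
print(total_tflop, total_tflop / 8)   # ≈ 108.6 total, ≈ 13.6 per device vs. the 13.59 logged
```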