MaxView

Case: 05_fp8

Metrics: main (b117f50cf) vs feat/nnx-trainstate-and-training-loop (f093f3730)

Metric      | main (b117f50cf) | feat/nnx-trainstate-and-training-loop (f093f3730) | Diff (branch − main)
------------|------------------|---------------------------------------------------|---------------------
Parameters  | 1.105 billion    | 1.105 billion                                     | —
Final loss  | 6.1670           | 6.1670                                            | 0
TFLOP/s     | 86.015           | 85.801                                            | -0.214
Tok/s       | 12965.1          | 12932.8                                           | -32.34
Avg s/step  | 1.649            | 1.597                                             | -0.052
Memory %    | 1.44             | 1.44                                              | 0
JAX         | 0.9.2            | 0.9.2                                             | —

Diff = branch value − main value. A negative diff means the branch is lower on that metric: a regression for throughput metrics (TFLOP/s, Tok/s) and an improvement for Avg s/step.
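The Diff column is a plain subtraction of the two branch values. A minimal sketch, using values copied from the table above, recomputes it (the Tok/s diff comes out as -32.3 here rather than the displayed -32.34, presumably because the dashboard subtracts before rounding):

```python
# Recompute the Diff column (branch value - main value) from the table above.
metrics = {
    # metric: (main @ b117f50cf, feat/nnx-trainstate-and-training-loop @ f093f3730)
    "TFLOP/s":    (86.015, 85.801),
    "Tok/s":      (12965.1, 12932.8),
    "Avg s/step": (1.649, 1.597),
    "Memory %":   (1.44, 1.44),
}

for name, (main_val, branch_val) in metrics.items():
    diff = round(branch_val - main_val, 3)
    print(f"{name}: {diff:+g}")
```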

main  ·  b117f50cf  ·  main_20260424_070227
XPK Start: Fri Apr 24 07:16:13 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
`rope_parameters`'s factor field must be a float >= 1, got 40
`rope_parameters`'s beta_fast field must be a float, got 32
`rope_parameters`'s beta_slow field must be a float, got 1
DeepseekV32Config got `key=rope_scaling` in kwargs but hasn't set it as attribute. For RoPE standardization you need to set `self.rope_parameters` in model's config. 
2026-04-24 07:16:38.906120: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0424 07:16:39.109947 135241581811520 max_utils.py:273] Attempting to initialize the jax distributed system...
I0424 07:16:48.150578 135241581811520 distributed.py:149] Starting JAX distributed service on [::]:8482
I0424 07:16:48.153087 135241581811520 distributed.py:172] Connecting to JAX distributed service on mt-05-fp8-amnlf-slice-job-0-0.mt-05-fp8-amnlf:8482
I0424 07:16:49.220168 135241581811520 max_utils.py:284] Jax distributed system initialized!
I0424 07:16:55.308476 135241581811520 max_utils.py:800] System Information: Jax Version: 0.9.2
I0424 07:16:55.308587 135241581811520 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0424 07:16:55.308628 135241581811520 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0424 07:16:55.308665 135241581811520 train_utils.py:361] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0424 07:16:56.013766 135241581811520 maxtext_utils.py:1604] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1)
I0424 07:16:56.014056 135241581811520 checkpointing.py:677] Setting up checkpoint logger...
I0424 07:16:56.014136 135241581811520 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0424 07:16:56.014193 135241581811520 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0424 07:16:56.014539 135241581811520 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7affaab51cd0>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0424 07:16:58.872215 135241581811520 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0424 07:16:58.872453 135241581811520 checkpoint_manager.py:708] [process=6][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7aeba0680770>}, handler_registry=None
I0424 07:16:58.872694 135241581811520 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7aeba0680770>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0424 07:16:58.872743 135241581811520 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7aeba06819d0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0424 07:16:58.872780 135241581811520 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7aeba0680770>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7aeba0680770>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7aeba06819d0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7aeba06819d0>}).
I0424 07:16:58.873124 135241581811520 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0424 07:16:58.873197 135241581811520 async_checkpointer.py:192] [process=6][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7aeb70271940> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0424 07:16:59.954445 135241581811520 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints
I0424 07:17:00.042321 135241581811520 checkpoint_manager.py:929] [process=6][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7aeba067ec00>
I0424 07:17:00.042496 135241581811520 checkpointing.py:301] Checkpoint manager created!
I0424 07:17:01.127010 135241581811520 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0424 07:17:01.127144 135241581811520 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 07:17:01.514118 135241581811520 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch_attn', 'activation_length_attn', 'activation_embed_attn').
I0424 07:17:01.514217 135241581811520 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 07:17:01.530772 135241581811520 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch_attn', 'activation_length_attn', 'activation_embed_attn').
I0424 07:17:01.530834 135241581811520 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 07:17:01.583370 135241581811520 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_length_attn', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 07:17:01.583455 135241581811520 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 07:17:01.600032 135241581811520 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_length_attn', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 07:17:01.600108 135241581811520 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 07:17:01.616657 135241581811520 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_length_attn', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 07:17:01.616721 135241581811520 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 07:17:01.641880 135241581811520 attentions.py:1198] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch_attn', 'activation_length_attn', 'activation_heads', 'activation_kv').
I0424 07:17:01.641951 135241581811520 attentions.py:1198] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 07:17:01.693524 135241581811520 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0424 07:17:01.693605 135241581811520 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0424 07:17:02.130938 135241581811520 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0424 07:17:02.131054 135241581811520 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0424 07:17:04.067434 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067562 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067613 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067655 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067691 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067724 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067755 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067786 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067817 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067847 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067876 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067904 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067937 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.067975 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068006 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068034 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068062 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068090 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068133 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068163 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068192 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068219 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068250 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068278 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068304 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068330 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068356 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068383 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068409 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068435 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068462 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068487 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068515 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068541 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068567 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068594 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068619 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068645 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068670 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068696 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068723 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068750 135241581811520 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 07:17:04.068801 135241581811520 maxtext_utils.py:1707]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0424 07:17:04.068854 135241581811520 maxtext_utils.py:1707]  params/params/decoder/layers/mlp/wi_0/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0424 07:17:04.068892 135241581811520 maxtext_utils.py:1707]  params/params/decoder/layers/mlp/wi_1/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0424 07:17:04.068939 135241581811520 maxtext_utils.py:1707]  params/params/decoder/layers/mlp/wo/kernel
    Shape:     float32[7168,16,2048]
    Logical:   P('mlp', 'layers', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 07:17:04.068984 135241581811520 maxtext_utils.py:1707]  params/params/decoder/layers/post_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0424 07:17:04.069017 135241581811520 maxtext_utils.py:1707]  params/params/decoder/layers/pre_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0424 07:17:04.069067 135241581811520 maxtext_utils.py:1707]  params/params/decoder/layers/self_attention/key/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0424 07:17:04.069157 135241581811520 maxtext_utils.py:1707]  params/params/decoder/layers/self_attention/out/kernel
    Shape:     float32[16,16,128,2048]
    Logical:   P('heads', 'layers', 'kv', 'embed')
    Physical:  (None, None, None, 'fsdp')
I0424 07:17:04.069201 135241581811520 maxtext_utils.py:1707]  params/params/decoder/layers/self_attention/query/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None, None)
I0424 07:17:04.069240 135241581811520 maxtext_utils.py:1707]  params/params/decoder/layers/self_attention/value/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0424 07:17:04.069286 135241581811520 maxtext_utils.py:1707]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0424 07:17:04.069328 135241581811520 maxtext_utils.py:1707]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')

I0424 07:17:05.690122 135241581811520 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0424 07:17:05.690218 135241581811520 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0424 07:17:05.706003 135241581811520 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0424 07:17:05.706061 135241581811520 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0424 07:17:19.373075 135241581811520 max_utils.py:791] Total memory size: 1.5 GB, Output size: 0.4 GB, Temp size: 1.1 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0424 07:17:19.373909 135241581811520 metric_logger.py:301] number parameters: 1.105 billion
I0424 07:17:34.326419 135241581811520 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0424 07:17:34.498445 135241581811520 checkpointing.py:776] Waited 0.17200279235839844 seconds for step 0 to finish before starting checkpointing.
I0424 07:17:34.500856 135241581811520 checkpoint_manager.py:2009] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0424 07:17:34.502762 135241581811520 checkpoint_manager.py:1512] [process=6] Saving checkpoint at step 0
I0424 07:17:34.504157 135241581811520 event_tracking.py:70] [process=6] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints/0.
I0424 07:17:35.047203 135241581811520 signaling_client.py:364] Using JaxDistributedSignalingClient
I0424 07:17:35.048027 135241581811520 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0424 07:17:35.048084 135241581811520 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0424 07:17:35.461541 135241581811520 base_pytree_checkpoint_handler.py:154] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.414474s
I0424 07:17:35.461718 135241581811520 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/blocking_gbytes_per_sec: 3.647 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.4229912757873535 s) (per-host)
I0424 07:17:35.461777 135241581811520 base_pytree_checkpoint_handler.py:768] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.423056s (batch_requests_ready=0.003723s, total_serialization_initiated=0.419260s, others=0.000073s)
I0424 07:17:35.461917 135241581811520 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.427153s (all_items=0.000017s, per_item={'items': '0.00001693'}, temp_paths=0.427136)
I0424 07:17:35.462728 135241581811520 event_tracking.py:125] [process=6] [async] Finished blocking save in 0.96 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints/0.
I0424 07:17:35.463014 135112326481664 async_checkpointer.py:76] [process=6][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-24 07:37:35.462991
I0424 07:17:35.472735 135241581811520 checkpoint_manager.py:1560] [process=6][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0424 07:17:35.473038 135111788836608 async_checkpointer.py:280] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0424 07:17:35.473210 135241581811520 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777015054.500838, 'wait_for_prev_duration_secs': 6.0558319091796875e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1777015054.5028007, 'checkpointer_blocking_duration_secs': 0.960334300994873, 'get_old_steps_start_time': 1777015055.4631612, 'get_old_steps_duration_secs': 2.956390380859375e-05, 'checkpoint_manager_blocking_start_time': 1777015054.4990382, 'checkpoint_manager_blocking_duration_secs': 0.9741344451904297}
I0424 07:17:35.473321 135241581811520 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0424 07:17:35.473370 135241581811520 max_utils.py:750] 
Memstats: After params initialized:
I0424 07:17:35.473417 135241581811520 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_24(process=6,(0,6,0,0))
I0424 07:17:35.473449 135241581811520 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_25(process=6,(1,6,0,0))
I0424 07:17:35.473476 135241581811520 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_28(process=6,(0,7,0,0))
I0424 07:17:35.473499 135241581811520 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_29(process=6,(1,7,0,0))
I0424 07:17:35.786211 135241581811520 metric_logger.py:196] completed step: 0, seconds: 14.952, TFLOP/s/device: 0.909, Tokens/s/device: 136.968, total_weights: 65536, loss: 10.874, lm_loss: 10.874, perplexity: 52776.805
I0424 07:17:35.984260 135241581811520 metric_logger.py:196] completed step: 1, seconds: 1.458, TFLOP/s/device: 9.321, Tokens/s/device: 1404.939, total_weights: 65536, loss: 10.874, lm_loss: 10.874, perplexity: 52761.840
I0424 07:17:36.413328 135241581811520 metric_logger.py:196] completed step: 2, seconds: 0.041, TFLOP/s/device: 335.046, Tokens/s/device: 50501.812, total_weights: 65536, loss: 10.816, lm_loss: 10.816, perplexity: 49832.398
I0424 07:17:36.571463 135241581811520 metric_logger.py:196] completed step: 3, seconds: 0.429, TFLOP/s/device: 31.673, Tokens/s/device: 4774.126, total_weights: 65536, loss: 10.431, lm_loss: 10.431, perplexity: 33901.820
I0424 07:17:36.890049 135241581811520 metric_logger.py:196] completed step: 4, seconds: 0.165, TFLOP/s/device: 82.576, Tokens/s/device: 12446.821, total_weights: 65536, loss: 9.992, lm_loss: 9.992, perplexity: 21847.639
I0424 07:17:36.896487 135241581811520 metric_logger.py:196] completed step: 5, seconds: 0.158, TFLOP/s/device: 85.973, Tokens/s/device: 12958.745, total_weights: 65536, loss: 9.549, lm_loss: 9.549, perplexity: 14035.006
I0424 07:17:38.228162    2846 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0424 07:17:40.311240 135112304158464 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints/0/items/array_metadatas/process_6
I0424 07:17:55.498757 135112326481664 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/gbytes_per_sec: 77.212 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 20.460005521774292 s) (per-host)
I0424 07:17:55.498865 135112326481664 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 20.035777s.
I0424 07:18:01.565980 135241581811520 metric_logger.py:196] completed step: 6, seconds: 0.319, TFLOP/s/device: 42.601, Tokens/s/device: 6421.270, total_weights: 65536, loss: 9.111, lm_loss: 9.111, perplexity: 9058.688
I0424 07:18:01.724355 135241581811520 metric_logger.py:196] completed step: 7, seconds: 24.512, TFLOP/s/device: 0.554, Tokens/s/device: 83.549, total_weights: 65536, loss: 8.685, lm_loss: 8.685, perplexity: 5914.720
I0424 07:18:01.882939 135241581811520 metric_logger.py:196] completed step: 8, seconds: 0.163, TFLOP/s/device: 83.142, Tokens/s/device: 12532.126, total_weights: 65536, loss: 8.281, lm_loss: 8.281, perplexity: 3950.010
I0424 07:18:01.888053 135241581811520 checkpointing.py:772] Waiting for step 10 to finish before checkpoint...
I0424 07:18:02.199653 135241581811520 checkpointing.py:776] Waited 0.3115661144256592 seconds for step 10 to finish before starting checkpointing.
I0424 07:18:02.202641 135241581811520 checkpoint_manager.py:2020] [process=6][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0424 07:18:03.218018 135112326481664 async_checkpointer.py:160] [process=6][thread=async_save] Background save thread done. Time taken: 27.754913s.
I0424 07:18:03.218325 135111788836608 async_checkpointer.py:288] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0424 07:18:03.218441 135111788836608 async_checkpointer.py:298] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0424 07:18:03.218493 135111788836608 checkpoint_manager.py:2137] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0424 07:18:03.220208 135111788836608 checkpoint_manager.py:2146] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0424 07:18:03.220378 135241581811520 checkpoint_manager.py:2032] [process=6][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0424 07:18:03.220516 135241581811520 checkpoint_manager.py:1452] Waiting for previous save to complete took 1.017873 seconds. If this number is high, consider checkpointing less frequently.
I0424 07:18:03.222280 135241581811520 checkpoint_manager.py:1512] [process=6] Saving checkpoint at step 10
I0424 07:18:03.224312 135241581811520 event_tracking.py:70] [process=6] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints/10.
I0424 07:18:03.930805 135241581811520 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0424 07:18:03.930902 135241581811520 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0424 07:18:03.984689 135241581811520 base_pytree_checkpoint_handler.py:154] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.054982s
I0424 07:18:03.984860 135241581811520 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/blocking_gbytes_per_sec: 25.248 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.06110334396362305 s) (per-host)
I0424 07:18:03.984914 135241581811520 base_pytree_checkpoint_handler.py:768] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.061166s (batch_requests_ready=0.003020s, total_serialization_initiated=0.058077s, others=0.000069s)
I0424 07:18:03.985020 135241581811520 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.065445s (all_items=0.000015s, per_item={'items': '0.00001502'}, temp_paths=0.065430)
I0424 07:18:03.985814 135241581811520 event_tracking.py:125] [process=6] [async] Finished blocking save in 0.76 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints/10.
I0424 07:18:03.986178 135111788836608 async_checkpointer.py:76] [process=6][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-24 07:38:03.986140
I0424 07:18:03.988260 135241581811520 checkpoint_manager.py:1560] [process=6][thread=MainThread][step=10] Starting CheckpointManager Save Finalize thread=save_finalize
I0424 07:18:03.988488 135110695294720 async_checkpointer.py:280] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0424 07:18:03.988599 135241581811520 standard_logger.py:34] {'step': 10, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777015082.2026112, 'wait_for_prev_duration_secs': 1.0178725719451904, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1777015083.222326, 'checkpointer_blocking_duration_secs': 0.7640020847320557, 'get_old_steps_start_time': 1777015083.9863558, 'get_old_steps_duration_secs': 3.123283386230469e-05, 'checkpoint_manager_blocking_start_time': 1777015082.2005043, 'checkpoint_manager_blocking_duration_secs': 1.7880606651306152}
I0424 07:18:03.988771 135241581811520 checkpointing.py:408] Started an asynchronous checkpoint save for step 10
I0424 07:18:03.989463 135241581811520 metric_logger.py:196] completed step: 9, seconds: 0.158, TFLOP/s/device: 85.846, Tokens/s/device: 12939.586, total_weights: 65536, loss: 7.908, lm_loss: 7.908, perplexity: 2720.116
I0424 07:18:03.996210 135241581811520 metric_logger.py:196] completed step: 10, seconds: 0.158, TFLOP/s/device: 85.739, Tokens/s/device: 12923.582, total_weights: 65536, loss: 7.568, lm_loss: 7.568, perplexity: 1936.198
I0424 07:18:04.709203 135241581811520 metric_logger.py:196] completed step: 11, seconds: 2.106, TFLOP/s/device: 6.451, Tokens/s/device: 972.427, total_weights: 65536, loss: 7.288, lm_loss: 7.288, perplexity: 1463.272
I0424 07:18:04.867572 135241581811520 metric_logger.py:196] completed step: 12, seconds: 0.557, TFLOP/s/device: 24.397, Tokens/s/device: 3677.368, total_weights: 65536, loss: 7.032, lm_loss: 7.032, perplexity: 1132.061
I0424 07:18:05.025725 135241581811520 metric_logger.py:196] completed step: 13, seconds: 0.163, TFLOP/s/device: 83.272, Tokens/s/device: 12551.711, total_weights: 65536, loss: 6.815, lm_loss: 6.815, perplexity: 911.031
I0424 07:18:05.184021 135241581811520 metric_logger.py:196] completed step: 14, seconds: 0.158, TFLOP/s/device: 85.893, Tokens/s/device: 12946.702, total_weights: 65536, loss: 6.635, lm_loss: 6.635, perplexity: 761.080
I0424 07:18:05.342390 135241581811520 metric_logger.py:196] completed step: 15, seconds: 0.158, TFLOP/s/device: 85.888, Tokens/s/device: 12946.048, total_weights: 65536, loss: 6.492, lm_loss: 6.492, perplexity: 659.915
I0424 07:18:05.500866 135241581811520 metric_logger.py:196] completed step: 16, seconds: 0.158, TFLOP/s/device: 85.861, Tokens/s/device: 12941.957, total_weights: 65536, loss: 6.380, lm_loss: 6.380, perplexity: 589.948
I0424 07:18:05.658889 135241581811520 metric_logger.py:196] completed step: 17, seconds: 0.158, TFLOP/s/device: 85.724, Tokens/s/device: 12921.217, total_weights: 65536, loss: 6.293, lm_loss: 6.293, perplexity: 540.941
I0424 07:18:05.817164 135241581811520 metric_logger.py:196] completed step: 18, seconds: 0.158, TFLOP/s/device: 85.805, Tokens/s/device: 12933.539, total_weights: 65536, loss: 6.223, lm_loss: 6.223, perplexity: 504.108
I0424 07:18:05.974606 135241581811520 checkpointing.py:772] Waiting for step 19 to finish before checkpoint...
I0424 07:18:05.975650 135241581811520 checkpointing.py:776] Waited 0.0010619163513183594 seconds for step 19 to finish before starting checkpointing.
I0424 07:18:05.977705 135241581811520 checkpoint_manager.py:2020] [process=6][thread=MainThread][step=10][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0424 07:18:09.560707 135108572989184 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints/10/items/array_metadatas/process_6
I0424 07:18:46.978321 135111788836608 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/gbytes_per_sec: 36.692 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 43.05452346801758 s) (per-host)
I0424 07:18:46.978438 135111788836608 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.992144s.
I0424 07:18:56.202762 135111788836608 async_checkpointer.py:160] [process=6][thread=async_save] Background save thread done. Time taken: 52.216452s.
I0424 07:18:56.203009 135110695294720 async_checkpointer.py:288] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0424 07:18:56.203065 135110695294720 async_checkpointer.py:298] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0424 07:18:56.203136 135110695294720 checkpoint_manager.py:2137] [process=6][thread=save_finalize][step=10] CheckpointManager Save Finalize is syncing with other hosts...
I0424 07:18:56.205470 135110695294720 checkpoint_manager.py:2146] [process=6][thread=save_finalize][step=10] CheckpointManager Save Finalize is done on all hosts.
I0424 07:18:56.205667 135241581811520 checkpoint_manager.py:2032] [process=6][thread=MainThread][step=10][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=10.
W0424 07:18:56.205807 135241581811520 checkpoint_manager.py:1452] Waiting for previous save to complete took 50.228107 seconds. If this number is high, consider checkpointing less frequently.
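The 50-second wait flagged above is consistent with the background GCS commit rate reported a few lines earlier (roughly 37 MiB/s sustained for about 43 s per host). A quick back-of-the-envelope check, using values copied from the log — note the "1.5 GiB" payload figure is rounded in the log output, so the size is recovered here from rate × time:

```python
# Values copied from the per-host commit line above.
mib_per_s = 36.692                 # logged /jax/orbax/write/gbytes_per_sec
elapsed_s = 43.05452346801758      # logged time elapsed for the commit

# Actual per-host payload implied by rate x time (the log rounds it to "1.5 GiB").
payload_gib = mib_per_s * elapsed_s / 1024
print(round(payload_gib, 2))  # ~1.54 GiB, consistent with the rounded "1.5 GiB"
```

At this write rate, each checkpoint's background commit dominates the 1.6 s/step training loop, which is why the checkpoint manager suggests checkpointing less frequently.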
I0424 07:18:56.207977 135241581811520 checkpoint_manager.py:1512] [process=6] Saving checkpoint at step 19
I0424 07:18:56.210078 135241581811520 event_tracking.py:70] [process=6] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints/19.
I0424 07:18:56.492932 135241581811520 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0424 07:18:56.493024 135241581811520 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0424 07:18:56.546404 135241581811520 base_pytree_checkpoint_handler.py:154] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.054349s
I0424 07:18:56.546578 135241581811520 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/blocking_gbytes_per_sec: 25.534 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.060419321060180664 s) (per-host)
I0424 07:18:56.546629 135241581811520 base_pytree_checkpoint_handler.py:768] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.060486s (batch_requests_ready=0.002997s, total_serialization_initiated=0.057418s, others=0.000071s)
I0424 07:18:56.546733 135241581811520 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.064603s (all_items=0.000012s, per_item={'items': '0.00001192'}, temp_paths=0.064591)
I0424 07:18:56.547446 135241581811520 event_tracking.py:125] [process=6] [async] Finished blocking save in 0.34 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints/19.
I0424 07:18:56.547727 135110695294720 async_checkpointer.py:76] [process=6][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-24 07:38:56.547703
I0424 07:18:56.549678 135241581811520 checkpoint_manager.py:1560] [process=6][thread=MainThread][step=19] Starting CheckpointManager Save Finalize thread=save_finalize
I0424 07:18:56.549899 135110175168256 async_checkpointer.py:280] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0424 07:18:56.550027 135241581811520 standard_logger.py:34] {'step': 19, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777015085.977674, 'wait_for_prev_duration_secs': 50.22810745239258, 'time_between_consecutive_saves_sec': 2.757427215576172, 'checkpointer_blocking_start_time': 1777015136.2080166, 'checkpointer_blocking_duration_secs': 0.3398113250732422, 'get_old_steps_start_time': 1777015136.5478475, 'get_old_steps_duration_secs': 2.5272369384765625e-05, 'checkpoint_manager_blocking_start_time': 1777015085.975933, 'checkpoint_manager_blocking_duration_secs': 50.57406497001648}
I0424 07:18:56.550189 135241581811520 checkpointing.py:408] Started an asynchronous checkpoint save for step 19
I0424 07:18:56.550232 135241581811520 checkpoint_manager.py:2020] [process=6][thread=MainThread][step=19][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0424 07:19:01.647948 135108572989184 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260424_070227/linen_xpk_main_20260424_070227_05_fp8/checkpoints/19/items/array_metadatas/process_6
I0424 07:19:38.735715 135110695294720 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/gbytes_per_sec: 37.391 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.249523401260376 s) (per-host)
I0424 07:19:38.735821 135110695294720 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.188017s.
I0424 07:19:46.117866 135110695294720 async_checkpointer.py:160] [process=6][thread=async_save] Background save thread done. Time taken: 49.570046s.
I0424 07:19:46.118179 135110175168256 async_checkpointer.py:288] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0424 07:19:46.118307 135110175168256 async_checkpointer.py:298] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0424 07:19:46.118350 135110175168256 checkpoint_manager.py:2137] [process=6][thread=save_finalize][step=19] CheckpointManager Save Finalize is syncing with other hosts...
I0424 07:19:46.119984 135110175168256 checkpoint_manager.py:2146] [process=6][thread=save_finalize][step=19] CheckpointManager Save Finalize is done on all hosts.
I0424 07:19:46.120207 135241581811520 checkpoint_manager.py:2032] [process=6][thread=MainThread][step=19][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=19.
I0424 07:19:46.120360 135241581811520 checkpoint_manager.py:2009] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0424 07:19:46.121354 135241581811520 metric_logger.py:196] completed step: 19, seconds: 0.158, TFLOP/s/device: 86.015, Tokens/s/device: 12965.144, total_weights: 65536, loss: 6.167, lm_loss: 6.167, perplexity: 476.947
Per train step:
 Total TFLOPs: 13.59
 split as 93.93% learnable weight flops and 6.07% attention flops
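As a sanity check on the summary above, per-device throughput should equal per-step TFLOPs divided by step wall time; plugging in step 19's numbers from the log:

```python
# Values copied from the log: 13.59 TFLOPs per train step (summary above)
# and 0.158 s wall time for step 19.
tflops_per_step = 13.59
seconds_per_step = 0.158

# Per-device throughput implied by the step time.
print(round(tflops_per_step / seconds_per_step, 1))  # ~86.0, matching the logged 86.015 TFLOP/s/device
```

The slow steps interleaved with checkpoint saves (e.g. steps 6-7 and 11-12) drag the run's average seconds/step up to the 1.649 reported in the metrics table, even though steady-state steps complete in ~0.158 s.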
XPK End: Fri Apr 24 07:19:59 UTC 2026
EXIT_CODE=0
XPK Start: Fri Apr 24 09:22:26 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
`rope_parameters`'s factor field must be a float >= 1, got 40
`rope_parameters`'s beta_fast field must be a float, got 32
`rope_parameters`'s beta_slow field must be a float, got 1
DeepseekV32Config got `key=rope_scaling` in kwargs but hasn't set it as attribute. For RoPE standardization you need to set `self.rope_parameters` in model's config. 
2026-04-24 09:22:51.727103: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0424 09:22:51.937585 140196258953024 max_utils.py:273] Attempting to initialize the jax distributed system...
I0424 09:23:00.979468 140196258953024 distributed.py:149] Starting JAX distributed service on [::]:8482
I0424 09:23:00.981872 140196258953024 distributed.py:172] Connecting to JAX distributed service on mt-05-fp8-py7s1-slice-job-0-0.mt-05-fp8-py7s1:8482
I0424 09:23:02.958386 140196258953024 max_utils.py:284] Jax distributed system initialized!
I0424 09:23:09.067210 140196258953024 max_utils.py:800] System Information: Jax Version: 0.9.2
I0424 09:23:09.067317 140196258953024 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0424 09:23:09.067361 140196258953024 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0424 09:23:09.067397 140196258953024 train_utils.py:391] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0424 09:23:09.757509 140196258953024 maxtext_utils.py:1732] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0424 09:23:09.758111 140196258953024 maxtext_utils.py:1732] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0424 09:23:09.758301 140196258953024 checkpointing.py:688] Setting up checkpoint logger...
I0424 09:23:09.758353 140196258953024 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0424 09:23:09.758396 140196258953024 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0424 09:23:09.758750 140196258953024 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7f817c319070>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0424 09:23:12.816049 140196258953024 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0424 09:23:12.816284 140196258953024 checkpoint_manager.py:708] [process=5][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f6c204ae6c0>}, handler_registry=None
I0424 09:23:12.816519 140196258953024 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f6c204ae6c0>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0424 09:23:12.816566 140196258953024 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f6d040c9dc0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0424 09:23:12.816602 140196258953024 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f6c204ae6c0>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f6c204ae6c0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f6d040c9dc0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f6d040c9dc0>}).
I0424 09:23:12.816959 140196258953024 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0424 09:23:12.817031 140196258953024 async_checkpointer.py:192] [process=5][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7f6c404c9d00> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0424 09:23:13.853906 140196258953024 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints
I0424 09:23:13.856138 140196258953024 checkpoint_manager.py:929] [process=5][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7f6d040c7c20>
I0424 09:23:13.856249 140196258953024 checkpointing.py:302] Checkpoint manager created!
I0424 09:23:14.961193 140196258953024 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0424 09:23:14.961307 140196258953024 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 09:23:15.347503 140196258953024 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0424 09:23:15.347598 140196258953024 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 09:23:15.369243 140196258953024 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0424 09:23:15.369337 140196258953024 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0424 09:23:15.422025 140196258953024 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 09:23:15.422108 140196258953024 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 09:23:15.438721 140196258953024 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 09:23:15.438780 140196258953024 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 09:23:15.455378 140196258953024 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0424 09:23:15.455437 140196258953024 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 09:23:15.480733 140196258953024 attentions.py:1198] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0424 09:23:15.480798 140196258953024 attentions.py:1198] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0424 09:23:15.531865 140196258953024 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0424 09:23:15.532038 140196258953024 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0424 09:23:15.990643 140196258953024 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0424 09:23:15.990800 140196258953024 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0424 09:23:17.912533 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.912668 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.912720 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.912759 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.912795 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.912828 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.912860 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.912891 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.912920 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.912949 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.912984 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913017 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913047 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913076 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913105 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913133 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913162 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913191 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913218 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913246 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913273 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913300 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913327 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913354 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913381 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913407 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913434 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913461 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913488 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913516 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913542 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913569 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913596 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913622 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913648 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913691 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913719 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913746 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913775 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913804 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913831 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913859 140196258953024 maxtext_utils.py:1835]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0424 09:23:17.913910 140196258953024 maxtext_utils.py:1835]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0424 09:23:17.913988 140196258953024 maxtext_utils.py:1835]  params/params/decoder/layers/mlp/wi_0/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0424 09:23:17.914032 140196258953024 maxtext_utils.py:1835]  params/params/decoder/layers/mlp/wi_1/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0424 09:23:17.914082 140196258953024 maxtext_utils.py:1835]  params/params/decoder/layers/mlp/wo/kernel
    Shape:     float32[7168,16,2048]
    Logical:   P('mlp', 'layers', 'embed')
    Physical:  (None, None, 'fsdp')
I0424 09:23:17.914126 140196258953024 maxtext_utils.py:1835]  params/params/decoder/layers/post_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0424 09:23:17.914158 140196258953024 maxtext_utils.py:1835]  params/params/decoder/layers/pre_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0424 09:23:17.914222 140196258953024 maxtext_utils.py:1835]  params/params/decoder/layers/self_attention/key/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0424 09:23:17.914296 140196258953024 maxtext_utils.py:1835]  params/params/decoder/layers/self_attention/out/kernel
    Shape:     float32[16,16,128,2048]
    Logical:   P('heads', 'layers', 'kv', 'embed')
    Physical:  (None, None, None, 'fsdp')
I0424 09:23:17.914336 140196258953024 maxtext_utils.py:1835]  params/params/decoder/layers/self_attention/query/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None, None)
I0424 09:23:17.914370 140196258953024 maxtext_utils.py:1835]  params/params/decoder/layers/self_attention/value/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0424 09:23:17.914414 140196258953024 maxtext_utils.py:1835]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0424 09:23:17.914458 140196258953024 maxtext_utils.py:1835]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
I0424 09:23:19.536472 140196258953024 train.py:157] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0424 09:23:19.536570 140196258953024 train.py:157] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0424 09:23:19.552015 140196258953024 train.py:164] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0424 09:23:19.552076 140196258953024 train.py:164] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0424 09:23:33.320062 140196258953024 max_utils.py:791] Total memory size: 1.5 GB, Output size: 0.4 GB, Temp size: 1.1 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0424 09:23:33.320875 140196258953024 metric_logger.py:301] number parameters: 1.105 billion
I0424 09:23:48.616636 140196258953024 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0424 09:23:48.786233 140196258953024 checkpointing.py:798] Waited 0.1695706844329834 seconds for step 0 to finish before starting checkpointing.
I0424 09:23:48.788624 140196258953024 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0424 09:23:48.790459 140196258953024 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 0
I0424 09:23:48.791959 140196258953024 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints/0.
I0424 09:23:49.134750 140196258953024 signaling_client.py:364] Using JaxDistributedSignalingClient
I0424 09:23:49.135808 140196258953024 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0424 09:23:49.135923 140196258953024 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0424 09:23:49.546353 140196258953024 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.411734s
I0424 09:23:49.546519 140196258953024 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 3.675 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.4197962284088135 s) (per-host)
I0424 09:23:49.546569 140196258953024 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.419856s (batch_requests_ready=0.003329s, total_serialization_initiated=0.416460s, others=0.000067s)
I0424 09:23:49.546662 140196258953024 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.423874s (all_items=0.000017s, per_item={'items': '0.00001740'}, temp_paths=0.423856)
I0424 09:23:49.547414 140196258953024 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.76 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints/0.
I0424 09:23:49.547762 140066552411904 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-24 09:43:49.547725
I0424 09:23:49.563384 140196258953024 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0424 09:23:49.563730 140066051565312 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0424 09:23:49.563892 140196258953024 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777022628.788606, 'wait_for_prev_duration_secs': 7.176399230957031e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1777022628.7904968, 'checkpointer_blocking_duration_secs': 0.757415771484375, 'get_old_steps_start_time': 1777022629.5479352, 'get_old_steps_duration_secs': 3.600120544433594e-05, 'checkpoint_manager_blocking_start_time': 1777022628.7868502, 'checkpoint_manager_blocking_duration_secs': 0.7769999504089355}
I0424 09:23:49.564009 140196258953024 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
I0424 09:23:49.564061 140196258953024 max_utils.py:750] 
Memstats: After params initialized:
I0424 09:23:49.564113 140196258953024 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_18(process=5,(2,4,0,0))
I0424 09:23:49.564147 140196258953024 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_19(process=5,(3,4,0,0))
I0424 09:23:49.564174 140196258953024 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_22(process=5,(2,5,0,0))
I0424 09:23:49.564198 140196258953024 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_23(process=5,(3,5,0,0))
I0424 09:23:49.879487 140196258953024 metric_logger.py:196] completed step: 0, seconds: 15.296, TFLOP/s/device: 0.888, Tokens/s/device: 133.894, total_weights: 65536, loss: 10.874, lm_loss: 10.874, perplexity: 52776.805
I0424 09:23:50.063965 140196258953024 metric_logger.py:196] completed step: 1, seconds: 1.261, TFLOP/s/device: 10.776, Tokens/s/device: 1624.213, total_weights: 65536, loss: 10.874, lm_loss: 10.874, perplexity: 52761.840
I0424 09:23:50.493400 140196258953024 metric_logger.py:196] completed step: 2, seconds: 0.027, TFLOP/s/device: 505.699, Tokens/s/device: 76224.505, total_weights: 65536, loss: 10.816, lm_loss: 10.816, perplexity: 49832.398
I0424 09:23:50.651612 140196258953024 metric_logger.py:196] completed step: 3, seconds: 0.430, TFLOP/s/device: 31.616, Tokens/s/device: 4765.506, total_weights: 65536, loss: 10.431, lm_loss: 10.431, perplexity: 33901.820
I0424 09:23:50.969015 140196258953024 metric_logger.py:196] completed step: 4, seconds: 0.164, TFLOP/s/device: 82.807, Tokens/s/device: 12481.564, total_weights: 65536, loss: 9.992, lm_loss: 9.992, perplexity: 21847.639
I0424 09:23:50.975491 140196258953024 metric_logger.py:196] completed step: 5, seconds: 0.158, TFLOP/s/device: 85.943, Tokens/s/device: 12954.318, total_weights: 65536, loss: 9.549, lm_loss: 9.549, perplexity: 14035.006
I0424 09:23:52.793588    2816 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0424 09:23:54.883628 140066059958016 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints/0/items/array_metadatas/process_5
I0424 09:24:10.252106 140066552411904 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 74.780 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 21.125354766845703 s) (per-host)
I0424 09:24:10.252225 140066552411904 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 20.704350s.
I0424 09:24:15.155403 140196258953024 metric_logger.py:196] completed step: 6, seconds: 0.318, TFLOP/s/device: 42.740, Tokens/s/device: 6442.257, total_weights: 65536, loss: 9.111, lm_loss: 9.111, perplexity: 9058.688
I0424 09:24:15.313784 140196258953024 metric_logger.py:196] completed step: 7, seconds: 24.023, TFLOP/s/device: 0.566, Tokens/s/device: 85.252, total_weights: 65536, loss: 8.685, lm_loss: 8.685, perplexity: 5914.720
I0424 09:24:15.472202 140196258953024 metric_logger.py:196] completed step: 8, seconds: 0.163, TFLOP/s/device: 83.206, Tokens/s/device: 12541.719, total_weights: 65536, loss: 8.281, lm_loss: 8.281, perplexity: 3950.010
I0424 09:24:15.477278 140196258953024 checkpointing.py:794] Waiting for step 10 to finish before checkpoint...
I0424 09:24:15.789172 140196258953024 checkpointing.py:798] Waited 0.3118577003479004 seconds for step 10 to finish before starting checkpointing.
I0424 09:24:15.791851 140196258953024 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0424 09:24:16.937820 140066552411904 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 27.389929s.
I0424 09:24:16.938132 140066051565312 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0424 09:24:16.938246 140066051565312 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0424 09:24:16.938296 140066051565312 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0424 09:24:16.939941 140066051565312 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0424 09:24:16.940117 140196258953024 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0424 09:24:16.940255 140196258953024 checkpoint_manager.py:1452] Waiting for previous save to complete took 1.148402 seconds. If this number is high, consider checkpointing less frequently.
I0424 09:24:16.942089 140196258953024 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 10
I0424 09:24:16.944142 140196258953024 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints/10.
I0424 09:24:17.373162 140196258953024 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0424 09:24:17.373357 140196258953024 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0424 09:24:17.423757 140196258953024 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.051566s
I0424 09:24:17.423930 140196258953024 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 26.757 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.05765700340270996 s) (per-host)
I0424 09:24:17.423981 140196258953024 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.057726s (batch_requests_ready=0.003011s, total_serialization_initiated=0.054640s, others=0.000075s)
I0424 09:24:17.424085 140196258953024 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.061802s (all_items=0.000015s, per_item={'items': '0.00001478'}, temp_paths=0.061787)
I0424 09:24:17.424801 140196258953024 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.48 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints/10.
I0424 09:24:17.425150 140066051565312 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-24 09:44:17.425113
I0424 09:24:17.427245 140196258953024 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=10] Starting CheckpointManager Save Finalize thread=save_finalize
I0424 09:24:17.427530 140063908292352 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0424 09:24:17.427703 140196258953024 standard_logger.py:34] {'step': 10, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777022655.791821, 'wait_for_prev_duration_secs': 1.148402214050293, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1777022656.9421268, 'checkpointer_blocking_duration_secs': 0.483168363571167, 'get_old_steps_start_time': 1777022657.4253156, 'get_old_steps_duration_secs': 3.147125244140625e-05, 'checkpoint_manager_blocking_start_time': 1777022655.7900095, 'checkpoint_manager_blocking_duration_secs': 1.6376583576202393}
I0424 09:24:17.427811 140196258953024 checkpointing.py:409] Started an asynchronous checkpoint save for step 10
I0424 09:24:17.428472 140196258953024 metric_logger.py:196] completed step: 9, seconds: 0.158, TFLOP/s/device: 85.936, Tokens/s/device: 12953.171, total_weights: 65536, loss: 7.908, lm_loss: 7.908, perplexity: 2720.116
I0424 09:24:17.435773 140196258953024 metric_logger.py:196] completed step: 10, seconds: 0.159, TFLOP/s/device: 85.707, Tokens/s/device: 12918.690, total_weights: 65536, loss: 7.568, lm_loss: 7.568, perplexity: 1936.198
I0424 09:24:17.593841 140196258953024 metric_logger.py:196] completed step: 11, seconds: 1.956, TFLOP/s/device: 6.945, Tokens/s/device: 1046.900, total_weights: 65536, loss: 7.288, lm_loss: 7.288, perplexity: 1463.272
I0424 09:24:18.163043 140196258953024 metric_logger.py:196] completed step: 12, seconds: 0.006, TFLOP/s/device: 2147.823, Tokens/s/device: 323743.282, total_weights: 65536, loss: 7.032, lm_loss: 7.032, perplexity: 1132.061
I0424 09:24:18.321438 140196258953024 metric_logger.py:196] completed step: 13, seconds: 0.565, TFLOP/s/device: 24.047, Tokens/s/device: 3624.612, total_weights: 65536, loss: 6.815, lm_loss: 6.815, perplexity: 911.031
I0424 09:24:18.479888 140196258953024 metric_logger.py:196] completed step: 14, seconds: 0.163, TFLOP/s/device: 83.255, Tokens/s/device: 12549.173, total_weights: 65536, loss: 6.635, lm_loss: 6.635, perplexity: 761.080
I0424 09:24:18.638123 140196258953024 metric_logger.py:196] completed step: 15, seconds: 0.158, TFLOP/s/device: 85.827, Tokens/s/device: 12936.807, total_weights: 65536, loss: 6.492, lm_loss: 6.492, perplexity: 659.915
I0424 09:24:18.796403 140196258953024 metric_logger.py:196] completed step: 16, seconds: 0.158, TFLOP/s/device: 85.764, Tokens/s/device: 12927.253, total_weights: 65536, loss: 6.380, lm_loss: 6.380, perplexity: 589.948
I0424 09:24:18.954636 140196258953024 metric_logger.py:196] completed step: 17, seconds: 0.159, TFLOP/s/device: 85.561, Tokens/s/device: 12896.725, total_weights: 65536, loss: 6.293, lm_loss: 6.293, perplexity: 540.941
I0424 09:24:19.113079 140196258953024 metric_logger.py:196] completed step: 18, seconds: 0.158, TFLOP/s/device: 86.218, Tokens/s/device: 12995.748, total_weights: 65536, loss: 6.223, lm_loss: 6.223, perplexity: 504.108
I0424 09:24:19.270721 140196258953024 checkpointing.py:794] Waiting for step 19 to finish before checkpoint...
I0424 09:24:19.271690 140196258953024 checkpointing.py:798] Waited 0.0009891986846923828 seconds for step 19 to finish before starting checkpointing.
I0424 09:24:19.274002 140196258953024 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=10][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0424 09:24:23.534260 140062797149952 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints/10/items/array_metadatas/process_5
I0424 09:25:00.560889 140066051565312 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 36.573 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 43.19458556175232 s) (per-host)
I0424 09:25:00.561025 140066051565312 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 43.135759s.
I0424 09:25:07.008139 140066051565312 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 49.582859s.
I0424 09:25:07.008398 140063908292352 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0424 09:25:07.008465 140063908292352 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0424 09:25:07.008535 140063908292352 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=10] CheckpointManager Save Finalize is syncing with other hosts...
I0424 09:25:07.010745 140063908292352 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=10] CheckpointManager Save Finalize is done on all hosts.
I0424 09:25:07.010946 140196258953024 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=10][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=10.
W0424 09:25:07.011092 140196258953024 checkpoint_manager.py:1452] Waiting for previous save to complete took 47.737100 seconds. If this number is high, consider checkpointing less frequently.
I0424 09:25:07.012726 140196258953024 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 19
I0424 09:25:07.014789 140196258953024 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints/19.
I0424 09:25:07.396072 140196258953024 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0424 09:25:07.396164 140196258953024 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0424 09:25:07.445669 140196258953024 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.050565s
I0424 09:25:07.445829 140196258953024 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 27.262 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.056589603424072266 s) (per-host)
I0424 09:25:07.445878 140196258953024 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.056649s (batch_requests_ready=0.002958s, total_serialization_initiated=0.053627s, others=0.000065s)
I0424 09:25:07.445985 140196258953024 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.060974s (all_items=0.000010s, per_item={'items': '0.00001049'}, temp_paths=0.060963)
I0424 09:25:07.446689 140196258953024 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.43 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints/19.
I0424 09:25:07.447016 140063908292352 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-24 09:45:07.446978
I0424 09:25:07.449012 140196258953024 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=19] Starting CheckpointManager Save Finalize thread=save_finalize
I0424 09:25:07.449266 140062797149952 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0424 09:25:07.449401 140196258953024 standard_logger.py:34] {'step': 19, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1777022659.2739675, 'wait_for_prev_duration_secs': 47.73709988594055, 'time_between_consecutive_saves_sec': 2.3339877128601074, 'checkpointer_blocking_start_time': 1777022707.0127673, 'checkpointer_blocking_duration_secs': 0.43439221382141113, 'get_old_steps_start_time': 1777022707.4471781, 'get_old_steps_duration_secs': 2.574920654296875e-05, 'checkpoint_manager_blocking_start_time': 1777022659.2719598, 'checkpoint_manager_blocking_duration_secs': 48.17741012573242}
I0424 09:25:07.449560 140196258953024 checkpointing.py:409] Started an asynchronous checkpoint save for step 19
I0424 09:25:07.449603 140196258953024 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=19][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0424 09:25:12.577094 140063363028736 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312/linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091312_05_fp8/checkpoints/19/items/array_metadatas/process_5
I0424 09:25:49.388643 140063908292352 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 37.614 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 41.99935579299927 s) (per-host)
I0424 09:25:49.388767 140063908292352 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 41.941639s.
I0424 09:25:57.324144 140063908292352 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 49.877002s.
I0424 09:25:57.324453 140062797149952 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0424 09:25:57.324573 140062797149952 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0424 09:25:57.324615 140062797149952 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=19] CheckpointManager Save Finalize is syncing with other hosts...
I0424 09:25:57.326391 140062797149952 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=19] CheckpointManager Save Finalize is done on all hosts.
I0424 09:25:57.326559 140196258953024 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=19][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=19.
I0424 09:25:57.326727 140196258953024 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0424 09:25:57.327614 140196258953024 metric_logger.py:196] completed step: 19, seconds: 0.158, TFLOP/s/device: 85.801, Tokens/s/device: 12932.804, total_weights: 65536, loss: 6.167, lm_loss: 6.167, perplexity: 476.947
Per train step:
 Total TFLOPs: 13.59
 split as 93.93% learnable weight flops and 6.07% attention flops
XPK End: Fri Apr 24 09:26:06 UTC 2026
EXIT_CODE=0
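The `Logical:`/`Physical:` pairs in the parameter listing above come from a logical-to-mesh axis-rule table: each logical axis name either maps to a physical mesh axis (here only the embedding-like axes shard over `'fsdp'`) or stays replicated (`None`). A minimal pure-Python sketch of that lookup — the rule table below is inferred from this log's output, not taken from MaxText's actual sharding config:

```python
# Illustrative logical->physical axis mapping, mirroring log lines such as
#   Logical:   P('embed', 'layers', 'mlp')
#   Physical:  ('fsdp', None, None)
# RULES is an assumption reconstructed from this log, not MaxText's config.
RULES = {
    "embed": "fsdp",        # activations/weights shard their embed dim over fsdp
    "embed_vocab": "fsdp",  # logits_dense / token_embedder embed dim
    # all other logical axes ('layers', 'mlp', 'norm', 'vocab', 'heads',
    # 'kv', 'kv_heads', 'kv_head_dim', 'q_heads') are absent -> replicated
}

def logical_to_physical(logical_axes):
    """Map a tuple of logical axis names to physical mesh axes (None = replicated)."""
    return tuple(RULES.get(name) for name in logical_axes)

print(logical_to_physical(("embed", "layers", "mlp")))       # ('fsdp', None, None)
print(logical_to_physical(("vocab", "embed_vocab")))         # (None, 'fsdp')
```

In real JAX/Flax code this lookup is done by `flax.linen.logical_to_mesh_axes` against the configured rule list, producing the `PartitionSpec` that is then applied to the device mesh.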