MaxView

Case: 05_fp8

Metrics: main (8a17c3d19) vs feat/nnx-post-train-fixes (7f06c99ac)

Metric        main (8a17c3d19)    feat/nnx-post-train-fixes (7f06c99ac)    Diff (feat/nnx-post-train-fixes − main)
Parameters    1.105 billion       1.105 billion
Final loss    6.1670              6.1670                                    0
TFLOP/s       85.783              85.862                                    +0.079
Tok/s         12930.1             12942.0                                   +11.93
Avg s/step    1.679               1.638                                     -0.041
Memory %      1.44                1.44                                      0
JAX           0.9.2               0.9.2

Diff = branch value − main value. Higher throughput (TFLOP/s, Tok/s) and lower step time mean the branch improved; the reverse means it regressed.
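
For reference, a minimal sketch of how the Diff column is derived (values copied from the table above; the table's +11.93 Tok/s diff comes from unrounded per-step numbers, so recomputing from the rounded figures gives +11.9):

```python
# Illustrative only: recompute Diff = branch value − main value from the table above.
main_metrics = {"Final loss": 6.1670, "TFLOP/s": 85.783, "Tok/s": 12930.1, "Avg s/step": 1.679}
branch_metrics = {"Final loss": 6.1670, "TFLOP/s": 85.862, "Tok/s": 12942.0, "Avg s/step": 1.638}

for name, main_value in main_metrics.items():
    diff = branch_metrics[name] - main_value
    print(f"{name}: {diff:+.3f}")  # e.g. TFLOP/s: +0.079, Avg s/step: -0.041
```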

main  ·  8a17c3d19  ·  main_20260422_071422  ·  full log
XPK Start: Wed Apr 22 07:26:18 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
`rope_parameters`'s factor field must be a float >= 1, got 40
`rope_parameters`'s beta_fast field must be a float, got 32
`rope_parameters`'s beta_slow field must be a float, got 1
DeepseekV32Config got `key=rope_scaling` in kwargs but hasn't set it as attribute. For RoPE standardization you need to set `self.rope_parameters` in model's config. 
2026-04-22 07:26:43.851759: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0422 07:26:44.063495 135333372073792 max_utils.py:273] Attempting to initialize the jax distributed system...
I0422 07:26:53.104436 135333372073792 distributed.py:149] Starting JAX distributed service on [::]:8482
I0422 07:26:53.106833 135333372073792 distributed.py:172] Connecting to JAX distributed service on mt-05-fp8-68gsk-slice-job-0-0.mt-05-fp8-68gsk:8482
I0422 07:26:55.272066 135333372073792 max_utils.py:284] Jax distributed system initialized!
I0422 07:27:01.547978 135333372073792 max_utils.py:800] System Information: Jax Version: 0.9.2
I0422 07:27:01.548087 135333372073792 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0422 07:27:01.548127 135333372073792 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0422 07:27:01.548163 135333372073792 train_utils.py:361] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0422 07:27:02.240637 135333372073792 maxtext_utils.py:1604] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0422 07:27:02.240922 135333372073792 checkpointing.py:677] Setting up checkpoint logger...
I0422 07:27:02.240976 135333372073792 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0422 07:27:02.241021 135333372073792 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0422 07:27:02.241388 135333372073792 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7b1509d50620>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0422 07:27:05.239210 135333372073792 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0422 07:27:05.239479 135333372073792 checkpoint_manager.py:708] [process=6][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7b003437aa50>}, handler_registry=None
I0422 07:27:05.239731 135333372073792 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7b003437aa50>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0422 07:27:05.239784 135333372073792 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7b0108284f80>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0422 07:27:05.239820 135333372073792 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7b003437aa50>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7b003437aa50>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7b0108284f80>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7b0108284f80>}).
I0422 07:27:05.240174 135333372073792 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0422 07:27:05.240271 135333372073792 async_checkpointer.py:192] [process=6][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7b000858d6c0> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0422 07:27:06.368745 135333372073792 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints
I0422 07:27:07.222933 135333372073792 checkpoint_manager.py:929] [process=6][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7b0108283050>
I0422 07:27:07.223107 135333372073792 checkpointing.py:301] Checkpoint manager created!
I0422 07:27:08.304450 135333372073792 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0422 07:27:08.304571 135333372073792 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 07:27:08.697253 135333372073792 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 07:27:08.697368 135333372073792 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 07:27:08.713508 135333372073792 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 07:27:08.713570 135333372073792 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 07:27:08.765725 135333372073792 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 07:27:08.765810 135333372073792 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 07:27:08.782114 135333372073792 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 07:27:08.782179 135333372073792 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 07:27:08.798487 135333372073792 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 07:27:08.798551 135333372073792 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 07:27:08.823938 135333372073792 attentions.py:1198] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0422 07:27:08.824009 135333372073792 attentions.py:1198] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 07:27:08.875435 135333372073792 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0422 07:27:08.875514 135333372073792 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0422 07:27:09.312471 135333372073792 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0422 07:27:09.312587 135333372073792 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0422 07:27:11.251715 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.251842 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.251891 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.251931 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.251965 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.251998 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252030 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252062 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252092 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252130 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252161 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252191 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252223 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252254 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252303 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252342 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252376 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252406 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252435 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252463 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252491 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252518 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252545 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252573 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252602 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252629 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252656 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252703 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252734 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252761 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252788 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252816 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252843 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252869 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252896 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252922 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252948 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.252974 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.253001 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.253028 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.253055 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.253083 135333372073792 maxtext_utils.py:1707]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 07:27:11.253140 135333372073792 maxtext_utils.py:1707]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 07:27:11.253196 135333372073792 maxtext_utils.py:1707]  params/params/decoder/layers/mlp/wi_0/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0422 07:27:11.253235 135333372073792 maxtext_utils.py:1707]  params/params/decoder/layers/mlp/wi_1/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0422 07:27:11.253293 135333372073792 maxtext_utils.py:1707]  params/params/decoder/layers/mlp/wo/kernel
    Shape:     float32[7168,16,2048]
    Logical:   P('mlp', 'layers', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 07:27:11.253343 135333372073792 maxtext_utils.py:1707]  params/params/decoder/layers/post_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0422 07:27:11.253378 135333372073792 maxtext_utils.py:1707]  params/params/decoder/layers/pre_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0422 07:27:11.253426 135333372073792 maxtext_utils.py:1707]  params/params/decoder/layers/self_attention/key/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0422 07:27:11.253473 135333372073792 maxtext_utils.py:1707]  params/params/decoder/layers/self_attention/out/kernel
    Shape:     float32[16,16,128,2048]
    Logical:   P('heads', 'layers', 'kv', 'embed')
    Physical:  (None, None, None, 'fsdp')
I0422 07:27:11.253508 135333372073792 maxtext_utils.py:1707]  params/params/decoder/layers/self_attention/query/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None, None)
I0422 07:27:11.253541 135333372073792 maxtext_utils.py:1707]  params/params/decoder/layers/self_attention/value/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0422 07:27:11.253585 135333372073792 maxtext_utils.py:1707]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0422 07:27:11.253628 135333372073792 maxtext_utils.py:1707]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')

I0422 07:27:12.892688 135333372073792 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 07:27:12.892782 135333372073792 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0422 07:27:12.908140 135333372073792 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 07:27:12.908202 135333372073792 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0422 07:27:26.961371 135333372073792 max_utils.py:791] Total memory size: 1.5 GB, Output size: 0.4 GB, Temp size: 1.1 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0422 07:27:26.962322 135333372073792 metric_logger.py:301] number parameters: 1.105 billion
I0422 07:27:42.335369 135333372073792 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0422 07:27:42.502927 135333372073792 checkpointing.py:776] Waited 0.16753888130187988 seconds for step 0 to finish before starting checkpointing.
I0422 07:27:42.505397 135333372073792 checkpoint_manager.py:2009] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0422 07:27:42.507488 135333372073792 checkpoint_manager.py:1512] [process=6] Saving checkpoint at step 0
I0422 07:27:42.508924 135333372073792 event_tracking.py:70] [process=6] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints/0.
I0422 07:27:43.268095 135333372073792 signaling_client.py:364] Using JaxDistributedSignalingClient
I0422 07:27:43.269100 135333372073792 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0422 07:27:43.269168 135333372073792 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 07:27:43.695445 135333372073792 base_pytree_checkpoint_handler.py:154] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.427470s
I0422 07:27:43.695613 135333372073792 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/blocking_gbytes_per_sec: 3.538 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.4360814094543457 s) (per-host)
I0422 07:27:43.695667 135333372073792 base_pytree_checkpoint_handler.py:768] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.436145s (batch_requests_ready=0.003845s, total_serialization_initiated=0.432228s, others=0.000071s)
I0422 07:27:43.695761 135333372073792 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.440322s (all_items=0.000018s, per_item={'items': '0.00001764'}, temp_paths=0.440304)
I0422 07:27:43.696556 135333372073792 event_tracking.py:125] [process=6] [async] Finished blocking save in 1.19 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints/0.
I0422 07:27:43.696902 135204415121152 async_checkpointer.py:76] [process=6][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 07:47:43.696863
I0422 07:27:43.698942 135333372073792 checkpoint_manager.py:1560] [process=6][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 07:27:43.699244 135204386715392 async_checkpointer.py:280] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0422 07:27:43.699424 135333372073792 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776842862.50538, 'wait_for_prev_duration_secs': 6.0558319091796875e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776842862.5075285, 'checkpointer_blocking_duration_secs': 1.18953537940979, 'get_old_steps_start_time': 1776842863.6970887, 'get_old_steps_duration_secs': 3.218650817871094e-05, 'checkpoint_manager_blocking_start_time': 1776842862.5035858, 'checkpoint_manager_blocking_duration_secs': 1.1957981586456299}
I0422 07:27:43.699532 135333372073792 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0422 07:27:43.699585 135333372073792 max_utils.py:750] 
Memstats: After params initialized:
I0422 07:27:43.699634 135333372073792 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_24(process=6,(0,6,0,0))
I0422 07:27:43.699670 135333372073792 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_25(process=6,(1,6,0,0))
I0422 07:27:43.699698 135333372073792 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_28(process=6,(0,7,0,0))
I0422 07:27:43.699724 135333372073792 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_29(process=6,(1,7,0,0))
I0422 07:27:44.018577 135333372073792 metric_logger.py:196] completed step: 0, seconds: 15.373, TFLOP/s/device: 0.884, Tokens/s/device: 133.221, total_weights: 65536, loss: 10.874, lm_loss: 10.874, perplexity: 52776.805
I0422 07:27:44.212087 135333372073792 metric_logger.py:196] completed step: 1, seconds: 1.681, TFLOP/s/device: 8.082, Tokens/s/device: 1218.225, total_weights: 65536, loss: 10.874, lm_loss: 10.874, perplexity: 52761.840
I0422 07:27:44.645320 135333372073792 metric_logger.py:196] completed step: 2, seconds: 0.036, TFLOP/s/device: 377.263, Tokens/s/device: 56865.195, total_weights: 65536, loss: 10.816, lm_loss: 10.816, perplexity: 49832.398
I0422 07:27:44.803481 135333372073792 metric_logger.py:196] completed step: 3, seconds: 0.433, TFLOP/s/device: 31.349, Tokens/s/device: 4725.329, total_weights: 65536, loss: 10.431, lm_loss: 10.431, perplexity: 33901.820
I0422 07:27:45.121147 135333372073792 metric_logger.py:196] completed step: 4, seconds: 0.164, TFLOP/s/device: 82.728, Tokens/s/device: 12469.633, total_weights: 65536, loss: 9.992, lm_loss: 9.992, perplexity: 21847.639
I0422 07:27:45.127038 135333372073792 metric_logger.py:196] completed step: 5, seconds: 0.158, TFLOP/s/device: 85.994, Tokens/s/device: 12962.025, total_weights: 65536, loss: 9.549, lm_loss: 9.549, perplexity: 14035.006
I0422 07:27:46.602230    2798 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0422 07:27:50.053838 135204395108096 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints/0/items/array_metadatas/process_6
I0422 07:28:03.224945 135204415121152 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/gbytes_per_sec: 79.125 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 19.965373039245605 s) (per-host)
I0422 07:28:03.225085 135204415121152 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 19.528069s.
I0422 07:28:09.831923 135333372073792 metric_logger.py:196] completed step: 6, seconds: 0.318, TFLOP/s/device: 42.771, Tokens/s/device: 6446.982, total_weights: 65536, loss: 9.111, lm_loss: 9.111, perplexity: 9058.688
I0422 07:28:09.990351 135333372073792 metric_logger.py:196] completed step: 7, seconds: 24.548, TFLOP/s/device: 0.553, Tokens/s/device: 83.429, total_weights: 65536, loss: 8.685, lm_loss: 8.685, perplexity: 5914.720
I0422 07:28:10.148869 135333372073792 metric_logger.py:196] completed step: 8, seconds: 0.163, TFLOP/s/device: 83.153, Tokens/s/device: 12533.660, total_weights: 65536, loss: 8.281, lm_loss: 8.281, perplexity: 3950.010
I0422 07:28:10.155040 135333372073792 checkpointing.py:772] Waiting for step 10 to finish before checkpoint...
I0422 07:28:10.465562 135333372073792 checkpointing.py:776] Waited 0.310497522354126 seconds for step 10 to finish before starting checkpointing.
I0422 07:28:10.468173 135333372073792 checkpoint_manager.py:2020] [process=6][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 07:28:11.938017 135204415121152 async_checkpointer.py:160] [process=6][thread=async_save] Background save thread done. Time taken: 28.240983s.
I0422 07:28:11.938325 135204386715392 async_checkpointer.py:288] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 07:28:11.938446 135204386715392 async_checkpointer.py:298] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0422 07:28:11.938498 135204386715392 checkpoint_manager.py:2137] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0422 07:28:11.940122 135204386715392 checkpoint_manager.py:2146] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0422 07:28:11.940251 135333372073792 checkpoint_manager.py:2032] [process=6][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0422 07:28:11.940355 135333372073792 checkpoint_manager.py:1452] Waiting for previous save to complete took 1.472185 seconds. If this number is high, consider checkpointing less frequently.
I0422 07:28:11.942583 135333372073792 checkpoint_manager.py:1512] [process=6] Saving checkpoint at step 10
I0422 07:28:11.944607 135333372073792 event_tracking.py:70] [process=6] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints/10.
I0422 07:28:12.643826 135333372073792 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0422 07:28:12.643918 135333372073792 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 07:28:12.704690 135333372073792 base_pytree_checkpoint_handler.py:154] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.061921s
I0422 07:28:12.704854 135333372073792 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/blocking_gbytes_per_sec: 22.579 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.06832718849182129 s) (per-host)
I0422 07:28:12.704901 135333372073792 base_pytree_checkpoint_handler.py:768] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.068384s (batch_requests_ready=0.003321s, total_serialization_initiated=0.064999s, others=0.000064s)
I0422 07:28:12.704998 135333372073792 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.072731s (all_items=0.000015s, per_item={'items': '0.00001526'}, temp_paths=0.072716)
I0422 07:28:12.705651 135333372073792 event_tracking.py:125] [process=6] [async] Finished blocking save in 0.76 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints/10.
I0422 07:28:12.706000 135204386715392 async_checkpointer.py:76] [process=6][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 07:48:12.705962
I0422 07:28:12.708037 135333372073792 checkpoint_manager.py:1560] [process=6][thread=MainThread][step=10] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 07:28:12.708356 135201200006912 async_checkpointer.py:280] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0422 07:28:12.708498 135333372073792 standard_logger.py:34] {'step': 10, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776842890.4681423, 'wait_for_prev_duration_secs': 1.4721853733062744, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776842891.942622, 'checkpointer_blocking_duration_secs': 0.7635326385498047, 'get_old_steps_start_time': 1776842892.706178, 'get_old_steps_duration_secs': 3.1948089599609375e-05, 'checkpoint_manager_blocking_start_time': 1776842890.4664054, 'checkpoint_manager_blocking_duration_secs': 2.242058277130127}
I0422 07:28:12.708675 135333372073792 checkpointing.py:408] Started an asynchronous checkpoint save for step 10
I0422 07:28:12.709346 135333372073792 metric_logger.py:196] completed step: 9, seconds: 0.158, TFLOP/s/device: 85.854, Tokens/s/device: 12940.894, total_weights: 65536, loss: 7.908, lm_loss: 7.908, perplexity: 2720.116
I0422 07:28:12.716763 135333372073792 metric_logger.py:196] completed step: 10, seconds: 0.160, TFLOP/s/device: 85.161, Tokens/s/device: 12836.423, total_weights: 65536, loss: 7.568, lm_loss: 7.568, perplexity: 1936.198
I0422 07:28:12.874706 135333372073792 metric_logger.py:196] completed step: 11, seconds: 2.559, TFLOP/s/device: 5.309, Tokens/s/device: 800.195, total_weights: 65536, loss: 7.288, lm_loss: 7.288, perplexity: 1463.272
I0422 07:28:13.440210 135333372073792 metric_logger.py:196] completed step: 12, seconds: 0.007, TFLOP/s/device: 2063.972, Tokens/s/device: 311104.360, total_weights: 65536, loss: 7.032, lm_loss: 7.032, perplexity: 1132.061
I0422 07:28:13.598575 135333372073792 metric_logger.py:196] completed step: 13, seconds: 0.561, TFLOP/s/device: 24.216, Tokens/s/device: 3650.084, total_weights: 65536, loss: 6.815, lm_loss: 6.815, perplexity: 911.031
I0422 07:28:13.756779 135333372073792 metric_logger.py:196] completed step: 14, seconds: 0.163, TFLOP/s/device: 83.238, Tokens/s/device: 12546.559, total_weights: 65536, loss: 6.635, lm_loss: 6.635, perplexity: 761.080
I0422 07:28:13.915349 135333372073792 metric_logger.py:196] completed step: 15, seconds: 0.158, TFLOP/s/device: 85.813, Tokens/s/device: 12934.682, total_weights: 65536, loss: 6.492, lm_loss: 6.492, perplexity: 659.915
I0422 07:28:14.073688 135333372073792 metric_logger.py:196] completed step: 16, seconds: 0.158, TFLOP/s/device: 85.888, Tokens/s/device: 12945.966, total_weights: 65536, loss: 6.380, lm_loss: 6.380, perplexity: 589.948
I0422 07:28:14.231964 135333372073792 metric_logger.py:196] completed step: 17, seconds: 0.159, TFLOP/s/device: 85.670, Tokens/s/device: 12913.070, total_weights: 65536, loss: 6.293, lm_loss: 6.293, perplexity: 540.941
I0422 07:28:14.390337 135333372073792 metric_logger.py:196] completed step: 18, seconds: 0.158, TFLOP/s/device: 85.869, Tokens/s/device: 12943.102, total_weights: 65536, loss: 6.223, lm_loss: 6.223, perplexity: 504.108
I0422 07:28:14.547967 135333372073792 checkpointing.py:772] Waiting for step 19 to finish before checkpoint...
I0422 07:28:14.548979 135333372073792 checkpointing.py:776] Waited 0.0010323524475097656 seconds for step 19 to finish before starting checkpointing.
I0422 07:28:14.551079 135333372073792 checkpoint_manager.py:2020] [process=6][thread=MainThread][step=10][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 07:28:18.239221 135201174828800 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints/10/items/array_metadatas/process_6
I0422 07:28:54.934067 135204386715392 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/gbytes_per_sec: 37.349 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.2974808216095 s) (per-host)
I0422 07:28:54.934198 135204386715392 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.228076s.
I0422 07:29:02.679600 135204386715392 async_checkpointer.py:160] [process=6][thread=async_save] Background save thread done. Time taken: 49.973461s.
I0422 07:29:02.679893 135201200006912 async_checkpointer.py:288] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 07:29:02.680017 135201200006912 async_checkpointer.py:298] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0422 07:29:02.680068 135201200006912 checkpoint_manager.py:2137] [process=6][thread=save_finalize][step=10] CheckpointManager Save Finalize is syncing with other hosts...
I0422 07:29:02.681487 135201200006912 checkpoint_manager.py:2146] [process=6][thread=save_finalize][step=10] CheckpointManager Save Finalize is done on all hosts.
I0422 07:29:02.681666 135333372073792 checkpoint_manager.py:2032] [process=6][thread=MainThread][step=10][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=10.
W0422 07:29:02.681789 135333372073792 checkpoint_manager.py:1452] Waiting for previous save to complete took 48.130715 seconds. If this number is high, consider checkpointing less frequently.
I0422 07:29:02.684276 135333372073792 checkpoint_manager.py:1512] [process=6] Saving checkpoint at step 19
I0422 07:29:02.686276 135333372073792 event_tracking.py:70] [process=6] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints/19.
I0422 07:29:03.430472 135333372073792 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0422 07:29:03.430567 135333372073792 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 07:29:03.490147 135333372073792 base_pytree_checkpoint_handler.py:154] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.060668s
I0422 07:29:03.490324 135333372073792 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/blocking_gbytes_per_sec: 22.946 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.06723475456237793 s) (per-host)
I0422 07:29:03.490372 135333372073792 base_pytree_checkpoint_handler.py:768] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.067294s (batch_requests_ready=0.003303s, total_serialization_initiated=0.063927s, others=0.000064s)
I0422 07:29:03.490470 135333372073792 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.071550s (all_items=0.000012s, per_item={'items': '0.00001168'}, temp_paths=0.071539)
I0422 07:29:03.491123 135333372073792 event_tracking.py:125] [process=6] [async] Finished blocking save in 0.81 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints/19.
I0422 07:29:03.491473 135201200006912 async_checkpointer.py:76] [process=6][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 07:49:03.491433
I0422 07:29:03.493467 135333372073792 checkpoint_manager.py:1560] [process=6][thread=MainThread][step=19] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 07:29:03.493692 135201174828800 async_checkpointer.py:280] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0422 07:29:03.493792 135333372073792 standard_logger.py:34] {'step': 19, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776842894.5510504, 'wait_for_prev_duration_secs': 48.13071537017822, 'time_between_consecutive_saves_sec': 2.610888957977295, 'checkpointer_blocking_start_time': 1776842942.6843297, 'checkpointer_blocking_duration_secs': 0.8072867393493652, 'get_old_steps_start_time': 1776842943.4916358, 'get_old_steps_duration_secs': 2.6226043701171875e-05, 'checkpoint_manager_blocking_start_time': 1776842894.5492687, 'checkpoint_manager_blocking_duration_secs': 48.94449162483215}
I0422 07:29:03.493960 135333372073792 checkpointing.py:408] Started an asynchronous checkpoint save for step 19
I0422 07:29:03.494005 135333372073792 checkpoint_manager.py:2020] [process=6][thread=MainThread][step=19][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 07:29:08.636494 135201703307008 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260422_071422/linen_xpk_main_20260422_071422_05_fp8/checkpoints/19/items/array_metadatas/process_6
I0422 07:29:45.772080 135201200006912 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/gbytes_per_sec: 37.303 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.348944425582886 s) (per-host)
I0422 07:29:45.772202 135201200006912 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.280615s.
I0422 07:29:52.540896 135201200006912 async_checkpointer.py:160] [process=6][thread=async_save] Background save thread done. Time taken: 49.049291s.
I0422 07:29:52.541145 135201174828800 async_checkpointer.py:288] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 07:29:52.541203 135201174828800 async_checkpointer.py:298] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0422 07:29:52.541252 135201174828800 checkpoint_manager.py:2137] [process=6][thread=save_finalize][step=19] CheckpointManager Save Finalize is syncing with other hosts...
I0422 07:29:52.542946 135201174828800 checkpoint_manager.py:2146] [process=6][thread=save_finalize][step=19] CheckpointManager Save Finalize is done on all hosts.
I0422 07:29:52.543098 135333372073792 checkpoint_manager.py:2032] [process=6][thread=MainThread][step=19][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=19.
I0422 07:29:52.543248 135333372073792 checkpoint_manager.py:2009] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0422 07:29:52.544183 135333372073792 metric_logger.py:196] completed step: 19, seconds: 0.158, TFLOP/s/device: 85.783, Tokens/s/device: 12930.109, total_weights: 65536, loss: 6.167, lm_loss: 6.167, perplexity: 476.947
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
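
As a rough cross-check (a sketch, assuming the 13.59 TFLOPs figure above is per device), the per-device throughput reported in the step logs is approximately this per-step FLOP count divided by the steady-state step time:

```python
# Illustrative consistency check of the reported TFLOP/s/device.
tflops_per_step = 13.59   # "Total TFLOPs" per train step, from the summary above
step_seconds = 0.158      # typical steady-state step time from the step logs
print(tflops_per_step / step_seconds)  # ~86.0, in line with the reported ~85.8 TFLOP/s/device
```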
XPK End: Wed Apr 22 07:30:03 UTC 2026
EXIT_CODE=0
XPK Start: Wed Apr 22 12:53:30 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
2026-04-22 12:53:54.538789: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0422 12:53:55.078714 137987280103232 max_utils.py:273] Attempting to initialize the jax distributed system...
I0422 12:54:04.120253 137987280103232 distributed.py:149] Starting JAX distributed service on [::]:8482
I0422 12:54:04.122510 137987280103232 distributed.py:172] Connecting to JAX distributed service on mt-05-fp8-7xe7t-slice-job-0-0.mt-05-fp8-7xe7t:8482
I0422 12:54:05.987696 137987280103232 max_utils.py:284] Jax distributed system initialized!
I0422 12:54:12.033770 137987280103232 max_utils.py:800] System Information: Jax Version: 0.9.2
I0422 12:54:12.033871 137987280103232 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0422 12:54:12.033911 137987280103232 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0422 12:54:12.033945 137987280103232 train_utils.py:378] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0422 12:54:12.769585 137987280103232 maxtext_utils.py:1718] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0422 12:54:12.770182 137987280103232 maxtext_utils.py:1718] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0422 12:54:12.770370 137987280103232 checkpointing.py:688] Setting up checkpoint logger...
I0422 12:54:12.770421 137987280103232 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0422 12:54:12.770466 137987280103232 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0422 12:54:12.770799 137987280103232 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7d7f0f51b110>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0422 12:54:15.632780 137987280103232 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0422 12:54:15.633018 137987280103232 checkpoint_manager.py:708] [process=6][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d6ad4507e60>}, handler_registry=None
I0422 12:54:15.633267 137987280103232 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d6ad4507e60>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0422 12:54:15.633317 137987280103232 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d6ad450caa0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0422 12:54:15.633353 137987280103232 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d6ad4507e60>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7d6ad4507e60>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d6ad450caa0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7d6ad450caa0>}).
I0422 12:54:15.633671 137987280103232 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0422 12:54:15.633740 137987280103232 async_checkpointer.py:192] [process=6][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7d6ad42c0180> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0422 12:54:17.080787 137987280103232 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints
I0422 12:54:17.121233 137987280103232 checkpoint_manager.py:929] [process=6][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7d6ad44afe30>
I0422 12:54:17.121375 137987280103232 checkpointing.py:302] Checkpoint manager created!
I0422 12:54:18.219890 137987280103232 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0422 12:54:18.219999 137987280103232 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 12:54:18.604385 137987280103232 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 12:54:18.604479 137987280103232 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 12:54:18.620865 137987280103232 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0422 12:54:18.620924 137987280103232 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0422 12:54:18.673624 137987280103232 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 12:54:18.673712 137987280103232 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 12:54:18.690214 137987280103232 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 12:54:18.690276 137987280103232 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 12:54:18.706566 137987280103232 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0422 12:54:18.706624 137987280103232 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 12:54:18.731341 137987280103232 attentions.py:1197] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0422 12:54:18.731410 137987280103232 attentions.py:1197] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0422 12:54:18.783394 137987280103232 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0422 12:54:18.783472 137987280103232 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0422 12:54:19.235069 137987280103232 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0422 12:54:19.235207 137987280103232 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0422 12:54:21.168351 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168471 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168518 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168558 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168595 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168629 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_0/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168660 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168693 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168724 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168754 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168784 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168814 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wi_1/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168843 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168871 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168899 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168929 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168957 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.168985 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/mlp/wo/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169012 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169039 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169066 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169119 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169152 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169181 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/key/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169209 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169236 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169269 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169296 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169324 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169351 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/out/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169378 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169405 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169433 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169461 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169487 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169514 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/query/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169540 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169566 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/input_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169607 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169658 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/kernel_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169724 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_amax_history
    Shape:     float32[16,1024]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169755 137987280103232 maxtext_utils.py:1827]  params/_overwrite_with_gradient/decoder/layers/self_attention/value/Fp8DirectDotGeneralOp_0/output_grad_scale
    Shape:     float32[16,1]
    Logical:   P()
    Physical:  ()
I0422 12:54:21.169809 137987280103232 maxtext_utils.py:1827]  params/params/decoder/decoder_norm/scale
    Shape:     float32[2048]
    Logical:   P('norm',)
    Physical:  (None,)
I0422 12:54:21.169865 137987280103232 maxtext_utils.py:1827]  params/params/decoder/layers/mlp/wi_0/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0422 12:54:21.169902 137987280103232 maxtext_utils.py:1827]  params/params/decoder/layers/mlp/wi_1/kernel
    Shape:     float32[2048,16,7168]
    Logical:   P('embed', 'layers', 'mlp')
    Physical:  ('fsdp', None, None)
I0422 12:54:21.169950 137987280103232 maxtext_utils.py:1827]  params/params/decoder/layers/mlp/wo/kernel
    Shape:     float32[7168,16,2048]
    Logical:   P('mlp', 'layers', 'embed')
    Physical:  (None, None, 'fsdp')
I0422 12:54:21.169998 137987280103232 maxtext_utils.py:1827]  params/params/decoder/layers/post_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0422 12:54:21.170032 137987280103232 maxtext_utils.py:1827]  params/params/decoder/layers/pre_self_attention_layer_norm/scale
    Shape:     float32[2048,16]
    Logical:   P('norm', 'layers')
    Physical:  (None, None)
I0422 12:54:21.170079 137987280103232 maxtext_utils.py:1827]  params/params/decoder/layers/self_attention/key/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0422 12:54:21.170151 137987280103232 maxtext_utils.py:1827]  params/params/decoder/layers/self_attention/out/kernel
    Shape:     float32[16,16,128,2048]
    Logical:   P('heads', 'layers', 'kv', 'embed')
    Physical:  (None, None, None, 'fsdp')
I0422 12:54:21.170191 137987280103232 maxtext_utils.py:1827]  params/params/decoder/layers/self_attention/query/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'q_heads', 'kv')
    Physical:  ('fsdp', None, None, None)
I0422 12:54:21.170224 137987280103232 maxtext_utils.py:1827]  params/params/decoder/layers/self_attention/value/kernel
    Shape:     float32[2048,16,16,128]
    Logical:   P('embed', 'layers', 'kv_heads', 'kv_head_dim')
    Physical:  ('fsdp', None, None, None)
I0422 12:54:21.170274 137987280103232 maxtext_utils.py:1827]  params/params/decoder/logits_dense/kernel
    Shape:     float32[2048,32000]
    Logical:   P('embed_vocab', 'vocab')
    Physical:  ('fsdp', None)
I0422 12:54:21.170319 137987280103232 maxtext_utils.py:1827]  params/params/token_embedder/embedding
    Shape:     float32[32000,2048]
    Logical:   P('vocab', 'embed_vocab')
    Physical:  (None, 'fsdp')
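Each `Fp8DirectDotGeneralOp_0` entry above carries an amax history and a scale for the input, the kernel, and the output gradient, shaped `[16, 1024]` and `[16, 1]` (16 scanned layers, and what looks like a 1024-step history). They sit under `_overwrite_with_gradient`, i.e. outside the normal optimizer update. This is the usual FP8 "delayed scaling" recipe; the sketch below illustrates the idea only and is not the actual `Fp8DirectDotGeneralOp` implementation.

```python
# Hedged illustration of FP8 delayed scaling: the scale used to quantize a
# tensor this step comes from the running history of past absolute maxima.
import jax.numpy as jnp

E4M3_MAX = 448.0  # largest finite float8_e4m3fn value

def quantize_delayed(x, amax_history):
    """Quantize x to FP8 using the history; return (x_fp8, scale, new_history)."""
    scale = jnp.maximum(jnp.max(amax_history), 1e-12) / E4M3_MAX
    x_fp8 = jnp.clip(x / scale, -E4M3_MAX, E4M3_MAX).astype(jnp.float8_e4m3fn)
    # Record this step's amax; the log's [16, 1024] buffers keep one such
    # history per scanned layer for each operand of the dot_general.
    new_history = jnp.roll(amax_history, -1).at[-1].set(jnp.max(jnp.abs(x)))
    return x_fp8, scale, new_history

history = jnp.zeros((1024,), jnp.float32).at[-1].set(3.0)  # pretend a past amax of 3.0
x8, scale, history = quantize_delayed(jnp.ones((8, 8)) * 2.0, history)
x_approx = x8.astype(jnp.float32) * scale  # dequantizing recovers x approximately
```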

I0422 12:54:22.823265 137987280103232 train.py:157] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 12:54:22.823368 137987280103232 train.py:157] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0422 12:54:22.838819 137987280103232 train.py:164] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0422 12:54:22.838880 137987280103232 train.py:164] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
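`train/xent` is the per-token cross entropy and `train/z_loss` an auxiliary penalty on the logit normalizer, both shaped `[batch, length]` as logged. A hedged sketch of how such a pair is typically computed (the coefficient and function name are illustrative, not MaxText's exact code):

```python
# Illustrative per-token loss pair: cross entropy plus a z-loss term that
# penalizes the squared log-normalizer of the logits.
import jax
import jax.numpy as jnp

def xent_and_z_loss(logits, targets, z_loss_coeff=1e-4):
    """logits: [batch, length, vocab]; targets: [batch, length] int token ids."""
    log_z = jax.nn.logsumexp(logits, axis=-1)               # [batch, length]
    target_logit = jnp.take_along_axis(
        logits, targets[..., None], axis=-1)[..., 0]        # [batch, length]
    xent = log_z - target_logit                             # -log softmax(target)
    z_loss = z_loss_coeff * jnp.square(log_z)               # [batch, length]
    return xent, z_loss

logits = jax.random.normal(jax.random.PRNGKey(0), (2, 4, 11))
targets = jnp.zeros((2, 4), dtype=jnp.int32)
xent, z_loss = xent_and_z_loss(logits, targets)  # shapes match train/xent, train/z_loss
```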
I0422 12:54:36.652349 137987280103232 max_utils.py:791] Total memory size: 1.5 GB, Output size: 0.4 GB, Temp size: 1.1 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0422 12:54:36.653247 137987280103232 metric_logger.py:301] number parameters: 1.105 billion
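The 1.105 billion figure is consistent with the shapes dumped above. A back-of-the-envelope tally (shapes copied from the log; the small FP8 amax/scale buffers add roughly another 0.3 M):

```python
# Rough parameter count from the kernel/embedding shapes logged above.
embed      = 32_000 * 2_048          # token_embedder/embedding
logits     = 2_048 * 32_000          # decoder/logits_dense/kernel
final_norm = 2_048                   # decoder_norm/scale

per_layer = (
    3 * (2_048 * 7_168)              # mlp wi_0, wi_1, wo
    + 3 * (2_048 * 16 * 128)         # attention query, key, value kernels
    + (16 * 128 * 2_048)             # attention out kernel
    + 2 * 2_048                      # pre/post attention layer norms
)
total = embed + logits + final_norm + 16 * per_layer   # 16 scanned layers
print(total / 1e9)   # ~1.104e9; with the FP8 stats buffers this lines up
                     # with the reported "1.105 billion"
```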
I0422 12:54:51.834130 137987280103232 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0422 12:54:52.003693 137987280103232 checkpointing.py:798] Waited 0.16956210136413574 seconds for step 0 to finish before starting checkpointing.
I0422 12:54:52.006145 137987280103232 checkpoint_manager.py:2009] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0422 12:54:52.008145 137987280103232 checkpoint_manager.py:1512] [process=6] Saving checkpoint at step 0
I0422 12:54:52.009631 137987280103232 event_tracking.py:70] [process=6] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints/0.
I0422 12:54:52.345177 137987280103232 signaling_client.py:364] Using JaxDistributedSignalingClient
I0422 12:54:52.346037 137987280103232 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0422 12:54:52.346198 137987280103232 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 12:54:52.757120 137987280103232 base_pytree_checkpoint_handler.py:154] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.412067s
I0422 12:54:52.757300 137987280103232 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/blocking_gbytes_per_sec: 3.668 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.42061281204223633 s) (per-host)
I0422 12:54:52.757355 137987280103232 base_pytree_checkpoint_handler.py:768] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.420680s (batch_requests_ready=0.003667s, total_serialization_initiated=0.416934s, others=0.000079s)
I0422 12:54:52.757456 137987280103232 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.424710s (all_items=0.000018s, per_item={'items': '0.00001812'}, temp_paths=0.424691)
I0422 12:54:52.758341 137987280103232 event_tracking.py:125] [process=6] [async] Finished blocking save in 0.75 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints/0.
I0422 12:54:52.758659 137857689110272 async_checkpointer.py:76] [process=6][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 13:14:52.758622
I0422 12:54:52.766982 137987280103232 checkpoint_manager.py:1560] [process=6][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 12:54:52.767296 137856629597952 async_checkpointer.py:280] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0422 12:54:52.767452 137987280103232 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776862492.006127, 'wait_for_prev_duration_secs': 5.9604644775390625e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776862492.0081837, 'checkpointer_blocking_duration_secs': 0.7506284713745117, 'get_old_steps_start_time': 1776862492.7588367, 'get_old_steps_duration_secs': 3.170967102050781e-05, 'checkpoint_manager_blocking_start_time': 1776862492.0043292, 'checkpoint_manager_blocking_duration_secs': 0.7630834579467773}
I0422 12:54:52.767562 137987280103232 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
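The step-0 save blocks the training loop for only ~0.75 s (the device-to-host copy) while the GCS write continues on the `async_save` background thread. A minimal, hypothetical Orbax usage sketch of that pattern (the directory and toy state are placeholders, and this is not MaxText's checkpointing wrapper):

```python
# Hedged sketch of async checkpointing with Orbax: save() only blocks for the
# device-to-host copy; the write to storage finishes on a background thread.
import jax.numpy as jnp
import orbax.checkpoint as ocp

directory = "/tmp/ckpt_demo"   # stands in for the gs:// path in the log
options = ocp.CheckpointManagerOptions(save_interval_steps=10, max_to_keep=3)
manager = ocp.CheckpointManager(directory, options=options)

state = {"step": jnp.array(0), "w": jnp.ones((4, 4))}   # toy train state
for step in range(20):
    # ... run a training step, updating `state` ...
    manager.save(step, args=ocp.args.StandardSave(state))  # async by default
manager.wait_until_finished()   # block until the last background write completes
```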
I0422 12:54:52.767613 137987280103232 max_utils.py:750] 
Memstats: After params initialized:
I0422 12:54:52.767664 137987280103232 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_24(process=6,(0,6,0,0))
I0422 12:54:52.767697 137987280103232 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_25(process=6,(1,6,0,0))
I0422 12:54:52.767724 137987280103232 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_28(process=6,(0,7,0,0))
I0422 12:54:52.767748 137987280103232 max_utils.py:756] 	Using (GB) 0.45 / 31.25 (1.440000%) on TPU_29(process=6,(1,7,0,0))
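The ~0.45 GB per chip is in the same ballpark as what fully sharded fp32 weights plus two Adam-style optimizer slots would occupy on a 32-way mesh (an assumption about the optimizer, not stated in the log):

```python
# Rough per-device memory estimate behind the ~0.45 GB / 1.44% readings above,
# assuming fp32 weights and an Adam-like optimizer with two fp32 slots.
params = 1.105e9
bytes_per_device = params * 4 * (1 + 2) / 32   # weights + m + v, sharded over 32 chips
print(bytes_per_device / 2**30)                # ~0.39 GiB, close to the 0.45 GB logged
```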
I0422 12:54:53.083547 137987280103232 metric_logger.py:196] completed step: 0, seconds: 15.181, TFLOP/s/device: 0.895, Tokens/s/device: 134.908, total_weights: 65536, loss: 10.874, lm_loss: 10.874, perplexity: 52776.805
I0422 12:54:53.278694 137987280103232 metric_logger.py:196] completed step: 1, seconds: 1.247, TFLOP/s/device: 10.893, Tokens/s/device: 1641.943, total_weights: 65536, loss: 10.874, lm_loss: 10.874, perplexity: 52761.840
I0422 12:54:53.688228 137987280103232 metric_logger.py:196] completed step: 2, seconds: 0.038, TFLOP/s/device: 359.953, Tokens/s/device: 54255.967, total_weights: 65536, loss: 10.816, lm_loss: 10.816, perplexity: 49832.398
I0422 12:54:53.846538 137987280103232 metric_logger.py:196] completed step: 3, seconds: 0.410, TFLOP/s/device: 33.157, Tokens/s/device: 4997.743, total_weights: 65536, loss: 10.431, lm_loss: 10.431, perplexity: 33901.820
I0422 12:54:54.164757 137987280103232 metric_logger.py:196] completed step: 4, seconds: 0.164, TFLOP/s/device: 82.769, Tokens/s/device: 12475.786, total_weights: 65536, loss: 9.992, lm_loss: 9.992, perplexity: 21847.639
I0422 12:54:54.171684 137987280103232 metric_logger.py:196] completed step: 5, seconds: 0.158, TFLOP/s/device: 85.902, Tokens/s/device: 12948.094, total_weights: 65536, loss: 9.549, lm_loss: 9.549, perplexity: 14035.006
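Two quick sanity checks on the per-step metrics above: perplexity is just exp(loss), and the tokens/s figure follows from the 65 536-token global batch spread over 32 devices.

```python
import math

# Perplexity is exp(loss): step 0 logs loss 10.874 and perplexity 52776.805.
print(math.exp(10.874))            # ~5.28e4 (matches up to rounding of the printed loss)

# Tokens/s/device: 65_536 tokens per global step over 32 chips, ~0.158 s/step.
tokens_per_device = 65_536 / 32    # 2048
print(tokens_per_device / 0.158)   # ~12_962, vs. 12_948 logged at step 5
```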
I0422 12:54:55.672840    2881 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0422 12:54:58.722892 137856637990656 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints/0/items/array_metadatas/process_6
I0422 12:55:12.660223 137857689110272 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/gbytes_per_sec: 77.731 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 20.3234806060791 s) (per-host)
I0422 12:55:12.660352 137857689110272 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 19.901584s.
I0422 12:55:18.980677 137987280103232 metric_logger.py:196] completed step: 6, seconds: 0.319, TFLOP/s/device: 42.581, Tokens/s/device: 6418.252, total_weights: 65536, loss: 9.111, lm_loss: 9.111, perplexity: 9058.688
I0422 12:55:19.139016 137987280103232 metric_logger.py:196] completed step: 7, seconds: 24.652, TFLOP/s/device: 0.551, Tokens/s/device: 83.077, total_weights: 65536, loss: 8.685, lm_loss: 8.685, perplexity: 5914.720
I0422 12:55:19.297393 137987280103232 metric_logger.py:196] completed step: 8, seconds: 0.163, TFLOP/s/device: 83.131, Tokens/s/device: 12530.362, total_weights: 65536, loss: 8.281, lm_loss: 8.281, perplexity: 3950.010
I0422 12:55:19.302531 137987280103232 checkpointing.py:794] Waiting for step 10 to finish before checkpoint...
I0422 12:55:19.614700 137987280103232 checkpointing.py:798] Waited 0.3121376037597656 seconds for step 10 to finish before starting checkpointing.
I0422 12:55:19.617340 137987280103232 checkpoint_manager.py:2020] [process=6][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 12:55:20.639999 137857689110272 async_checkpointer.py:160] [process=6][thread=async_save] Background save thread done. Time taken: 27.881213s.
I0422 12:55:20.640340 137856629597952 async_checkpointer.py:288] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 12:55:20.640469 137856629597952 async_checkpointer.py:298] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0422 12:55:20.640521 137856629597952 checkpoint_manager.py:2137] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0422 12:55:20.643404 137856629597952 checkpoint_manager.py:2146] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0422 12:55:20.643570 137987280103232 checkpoint_manager.py:2032] [process=6][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0422 12:55:20.643708 137987280103232 checkpoint_manager.py:1452] Waiting for previous save to complete took 1.026372 seconds. If this number is high, consider checkpointing less frequently.
I0422 12:55:20.645776 137987280103232 checkpoint_manager.py:1512] [process=6] Saving checkpoint at step 10
I0422 12:55:20.647836 137987280103232 event_tracking.py:70] [process=6] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints/10.
I0422 12:55:21.363008 137987280103232 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0422 12:55:21.363121 137987280103232 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 12:55:21.415584 137987280103232 base_pytree_checkpoint_handler.py:154] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.053511s
I0422 12:55:21.415752 137987280103232 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/blocking_gbytes_per_sec: 25.744 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.05992531776428223 s) (per-host)
I0422 12:55:21.415803 137987280103232 base_pytree_checkpoint_handler.py:768] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.059988s (batch_requests_ready=0.003315s, total_serialization_initiated=0.056603s, others=0.000070s)
I0422 12:55:21.415906 137987280103232 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.064126s (all_items=0.000014s, per_item={'items': '0.00001431'}, temp_paths=0.064111)
I0422 12:55:21.416597 137987280103232 event_tracking.py:125] [process=6] [async] Finished blocking save in 0.77 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints/10.
I0422 12:55:21.416879 137856629597952 async_checkpointer.py:76] [process=6][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 13:15:21.416855
I0422 12:55:21.418887 137987280103232 checkpoint_manager.py:1560] [process=6][thread=MainThread][step=10] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 12:55:21.419123 137855020033792 async_checkpointer.py:280] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0422 12:55:21.419244 137987280103232 standard_logger.py:34] {'step': 10, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776862519.6173036, 'wait_for_prev_duration_secs': 1.0263721942901611, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776862520.6458168, 'checkpointer_blocking_duration_secs': 0.7711696624755859, 'get_old_steps_start_time': 1776862521.4170094, 'get_old_steps_duration_secs': 3.0994415283203125e-05, 'checkpoint_manager_blocking_start_time': 1776862519.615547, 'checkpoint_manager_blocking_duration_secs': 1.8036601543426514}
I0422 12:55:21.419355 137987280103232 checkpointing.py:409] Started an asynchronous checkpoint save for step 10
I0422 12:55:21.420008 137987280103232 metric_logger.py:196] completed step: 9, seconds: 0.158, TFLOP/s/device: 85.890, Tokens/s/device: 12946.293, total_weights: 65536, loss: 7.908, lm_loss: 7.908, perplexity: 2720.116
I0422 12:55:21.427485 137987280103232 metric_logger.py:196] completed step: 10, seconds: 0.158, TFLOP/s/device: 85.776, Tokens/s/device: 12929.048, total_weights: 65536, loss: 7.568, lm_loss: 7.568, perplexity: 1936.198
I0422 12:55:21.585313 137987280103232 metric_logger.py:196] completed step: 11, seconds: 2.123, TFLOP/s/device: 6.401, Tokens/s/device: 964.873, total_weights: 65536, loss: 7.288, lm_loss: 7.288, perplexity: 1463.272
I0422 12:55:21.743623 137987280103232 metric_logger.py:196] completed step: 12, seconds: 0.007, TFLOP/s/device: 2029.141, Tokens/s/device: 305854.241, total_weights: 65536, loss: 7.032, lm_loss: 7.032, perplexity: 1132.061
I0422 12:55:22.323735 137987280103232 metric_logger.py:196] completed step: 13, seconds: 0.159, TFLOP/s/device: 85.398, Tokens/s/device: 12872.084, total_weights: 65536, loss: 6.815, lm_loss: 6.815, perplexity: 911.031
I0422 12:55:22.482298 137987280103232 metric_logger.py:196] completed step: 14, seconds: 0.575, TFLOP/s/device: 23.640, Tokens/s/device: 3563.264, total_weights: 65536, loss: 6.635, lm_loss: 6.635, perplexity: 761.080
I0422 12:55:22.640824 137987280103232 metric_logger.py:196] completed step: 15, seconds: 0.164, TFLOP/s/device: 82.996, Tokens/s/device: 12510.079, total_weights: 65536, loss: 6.492, lm_loss: 6.492, perplexity: 659.915
I0422 12:55:22.799763 137987280103232 metric_logger.py:196] completed step: 16, seconds: 0.158, TFLOP/s/device: 85.952, Tokens/s/device: 12955.629, total_weights: 65536, loss: 6.380, lm_loss: 6.380, perplexity: 589.948
I0422 12:55:22.958034 137987280103232 metric_logger.py:196] completed step: 17, seconds: 0.159, TFLOP/s/device: 85.722, Tokens/s/device: 12920.891, total_weights: 65536, loss: 6.293, lm_loss: 6.293, perplexity: 540.941
I0422 12:55:23.116245 137987280103232 metric_logger.py:196] completed step: 18, seconds: 0.159, TFLOP/s/device: 85.471, Tokens/s/device: 12883.096, total_weights: 65536, loss: 6.223, lm_loss: 6.223, perplexity: 504.108
I0422 12:55:23.273782 137987280103232 checkpointing.py:794] Waiting for step 19 to finish before checkpoint...
I0422 12:55:23.274823 137987280103232 checkpointing.py:798] Waited 0.0010619163513183594 seconds for step 19 to finish before starting checkpointing.
I0422 12:55:23.276814 137987280103232 checkpoint_manager.py:2020] [process=6][thread=MainThread][step=10][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 12:55:27.685343 137853381146368 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints/10/items/array_metadatas/process_6
I0422 12:56:04.082796 137856629597952 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/gbytes_per_sec: 36.973 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.726930379867554 s) (per-host)
I0422 12:56:04.082925 137856629597952 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 42.665962s.
I0422 12:56:13.390200 137856629597952 async_checkpointer.py:160] [process=6][thread=async_save] Background save thread done. Time taken: 51.973221s.
I0422 12:56:13.390448 137855020033792 async_checkpointer.py:288] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 12:56:13.390508 137855020033792 async_checkpointer.py:298] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0422 12:56:13.390553 137855020033792 checkpoint_manager.py:2137] [process=6][thread=save_finalize][step=10] CheckpointManager Save Finalize is syncing with other hosts...
I0422 12:56:13.393018 137855020033792 checkpoint_manager.py:2146] [process=6][thread=save_finalize][step=10] CheckpointManager Save Finalize is done on all hosts.
I0422 12:56:13.393198 137987280103232 checkpoint_manager.py:2032] [process=6][thread=MainThread][step=10][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=10.
W0422 12:56:13.393338 137987280103232 checkpoint_manager.py:1452] Waiting for previous save to complete took 50.116524 seconds. If this number is high, consider checkpointing less frequently.
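The 50-second stall before the step-19 save follows from the background write throughput logged a few lines up: a 1.5 GiB per-host checkpoint at ~37 MiB/s takes about 42 s to commit, while the 9 training steps between saves finish in a couple of seconds, so each new save arrives long before the previous one has drained (the same effect shows up as step 7's 24.65 s). Quick arithmetic:

```python
# Why the step-19 save waited ~50 s for the step-10 save to finish.
ckpt_gib = 1.5
write_mib_per_s = 37.0                      # per-host background GCS throughput logged
write_time_s = ckpt_gib * 1024 / write_mib_per_s
steps_between_saves = 9                     # checkpoints at steps 10 and 19
train_time_s = steps_between_saves * 0.16   # ~0.16 s per step once compiled
print(write_time_s, train_time_s)           # ~41.5 s of writing vs ~1.4 s of training
```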
I0422 12:56:13.394837 137987280103232 checkpoint_manager.py:1512] [process=6] Saving checkpoint at step 19
I0422 12:56:13.396958 137987280103232 event_tracking.py:70] [process=6] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints/19.
I0422 12:56:13.684927 137987280103232 jax_array_handlers.py:360] Scheduling D2H of 81 prioritized jax.Array.
I0422 12:56:13.685021 137987280103232 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0422 12:56:13.737471 137987280103232 base_pytree_checkpoint_handler.py:154] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.053460s
I0422 12:56:13.737627 137987280103232 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/blocking_gbytes_per_sec: 25.744 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.059926509857177734 s) (per-host)
I0422 12:56:13.737676 137987280103232 base_pytree_checkpoint_handler.py:768] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.059987s (batch_requests_ready=0.003289s, total_serialization_initiated=0.056633s, others=0.000065s)
I0422 12:56:13.737772 137987280103232 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.064096s (all_items=0.000010s, per_item={'items': '0.00001049'}, temp_paths=0.064085)
I0422 12:56:13.738508 137987280103232 event_tracking.py:125] [process=6] [async] Finished blocking save in 0.34 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints/19.
I0422 12:56:13.738766 137855020033792 async_checkpointer.py:76] [process=6][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-22 13:16:13.738742
I0422 12:56:13.740797 137987280103232 checkpoint_manager.py:1560] [process=6][thread=MainThread][step=19] Starting CheckpointManager Save Finalize thread=save_finalize
I0422 12:56:13.741002 137854449592064 async_checkpointer.py:280] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0422 12:56:13.741122 137987280103232 standard_logger.py:34] {'step': 19, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776862523.2767854, 'wait_for_prev_duration_secs': 50.1165235042572, 'time_between_consecutive_saves_sec': 2.6333391666412354, 'checkpointer_blocking_start_time': 1776862573.3948753, 'checkpointer_blocking_duration_secs': 0.3439610004425049, 'get_old_steps_start_time': 1776862573.7388558, 'get_old_steps_duration_secs': 2.6226043701171875e-05, 'checkpoint_manager_blocking_start_time': 1776862523.2751145, 'checkpoint_manager_blocking_duration_secs': 50.46596622467041}
I0422 12:56:13.741231 137987280103232 checkpointing.py:409] Started an asynchronous checkpoint save for step 19
I0422 12:56:13.741272 137987280103232 checkpoint_manager.py:2020] [process=6][thread=MainThread][step=19][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0422 12:56:19.048748 137853381146368 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 81 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_post_train_fixes_20260422_123906/linen_xpk_feat_nnx_post_train_fixes_20260422_123906_05_fp8/checkpoints/19/items/array_metadatas/process_6
I0422 12:56:55.556533 137855020033792 base_pytree_checkpoint_handler.py:130] [process=6] /jax/orbax/write/gbytes_per_sec: 37.722 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 41.878793478012085 s) (per-host)
I0422 12:56:55.556659 137855020033792 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 41.817230s.
I0422 12:57:07.094940 137855020033792 async_checkpointer.py:160] [process=6][thread=async_save] Background save thread done. Time taken: 53.355495s.
I0422 12:57:07.095203 137854449592064 async_checkpointer.py:288] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0422 12:57:07.095259 137854449592064 async_checkpointer.py:298] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0422 12:57:07.095304 137854449592064 checkpoint_manager.py:2137] [process=6][thread=save_finalize][step=19] CheckpointManager Save Finalize is syncing with other hosts...
I0422 12:57:07.097311 137854449592064 checkpoint_manager.py:2146] [process=6][thread=save_finalize][step=19] CheckpointManager Save Finalize is done on all hosts.
I0422 12:57:07.097469 137987280103232 checkpoint_manager.py:2032] [process=6][thread=MainThread][step=19][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=19.
I0422 12:57:07.097622 137987280103232 checkpoint_manager.py:2009] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0422 12:57:07.098673 137987280103232 metric_logger.py:196] completed step: 19, seconds: 0.158, TFLOP/s/device: 85.862, Tokens/s/device: 12942.039, total_weights: 65536, loss: 6.167, lm_loss: 6.167, perplexity: 476.947
Per train step:
 Total TFLOPs: 13.59 
 split as 93.93% learnable weight flops and 6.07% attention flops
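The 13.59 TFLOPs is per device per step; dividing by the steady-state step time reproduces the dashboard's TFLOP/s figure, and the learnable-weight share is close to the usual 6·N·T estimate if the embedding-lookup table is excluded from N (that exclusion is an assumption here, not something the log states):

```python
# Reconciling the per-step FLOP summary with the dashboard throughput numbers.
tflops_per_step_per_device = 13.59
step_time_s = 0.158                                # steady-state step time from the log
print(tflops_per_step_per_device / step_time_s)    # ~86.0 TFLOP/s, vs. ~85.9 reported

n_matmul = 1.104e9 - 32_000 * 2_048                # drop the token-embedding table
tokens_per_device = 65_536 / 32
print(6 * n_matmul * tokens_per_device / 1e12)     # ~12.8 TFLOP, ~93.9% of 13.59
```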
XPK End: Wed Apr 22 12:57:15 UTC 2026
EXIT_CODE=0