2026-04-17 13:32:11.538804: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0417 13:32:11.655614 130019267755136 max_utils.py:238] Skipping jax distributed system due to skip_jax_distributed_system=True flag.
I0417 13:32:59.849675 130019267755136 max_utils.py:800] System Information: Jax Version: 0.9.2
I0417 13:32:59.849795 130019267755136 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0417 13:32:59.849830 130019267755136 max_utils.py:802] System Information: Jax Backend: PJRT C API TFRT TPU v6 lite Built on Apr 6 2026 20:48:10 (1775533690) cl/895581894
I0417 13:32:59.849854 130019267755136 train_utils.py:335] WARNING: 'dataset_path' might be pointing your local file system
I0417 13:32:59.849876 130019267755136 train_utils.py:348] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0417 13:32:59.849955 130019267755136 train.py:671] [DECOUPLED NO-OP] skipping cloud diagnostics wrapper.
W0417 13:32:59.941447 472559 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0417 13:33:00.343456 130019267755136 maxtext_utils.py:1548] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0417 13:33:00.343751 130019267755136 checkpointing.py:677] Setting up checkpoint logger...
I0417 13:33:00.343798 130019267755136 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0417 13:33:00.343845 130019267755136 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0417 13:33:00.344440 130019267755136 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x763fe15f8bc0>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0417 13:33:02.732524 130019267755136 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0417 13:33:02.732713 130019267755136 checkpoint_manager.py:702] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x763adc89c0e0>}, handler_registry=None
I0417 13:33:02.732986 130019267755136 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x763adc89c0e0>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0417 13:33:02.733026 130019267755136 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7639653326f0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
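The checkpoint manager being created here (ocdbt=True, zarr3=True, async writes, and the FixedIntervalPolicy(interval=10) visible in the options dump below) is a plain Orbax setup. A minimal sketch of the same pieces outside MaxText, assuming a local directory and a toy pytree in place of the real train state:

```python
# Minimal Orbax CheckpointManager sketch mirroring the logged options.
# The directory and `state` are illustrative placeholders, not MaxText code.
import numpy as np
import orbax.checkpoint as ocp

options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,           # cf. FixedIntervalPolicy(interval=10) in the log
    enable_async_checkpointing=True,  # blocking D2H copy, background commit
)
mngr = ocp.CheckpointManager("/tmp/ckpts", options=options)

state = {"step": np.int32(0), "w": np.zeros((4, 4), np.float32)}
mngr.save(0, args=ocp.args.StandardSave(state))
mngr.wait_until_finished()  # join the background save thread
```

With async checkpointing enabled, save() blocks only for the device-to-host transfer; the commit continues on a background thread, which is exactly the blocking/background split the async_checkpointer lines report later in this log.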
I0417 13:33:02.733068 130019267755136 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x763adc89c0e0>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x763adc89c0e0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7639653326f0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7639653326f0>}).
I0417 13:33:02.733363 130019267755136 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28
I0417 13:33:02.733414 130019267755136 async_checkpointer.py:177] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>.<lambda> at 0x76396532a160> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0417 13:33:02.863822 130019267755136 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints
I0417 13:33:02.864067 130019267755136 checkpoint_manager.py:921] [process=0][thread=MainThread] CheckpointManager created, primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7639653304d0>
I0417 13:33:02.864150 130019267755136 checkpointing.py:301] Checkpoint manager created!
W0417 13:33:02.880832 472559 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0417 13:33:03.416899 130019267755136 nnx_wrappers.py:437] Unknown Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0417 13:33:03.416989 130019267755136 nnx_wrappers.py:437] Unknown Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
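The paired "Logical"/"Physical" lines print the same sharding twice: once with logical axis names and once resolved onto the device mesh, where only the batch axis maps to the 8-way 'fsdp' mesh axis. A small sketch of that resolution in plain JAX; the rule table is illustrative, and it assumes a runtime with 8 devices as in this log:

```python
# Sketch: resolve logical axis names to a physical PartitionSpec, as in
# the "Unknown Logical/Physical" lines above. The `rules` dict is an
# illustrative stand-in for MaxText's logical-axis rules.
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

mesh = Mesh(jax.devices()[:8], axis_names=("fsdp",))  # "fsdp: 8" in the log

rules = {"activation_batch": "fsdp",
         "activation_norm_length": None,
         "activation_embed": None}
logical = ("activation_batch", "activation_norm_length", "activation_embed")
physical = P(*(rules[name] for name in logical))       # -> P('fsdp', None, None)

x = jnp.zeros((8, 2048, 2048), jnp.bfloat16)
x = jax.device_put(x, NamedSharding(mesh, physical))   # batch split across devices
print(x.sharding.spec)
```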
I0417 13:33:03.501309 130019267755136 attentions.py:1088] attentions/inputs_q Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0417 13:33:03.501386 130019267755136 attentions.py:1088] attentions/inputs_q Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0417 13:33:03.515386 130019267755136 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0417 13:33:03.515435 130019267755136 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0417 13:33:03.542500 130019267755136 attentions.py:1154] attentions/query Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0417 13:33:03.542555 130019267755136 attentions.py:1154] attentions/query Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0417 13:33:03.556711 130019267755136 attentions.py:1155] attentions/key Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0417 13:33:03.556758 130019267755136 attentions.py:1155] attentions/key Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0417 13:33:03.570759 130019267755136 attentions.py:1156] attentions/value Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0417 13:33:03.570809 130019267755136 attentions.py:1156] attentions/value Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0417 13:33:03.597530 130019267755136 attentions.py:1197] attentions/out Logical: bfloat16[8,2048,16,128]..................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0417 13:33:03.597592 130019267755136 attentions.py:1197] attentions/out Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0417 13:33:03.616583 130019267755136 linears.py:525] linears/x Logical: bfloat16[8,2048,7168]....................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0417 13:33:03.616633 130019267755136 linears.py:525] linears/x Physical: bfloat16[8,2048,7168]....................................... ('fsdp', None, None).
I0417 13:33:06.146768 130019267755136 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0417 13:33:06.146870 130019267755136 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
W0417 13:33:06.300153 472559 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
[DECOUPLED NO-OP] gcs_storage: using stubs.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] workload_monitor: using stub.
[DECOUPLED NO-OP] vertex_tensorboard: using stub.
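The two checkpointing.py lines are the usual restore-or-initialize branch: the manager looks for an existing step in the run directory (earlier, "Found 0 checkpoint steps") and falls back to fresh initialization when none is found. A self-contained sketch of that pattern with Orbax; `init_state` is a placeholder for freshly initialized train state:

```python
# Sketch of the restore-or-initialize branch logged by checkpointing.py.
import numpy as np
import orbax.checkpoint as ocp

mngr = ocp.CheckpointManager("/tmp/ckpts")
init_state = {"step": np.int32(0), "w": np.zeros((4, 4), np.float32)}

latest = mngr.latest_step()  # None when "Found 0 checkpoint steps"
if latest is not None:
    state = mngr.restore(latest, args=ocp.args.StandardRestore(init_state))
else:
    state = init_state  # "No existing checkpoints found, not restoring checkpoint."
```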
fsdp: 8
I0417 13:33:09.123149 130019267755136 maxtext_utils.py:1651] params/params/decoder/decoder_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.123271 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_0/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.123313 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_0/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.123356 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_0/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.123385 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_0/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.123410 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_0/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.123448 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_0/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.123486 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_0/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.123517 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_0/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.123543 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_0/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.123568 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_1/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.123591 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_1/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.123615 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_1/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.123637 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_1/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.123666 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_1/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.123690 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_1/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.123713 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_1/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.123737 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_1/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.123761 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_1/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.123784 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_10/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.123807 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_10/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.123829 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_10/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.123852 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_10/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.123873 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_10/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.123897 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_10/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.123921 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_10/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.123943 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_10/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.123965 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_10/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.123989 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_11/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.124011 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_11/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.124033 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_11/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.124071 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_11/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.124094 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_11/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.124115 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_11/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.124137 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_11/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.124158 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_11/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.124180 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_11/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.124206 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_12/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.124229 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_12/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.124251 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_12/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.124272 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_12/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.124292 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_12/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.124313 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_12/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.124335 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_12/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.124357 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_12/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.124378 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_12/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.124400 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_13/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.124421 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_13/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.124443 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_13/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.124463 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_13/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.124484 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_13/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.124506 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_13/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.124527 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_13/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.124548 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_13/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.124570 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_13/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.124591 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_14/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.124613 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_14/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.124634 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_14/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.124658 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_14/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.124681 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_14/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.124703 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_14/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.124726 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_14/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.124749 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_14/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.124771 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_14/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.124793 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_15/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.124814 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_15/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.124836 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_15/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.124857 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_15/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.124877 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_15/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.124899 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_15/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.124920 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_15/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.124942 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_15/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.124964 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_15/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.124986 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_2/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.125008 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_2/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.125031 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_2/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.125063 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_2/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.125086 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_2/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.125109 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_2/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.125132 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_2/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.125154 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_2/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.125177 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_2/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.125198 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_3/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.125220 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_3/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.125241 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_3/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.125262 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_3/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.125282 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_3/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.125304 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_3/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.125326 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_3/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.125347 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_3/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.125369 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_3/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.125391 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_4/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.125413 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_4/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.125434 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_4/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.125454 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_4/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.125474 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_4/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.125495 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_4/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.125516 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_4/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.125538 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_4/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.125559 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_4/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.125580 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_5/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.125601 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_5/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.125622 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_5/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.125642 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_5/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.125665 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_5/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.125686 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_5/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.125708 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_5/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.125729 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_5/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.125751 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_5/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.125772 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_6/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.125792 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_6/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.125813 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_6/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.125833 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_6/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.125853 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_6/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.125874 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_6/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.125896 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_6/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.125917 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_6/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.125938 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_6/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.125959 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_7/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.125981 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_7/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.126003 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_7/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.126023 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_7/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.126055 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_7/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.126079 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_7/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.126101 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_7/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.126124 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_7/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.126146 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_7/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.126167 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_8/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.126189 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_8/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.126210 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_8/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.126231 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_8/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.126251 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_8/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.126273 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_8/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.126294 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_8/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.126315 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_8/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.126336 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_8/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.126357 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_9/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.126378 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_9/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0417 13:33:09.126399 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_9/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0417 13:33:09.126419 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_9/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.126439 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_9/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0417 13:33:09.126460 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_9/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.126481 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_9/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0417 13:33:09.126502 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_9/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0417 13:33:09.126524 130019267755136 maxtext_utils.py:1651] params/params/decoder/layers_9/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0417 13:33:09.126559 130019267755136 maxtext_utils.py:1651] params/params/decoder/logits_dense/kernel Shape: float32[2048,32000] Logical: P('embed_vocab', 'vocab') Physical: ('fsdp', None)
I0417 13:33:09.126591 130019267755136 maxtext_utils.py:1651] params/params/token_embedder/embedding Shape: float32[32000,2048] Logical: P('vocab', 'embed_vocab') Physical: (None, 'fsdp')
I0417 13:33:13.196907 130019267755136 train.py:155] train/xent Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0417 13:33:13.197067 130019267755136 train.py:155] train/xent Physical: float32[8,2048]............................................. ('fsdp', None).
I0417 13:33:13.210634 130019267755136 train.py:162] train/z_loss Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0417 13:33:13.210686 130019267755136 train.py:162] train/z_loss Physical: float32[8,2048]............................................. ('fsdp', None).
I0417 13:34:07.314241 130019267755136 max_utils.py:791] Total memory size: 3.3 GB, Output size: 1.5 GB, Temp size: 1.8 GB, Argument size: 1.5 GB, Host temp size: 0.0 GB.
I0417 13:34:07.315203 130019267755136 max_utils.py:194] tensorboardX not available; using no-op SummaryWriter.
I0417 13:34:07.315794 130019267755136 metric_logger.py:301] number parameters: 1.104 billion
I0417 13:35:06.266313 130019267755136 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0417 13:35:06.370979 130019267755136 checkpointing.py:776] Waited 0.10466432571411133 seconds for step 0 to finish before starting checkpointing.
I0417 13:35:06.372134 130019267755136 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0417 13:35:06.372333 130019267755136 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 0
I0417 13:35:06.372807 130019267755136 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0.
I0417 13:35:06.471399 130019267755136 signaling_client.py:373] Using ThreadSafeKeyValueSignalingClient
I0417 13:35:06.559335 129911660480064 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0
I0417 13:35:06.608743 130019267755136 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
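The "number parameters: 1.104 billion" figure can be recomputed from the shapes dumped above: 16 decoder layers of identical structure (layers_0 through layers_15), plus the final norm, the output projection, and the token embedding. A quick check in Python, with all dimensions taken from the logged shapes:

```python
# Recompute "number parameters: 1.104 billion" from the logged shapes.
embed, mlp, heads, head_dim, vocab, layers = 2048, 7168, 16, 128, 32000, 16

per_layer = (
    3 * embed * mlp                 # mlp/wi_0, mlp/wi_1, mlp/wo
    + 4 * embed * heads * head_dim  # query, key, value, out projections
    + 2 * embed                     # pre/post self-attention layer norms
)
total = (
    layers * per_layer
    + embed                         # decoder_norm/scale
    + embed * vocab                 # logits_dense/kernel
    + vocab * embed                 # token_embedder/embedding
)
print(total, f"{total / 1e9:.3f} billion")  # 1104218112 -> 1.104 billion
```

The untied embedding and output projection contribute about 131M of the total; the remaining ~973M sit in the 16 decoder layers.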
I0417 13:35:06.608842 130019267755136 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0417 13:35:07.281185 130019267755136 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.673216s
I0417 13:35:07.281673 130019267755136 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 15.259 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 808 milliseconds) (per-host)
I0417 13:35:07.281742 130019267755136 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.808900s (batch_requests_ready=0.106266s, total_serialization_initiated=0.702302s, others=0.000332s)
I0417 13:35:07.281881 130019267755136 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.810002s (all_items=0.000029s, per_item={'items': '0.00002909'}, temp_paths=0.809973)
I0417 13:35:07.282968 129911593371200 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0417 13:35:07.283130 130019267755136 async_checkpointer.py:561] Finished blocking save. Time taken: 0.910748s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0.
I0417 13:35:07.283417 130019267755136 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0417 13:35:07.283671 130019267755136 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776432906.372118, 'wait_for_prev_duration_secs': 4.7206878662109375e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776432906.3723552, 'checkpointer_blocking_duration_secs': 0.9109361171722412, 'get_old_steps_start_time': 1776432907.2833157, 'get_old_steps_duration_secs': 5.054473876953125e-05, 'checkpoint_manager_blocking_start_time': 1776432906.3719978, 'checkpoint_manager_blocking_duration_secs': 0.9116430282592773}
I0417 13:35:07.284003 130019267755136 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0417 13:35:07.284102 130019267755136 max_utils.py:750] Memstats: After params initialized:
I0417 13:35:07.284515 130019267755136 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_0(process=0,(0,0,0,0))
I0417 13:35:07.284551 130019267755136 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_1(process=0,(1,0,0,0))
I0417 13:35:07.284572 130019267755136 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_2(process=0,(0,1,0,0))
I0417 13:35:07.284591 130019267755136 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_3(process=0,(1,1,0,0))
I0417 13:35:07.284608 130019267755136 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_4(process=0,(0,2,0,0))
I0417 13:35:07.284624 130019267755136 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_5(process=0,(1,2,0,0))
I0417 13:35:07.284641 130019267755136 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_6(process=0,(0,3,0,0))
I0417 13:35:07.284658 130019267755136 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_7(process=0,(1,3,0,0))
W0417 13:35:07.293218 472559 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0417 13:35:07.301523 129911649994304 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0/items
W0417 13:35:07.301920 472559 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0417 13:35:07.309701 472559 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0417 13:35:07.457996 129911614342720 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776432907203448100, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776432907203448100, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0417 13:35:07.458339 129911670965824 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0417 13:35:07.797869 130019267755136 metric_logger.py:196] completed step: 0, seconds: 58.950, TFLOP/s/device: 0.230, Tokens/s/device: 34.741, total_weights: 16384, loss: 10.887, lm_loss: 10.887, perplexity: 53501.250
I0417 13:35:07.799017 130019267755136 metric_logger.py:281] To see full metrics 'tensorboard --logdir=gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/tensorboard/'
I0417 13:35:07.876822 130019267755136 metric_logger.py:196] completed step: 1, seconds: 1.506, TFLOP/s/device: 9.022, Tokens/s/device: 1359.954, total_weights: 16384, loss: 10.887, lm_loss: 10.887, perplexity: 53501.250
I0417 13:35:07.987245 130019267755136 metric_logger.py:196] completed step: 2, seconds: 0.037, TFLOP/s/device: 370.312, Tokens/s/device: 55817.503, total_weights: 16384, loss: 9.787, lm_loss: 9.787, perplexity: 17806.746
I0417 13:35:08.036451 476504 google_auth_provider.cc:149] Using credentials at ~/.config/gcloud/application_default_credentials.json
I0417 13:35:08.036508 476504 google_auth_provider.cc:156] Using OAuth2 AuthProvider
I0417 13:35:08.134344 130019267755136 metric_logger.py:196] completed step: 3, seconds: 0.081, TFLOP/s/device: 167.941, Tokens/s/device: 25313.952, total_weights: 16384, loss: 8.853, lm_loss: 8.853, perplexity: 6997.778
I0417 13:35:08.867841 129911639508544 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_0
I0417 13:35:27.982933 129911603856960 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 19.974661s (commit=19.512836s, array_metadata_write=0.461825s)
I0417 13:35:27.985809 129911593371200 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 587.407 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 21 seconds) (per-host)
I0417 13:35:27.986060 129911593371200 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 20.702846s.
I0417 13:35:28.222080 129911593371200 checkpoint.py:228] Read Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776432907203448100, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} from gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0417 13:35:28.423246 129911593371200 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0417 13:35:28.618012 129911614342720 checkpoint.py:247] Updated Metadata={'item_handlers': {'items': 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler'}, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776432907203448100, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} to gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0417 13:35:28.865330 129911593371200 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0417 13:35:28.867560 129911593371200 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.603577s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0/items
I0417 13:35:28.868577 129911593371200 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0/items
I0417 13:35:29.106468 129911593371200 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0
I0417 13:35:29.787010 129911593371200 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0`.
I0417 13:35:29.787897 129911593371200 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 23.415525s. directory=gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/0
I0417 13:35:29.787989 129911593371200 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 22.504800s.
I0417 13:35:29.788274 129911670965824 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0417 13:35:29.788434 129911670965824 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0417 13:35:29.788527 129911670965824 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
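One sanity check on the metric lines above: the perplexity column is exp(loss), up to the loss being rounded to three decimals before printing. For example:

```python
# The perplexity column tracks exp(loss); the small residual comes from
# the loss being rounded for display (e.g. step 0: loss 10.887).
import math

for loss, ppl in [(10.887, 53501.250), (9.787, 17806.746), (8.853, 6997.778)]:
    print(loss, math.exp(loss), ppl)  # exp(10.887) ~= 53474 vs 53501.250
```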
I0417 13:35:29.788591 129911670965824 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0417 13:35:30.864410 130019267755136 metric_logger.py:196] completed step: 4, seconds: 0.111, TFLOP/s/device: 122.063, Tokens/s/device: 18398.735, total_weights: 16384, loss: 7.982, lm_loss: 7.982, perplexity: 2927.066
I0417 13:35:30.881348 130019267755136 metric_logger.py:196] completed step: 5, seconds: 0.163, TFLOP/s/device: 83.229, Tokens/s/device: 12545.253, total_weights: 16384, loss: 7.248, lm_loss: 7.248, perplexity: 1405.074
I0417 13:35:30.981353 130019267755136 metric_logger.py:196] completed step: 6, seconds: 22.711, TFLOP/s/device: 0.598, Tokens/s/device: 90.176, total_weights: 16384, loss: 6.690, lm_loss: 6.690, perplexity: 804.397
I0417 13:35:31.091445 130019267755136 metric_logger.py:196] completed step: 7, seconds: 0.014, TFLOP/s/device: 992.486, Tokens/s/device: 149598.247, total_weights: 16384, loss: 6.307, lm_loss: 6.307, perplexity: 548.399
I0417 13:35:31.201163 130019267755136 metric_logger.py:196] completed step: 8, seconds: 0.102, TFLOP/s/device: 133.615, Tokens/s/device: 20139.838, total_weights: 16384, loss: 6.070, lm_loss: 6.070, perplexity: 432.506
I0417 13:35:31.310213 130019267755136 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0417 13:35:31.316915 130019267755136 checkpointing.py:776] Waited 0.006720066070556641 seconds for step 9 to finish before starting checkpointing.
I0417 13:35:31.317690 130019267755136 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0417 13:35:31.317859 130019267755136 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 9
I0417 13:35:31.318208 130019267755136 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/9.
I0417 13:35:31.484149 129910213445184 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/9
I0417 13:35:31.539298 130019267755136 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
I0417 13:35:31.539824 130019267755136 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0417 13:35:31.773780 130019267755136 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.235264s
I0417 13:35:31.774213 130019267755136 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 33.907 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 363 milliseconds) (per-host)
I0417 13:35:31.774273 130019267755136 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.364049s (batch_requests_ready=0.100018s, total_serialization_initiated=0.263689s, others=0.000342s)
I0417 13:35:31.774383 130019267755136 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.364999s (all_items=0.000022s, per_item={'items': '0.00002241'}, temp_paths=0.364976)
I0417 13:35:31.775163 129911603856960 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0417 13:35:31.775264 130019267755136 async_checkpointer.py:561] Finished blocking save. Time taken: 0.457364s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/9.
I0417 13:35:31.775434 130019267755136 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0417 13:35:31.775616 129911593371200 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0417 13:35:31.775742 130019267755136 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776432931.3176596, 'wait_for_prev_duration_secs': 5.8650970458984375e-05, 'time_between_consecutive_saves_sec': 1.529052495956421, 'checkpointer_blocking_start_time': 1776432931.3178806, 'checkpointer_blocking_duration_secs': 0.45748281478881836, 'get_old_steps_start_time': 1776432931.7753775, 'get_old_steps_duration_secs': 2.1219253540039062e-05, 'checkpoint_manager_blocking_start_time': 1776432931.3176203, 'checkpoint_manager_blocking_duration_secs': 0.45809149742126465}
I0417 13:35:31.775950 130019267755136 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0417 13:35:31.775991 130019267755136 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0417 13:35:32.178526 129911262021184 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/9/items
I0417 13:35:33.440085 129911639508544 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_0
I0417 13:36:01.956524 129911629022784 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 29.292188s (commit=28.823629s, array_metadata_write=0.468559s)
I0417 13:36:01.958385 129911603856960 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 413.668 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 30 seconds) (per-host)
I0417 13:36:01.958459 129911603856960 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 30.183144s.
I0417 13:36:02.408735 129911603856960 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0417 13:36:02.897407 129911603856960 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0417 13:36:02.898867 129911603856960 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.658063s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/9/items
I0417 13:36:02.899514 129911603856960 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/9/items
I0417 13:36:03.152523 129911603856960 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/9
I0417 13:36:03.871198 129911603856960 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/9`.
I0417 13:36:03.871850 129911603856960 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 32.553956s. directory=gs://wanglance-maxtext/linen_ckpt_main_20260417_125240/linen_main_20260417_125240_13_scan_layers_false/checkpoints/9
I0417 13:36:03.871917 129911603856960 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 32.096607s.
I0417 13:36:03.872035 129911593371200 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0417 13:36:03.872096 129911593371200 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0417 13:36:03.872159 129911593371200 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0417 13:36:03.872201 129911593371200 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0417 13:36:03.872332 130019267755136 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0417 13:36:03.872462 130019267755136 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0417 13:36:03.873312 130019267755136 metric_logger.py:196] completed step: 9, seconds: 0.110, TFLOP/s/device: 123.526, Tokens/s/device: 18619.197, total_weights: 16384, loss: 5.930, lm_loss: 5.930, perplexity: 376.179
Per train step: Total TFLOPs: 13.59 split as 93.93% learnable weight flops and 6.07% attention flops
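The footer ties back to the per-step metrics: dividing the 13.59 TFLOPs per step by the step time reproduces the logged TFLOP/s/device (which suggests the 13.59 figure is per device; that interpretation is inferred from the numbers, not stated in the log), and the 16384 tokens per step spread over 8 devices reproduce Tokens/s/device. A quick check, with step times taken from the log:

```python
# Check the step metrics against "Total TFLOPs: 13.59" per step and
# total_weights=16384 tokens per step on 8 devices.
tflops_per_step, tokens_per_step, devices = 13.59, 16384, 8

for step, secs in [(4, 0.111), (8, 0.102), (9, 0.110)]:
    print(step,
          tflops_per_step / secs,            # ~ logged TFLOP/s/device
          tokens_per_step / devices / secs)  # ~ logged Tokens/s/device
```

The small discrepancies (e.g. 122.4 computed vs 122.063 logged at step 4) are consistent with the step times being rounded to three decimals before printing.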