2026-04-15 23:49:46.564170: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0415 23:49:46.678832 124871028235392 max_utils.py:238] Skipping jax distributed system due to skip_jax_distributed_system=True flag.
I0415 23:50:41.968300 124871028235392 max_utils.py:800] System Information: Jax Version: 0.9.2
I0415 23:50:41.968406 124871028235392 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0415 23:50:41.968440 124871028235392 max_utils.py:802] System Information: Jax Backend: PJRT C API TFRT TPU v6 lite Built on Apr 6 2026 20:48:10 (1775533690) cl/895581894
I0415 23:50:41.968464 124871028235392 train_utils.py:334] WARNING: 'dataset_path' might be pointing your local file system
I0415 23:50:41.968486 124871028235392 train_utils.py:347] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0415 23:50:41.968566 124871028235392 train.py:715] [DECOUPLED NO-OP] skipping cloud diagnostics wrapper.
W0415 23:50:42.059897 3395624 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0415 23:50:42.466391 124871028235392 maxtext_utils.py:1520] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0415 23:50:42.466717 124871028235392 checkpointing.py:677] Setting up checkpoint logger...
I0415 23:50:42.466761 124871028235392 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0415 23:50:42.466802 124871028235392 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0415 23:50:42.467357 124871028235392 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7191379fc710>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0415 23:50:44.912661 124871028235392 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0415 23:50:44.912832 124871028235392 checkpoint_manager.py:702] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x718b8a7c43b0>}, handler_registry=None
I0415 23:50:44.913116 124871028235392 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x718b8a7c43b0>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0415 23:50:44.913158 124871028235392 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x718aba3325a0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
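The `Num_devices: 8, shape (1, 1, 1, 8, 1, ...)` line above describes a device mesh whose only non-trivial axis has size 8; the `Physical:` annotations later in the log name that axis `fsdp`. A minimal sketch, using generic axis names rather than MaxText's full 13-axis mesh spec, of how such a mesh and a `('fsdp', None)`-sharded weight are built in JAX (assumes a host with 8 JAX devices):

```python
import jax
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# 8 devices, all placed on the 'fsdp' axis; every other mesh axis has size 1.
devices = np.array(jax.devices()).reshape(1, 8)
mesh = Mesh(devices, axis_names=("data", "fsdp"))

# A weight with logical spec P('embed', 'mlp') resolving to ('fsdp', None),
# matching the wi_0/kernel "Physical:" annotation further down in the log.
sharding = NamedSharding(mesh, P("fsdp", None))
w = jax.device_put(np.zeros((2048, 7168), dtype=np.float32), sharding)
print(w.sharding)  # NamedSharding(..., spec=PartitionSpec('fsdp', None))
```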
I0415 23:50:44.913189 124871028235392 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x718b8a7c43b0>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x718b8a7c43b0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x718aba3325a0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x718aba3325a0>}).
I0415 23:50:44.913473 124871028235392 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28
I0415 23:50:44.913522 124871028235392 async_checkpointer.py:177] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>.<lambda> at 0x718aba3354e0> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0415 23:50:45.049666 124871028235392 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints
I0415 23:50:45.049882 124871028235392 checkpoint_manager.py:921] [process=0][thread=MainThread] CheckpointManager created, primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x718aba332450>
I0415 23:50:45.049968 124871028235392 checkpointing.py:301] Checkpoint manager created!
W0415 23:50:45.066025 3395624 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0415 23:50:45.265405 124871028235392 nnx_wrappers.py:437] Unknown Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0415 23:50:45.265585 124871028235392 nnx_wrappers.py:437] Unknown Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
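The options dump above shows what MaxText asked Orbax for: async checkpointing, OCDBT + Zarr3 storage, and a `FixedIntervalPolicy(interval=10)` save policy. A minimal sketch, assuming the orbax-checkpoint 0.11.x API reported above (version 0.11.28) and a hypothetical bucket path; plain `save_interval_steps=10` stands in for the fixed-interval policy:

```python
import orbax.checkpoint as ocp

options = ocp.CheckpointManagerOptions(
    save_interval_steps=10,           # stand-in for FixedIntervalPolicy(interval=10)
    max_to_keep=None,                 # LatestN(n=None): keep every checkpoint
    enable_async_checkpointing=True,  # blocking D2H copy, background commit
)
mngr = ocp.CheckpointManager("gs://my-bucket/run/checkpoints", options=options)

# Per step: save() returns once arrays are copied to host; the storage write
# continues on a background thread, visible below as [thread=async_save] lines.
# mngr.save(step, args=ocp.args.StandardSave(train_state))
# mngr.wait_until_finished()  # join the background save before exiting
```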
I0415 23:50:45.362487 124871028235392 attentions.py:1088] attentions/inputs_q Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0415 23:50:45.362563 124871028235392 attentions.py:1088] attentions/inputs_q Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0415 23:50:45.376421 124871028235392 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0415 23:50:45.376470 124871028235392 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0415 23:50:45.403290 124871028235392 attentions.py:1154] attentions/query Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0415 23:50:45.403345 124871028235392 attentions.py:1154] attentions/query Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0415 23:50:45.417315 124871028235392 attentions.py:1155] attentions/key Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0415 23:50:45.417365 124871028235392 attentions.py:1155] attentions/key Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0415 23:50:45.431300 124871028235392 attentions.py:1156] attentions/value Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0415 23:50:45.431350 124871028235392 attentions.py:1156] attentions/value Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0415 23:50:45.457779 124871028235392 attentions.py:1197] attentions/out Logical: bfloat16[8,2048,16,128]..................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0415 23:50:45.457834 124871028235392 attentions.py:1197] attentions/out Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0415 23:50:45.476758 124871028235392 linears.py:525] linears/x Logical: bfloat16[8,2048,7168]....................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0415 23:50:45.476811 124871028235392 linears.py:525] linears/x Physical: bfloat16[8,2048,7168]....................................... ('fsdp', None, None).
I0415 23:50:47.983772 124871028235392 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0415 23:50:47.983903 124871028235392 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
W0415 23:50:48.522696 3395624 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
[DECOUPLED NO-OP] gcs_storage: using stubs.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] mldiagnostics: using stub.
[DECOUPLED NO-OP] workload_monitor: using stub.
[DECOUPLED NO-OP] vertex_tensorboard: using stub.
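The `Logical:`/`Physical:` pairs above come from annotating activations with logical axis names and resolving them through a rules table onto mesh axes. A minimal sketch of that mechanism using Flax's linen SPMD helpers, with an illustrative rules table rather than MaxText's actual one:

```python
import flax.linen as nn

# Map logical activation axes onto mesh axes: batch is sharded over 'fsdp',
# the remaining axes are replicated, yielding ('fsdp', None, None) as logged.
rules = (
    ("activation_batch", "fsdp"),
    ("activation_attn_length", None),
    ("activation_attn_embed", None),
)
with nn.logical_axis_rules(rules):
    # Inside a module, an [8, 2048, 2048] activation would be constrained as:
    # x = nn.with_logical_constraint(
    #     x, ("activation_batch", "activation_attn_length", "activation_attn_embed"))
    pass
```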
fsdp: 8
I0415 23:50:51.306123 124871028235392 maxtext_utils.py:1623] params/params/decoder/decoder_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.306242 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_0/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.306283 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_0/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.306326 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_0/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.306354 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_0/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.306380 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_0/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.306417 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_0/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.306455 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_0/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.306483 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_0/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.306509 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_0/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.306533 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_1/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.306557 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_1/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.306580 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_1/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.306603 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_1/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.306625 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_1/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.306648 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_1/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.306677 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_1/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.306700 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_1/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.306726 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_1/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.306750 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_10/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.306774 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_10/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.306797 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_10/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.306820 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_10/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.306841 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_10/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.306864 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_10/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.306886 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_10/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.306909 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_10/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.306931 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_10/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.306952 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_11/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.306973 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_11/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.306995 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_11/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.307016 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_11/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.307036 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_11/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.307072 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_11/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.307095 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_11/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.307118 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_11/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.307139 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_11/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.307161 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_12/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.307182 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_12/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.307204 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_12/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.307224 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_12/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.307245 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_12/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.307267 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_12/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.307289 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_12/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.307311 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_12/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.307332 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_12/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.307354 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_13/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.307375 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_13/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.307396 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_13/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.307415 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_13/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.307435 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_13/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.307457 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_13/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.307478 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_13/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.307500 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_13/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.307524 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_13/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.307546 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_14/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.307567 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_14/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.307588 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_14/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.307608 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_14/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.307627 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_14/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.307648 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_14/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.307670 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_14/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.307695 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_14/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.307717 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_14/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.307738 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_15/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.307759 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_15/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.307780 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_15/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.307800 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_15/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.307820 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_15/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.307842 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_15/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.307863 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_15/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.307885 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_15/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.307906 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_15/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.307927 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_2/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.307949 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_2/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.307971 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_2/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.307991 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_2/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.308011 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_2/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.308033 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_2/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.308068 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_2/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.308091 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_2/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.308114 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_2/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.308135 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_3/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.308156 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_3/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.308176 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_3/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.308195 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_3/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.308215 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_3/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.308235 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_3/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.308257 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_3/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.308278 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_3/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.308299 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_3/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.308322 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_4/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.308343 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_4/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.308366 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_4/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.308386 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_4/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.308406 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_4/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.308429 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_4/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.308452 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_4/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.308474 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_4/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.308495 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_4/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.308517 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_5/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.308538 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_5/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.308559 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_5/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.308578 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_5/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.308598 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_5/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.308621 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_5/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.308642 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_5/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.308664 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_5/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.308688 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_5/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.308709 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_6/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.308730 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_6/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.308751 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_6/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.308771 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_6/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.308790 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_6/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.308812 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_6/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.308833 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_6/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.308854 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_6/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.308875 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_6/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.308896 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_7/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.308917 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_7/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.308938 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_7/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.308959 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_7/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.308980 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_7/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.309002 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_7/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.309023 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_7/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.309055 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_7/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.309077 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_7/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.309099 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_8/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.309120 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_8/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.309141 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_8/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.309160 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_8/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.309180 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_8/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.309201 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_8/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.309223 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_8/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.309244 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_8/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.309264 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_8/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.309285 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_9/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.309306 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_9/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: P('embed', 'mlp') Physical: ('fsdp', None)
I0415 23:50:51.309326 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_9/mlp/wo/kernel Shape: float32[7168,2048] Logical: P('mlp', 'embed') Physical: (None, 'fsdp')
I0415 23:50:51.309345 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_9/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.309365 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_9/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: P('norm',) Physical: (None,)
I0415 23:50:51.309386 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_9/self_attention/key/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.309406 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_9/self_attention/out/kernel Shape: float32[16,128,2048] Logical: P('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0415 23:50:51.309427 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_9/self_attention/query/kernel Shape: float32[2048,16,128] Logical: P('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0415 23:50:51.309448 124871028235392 maxtext_utils.py:1623] params/params/decoder/layers_9/self_attention/value/kernel Shape: float32[2048,16,128] Logical: P('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0415 23:50:51.309485 124871028235392 maxtext_utils.py:1623] params/params/decoder/logits_dense/kernel Shape: float32[2048,32000] Logical: P('embed', 'vocab') Physical: ('fsdp', None)
I0415 23:50:51.309517 124871028235392 maxtext_utils.py:1623] params/params/token_embedder/embedding Shape: float32[32000,2048] Logical: P('vocab', 'embed') Physical: (None, 'fsdp')
I0415 23:50:55.016654 124871028235392 train.py:155] train/xent Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0415 23:50:55.016798 124871028235392 train.py:155] train/xent Physical: float32[8,2048]............................................. ('fsdp', None).
I0415 23:50:55.030321 124871028235392 train.py:162] train/z_loss Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0415 23:50:55.030371 124871028235392 train.py:162] train/z_loss Physical: float32[8,2048]............................................. ('fsdp', None).
W0415 23:50:57.692861 3395624 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0415 23:50:59.966397 124871028235392 max_utils.py:791] Total memory size: 3.3 GB, Output size: 1.5 GB, Temp size: 1.8 GB, Argument size: 1.5 GB, Host temp size: 0.0 GB.
I0415 23:50:59.967069 124871028235392 max_utils.py:194] tensorboardX not available; using no-op SummaryWriter.
I0415 23:50:59.967529 124871028235392 metric_logger.py:289] number parameters: 1.104 billion
W0415 23:51:05.781628 3395624 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0415 23:51:07.598885 124871028235392 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0415 23:51:07.701929 124871028235392 checkpointing.py:776] Waited 0.1030275821685791 seconds for step 0 to finish before starting checkpointing.
I0415 23:51:07.702898 124871028235392 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0415 23:51:07.703089 124871028235392 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 0
I0415 23:51:07.703448 124871028235392 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0.
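The shapes in the listing above account exactly for the reported "number parameters: 1.104 billion"; a quick arithmetic check:

```python
# Parameter count from the logged shapes (16 decoder layers).
embed, mlp, heads, head_dim, vocab, layers = 2048, 7168, 16, 128, 32000, 16

per_layer = (
    3 * embed * mlp                  # mlp wi_0, wi_1, wo kernels
    + 4 * embed * heads * head_dim   # attention query, key, value, out kernels
    + 2 * embed                      # pre/post self-attention layer norm scales
)
total = (
    layers * per_layer
    + embed          # decoder_norm/scale
    + embed * vocab  # logits_dense/kernel
    + vocab * embed  # token_embedder/embedding
)
print(total)  # 1104218112 ~= 1.104 billion, matching metric_logger.py:289
```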
I0415 23:51:07.785254 124871028235392 signaling_client.py:373] Using ThreadSafeKeyValueSignalingClient
I0415 23:51:07.866208 124763624179264 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0
I0415 23:51:07.930417 124871028235392 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
I0415 23:51:07.930516 124871028235392 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0415 23:51:09.067570 124871028235392 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 1.137950s
I0415 23:51:09.067915 124871028235392 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 9.631 GiB/s (total gbytes: 12.3 GiB) (time elapsed: a second) (per-host)
I0415 23:51:09.067972 124871028235392 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 1.281417s (batch_requests_ready=0.117867s, total_serialization_initiated=1.163292s, others=0.000258s)
I0415 23:51:09.068082 124871028235392 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 1.282378s (all_items=0.000020s, per_item={'items': '0.00001979'}, temp_paths=1.282358)
I0415 23:51:09.068811 124763557070400 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0415 23:51:09.068911 124871028235392 async_checkpointer.py:561] Finished blocking save. Time taken: 1.365782s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0.
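The 12.3 GiB payload and transfer timing above are consistent with a float32 train state holding the 1.104e9 parameters plus two Adam moment buffers (an assumption in line with MaxText's default optimizer):

```python
params = 1_104_218_112   # from the parameter-count check above
copies = 3               # params + Adam first and second moments (assumed)
bytes_per_param = 4      # float32
print(params * copies * bytes_per_param / 2**30)  # ~12.34 GiB, logged as 12.3 GiB
print(12.3 / 9.631)      # ~1.28 s blocking D2H, matching "Time taken: 1.281417s"
```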
I0415 23:51:09.069111 124871028235392 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0415 23:51:09.069281 124871028235392 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776297067.7028852, 'wait_for_prev_duration_secs': 4.1961669921875e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776297067.7031124, 'checkpointer_blocking_duration_secs': 1.365915298461914, 'get_old_steps_start_time': 1776297069.069059, 'get_old_steps_duration_secs': 1.9550323486328125e-05, 'checkpoint_manager_blocking_start_time': 1776297067.7027917, 'checkpoint_manager_blocking_duration_secs': 1.3664696216583252}
I0415 23:51:09.069403 124871028235392 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0415 23:51:09.069443 124871028235392 max_utils.py:750] Memstats: After params initialized:
I0415 23:51:09.069694 124871028235392 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_0(process=0,(0,0,0,0))
I0415 23:51:09.069722 124871028235392 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_1(process=0,(1,0,0,0))
I0415 23:51:09.069741 124871028235392 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_2(process=0,(0,1,0,0))
I0415 23:51:09.069758 124871028235392 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_3(process=0,(1,1,0,0))
I0415 23:51:09.069774 124871028235392 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_4(process=0,(0,2,0,0))
I0415 23:51:09.069790 124871028235392 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_5(process=0,(1,2,0,0))
I0415 23:51:09.069806 124871028235392 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_6(process=0,(0,3,0,0))
I0415 23:51:09.069821 124871028235392 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_7(process=0,(1,3,0,0))
W0415 23:51:09.074733 3395624 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0415 23:51:09.079560 3395624 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0415 23:51:09.084091 3395624 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0415 23:51:09.089289 124763613693504 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0/items
I0415 23:51:09.209969 124763578041920 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776297068976873575, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776297068976873575, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0415 23:51:09.210220 124763634665024 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0415 23:51:09.395977 124871028235392 metric_logger.py:185] completed step: 0, seconds: 7.631, TFLOP/s/device: 1.781, Tokens/s/device: 268.385, total_weights: 16384, loss: 10.887
I0415 23:51:09.396876 124871028235392 metric_logger.py:269] To see full metrics 'tensorboard --logdir=gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/tensorboard/'
I0415 23:51:09.487380 124871028235392 metric_logger.py:185] completed step: 1, seconds: 1.782, TFLOP/s/device: 7.624, Tokens/s/device: 1149.118, total_weights: 16384, loss: 10.887
I0415 23:51:09.596806 124871028235392 metric_logger.py:185] completed step: 2, seconds: 0.023, TFLOP/s/device: 600.749, Tokens/s/device: 90551.355, total_weights: 16384, loss: 9.787
I0415 23:51:09.706658 124871028235392 metric_logger.py:185] completed step: 3, seconds: 0.093, TFLOP/s/device: 146.432, Tokens/s/device: 22071.820, total_weights: 16384, loss: 8.853
I0415 23:51:09.803938 3398010 google_auth_provider.cc:149] Using credentials at ~/.config/gcloud/application_default_credentials.json
I0415 23:51:09.803998 3398010 google_auth_provider.cc:156] Using OAuth2 AuthProvider
I0415 23:51:10.814037 124763624179264 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_0
I0415 23:51:29.750922 124763567556160 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 19.977027s (commit=19.531123s, array_metadata_write=0.445904s)
I0415 23:51:29.752693 124763557070400 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 575.287 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 21 seconds) (per-host)
I0415 23:51:29.752808 124763557070400 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 20.683840s.
I0415 23:51:29.979561 124763557070400 checkpoint.py:228] Read Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776297068976873575, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} from gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0415 23:51:30.171394 124763557070400 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0415 23:51:30.374358 124763578041920 checkpoint.py:247] Updated Metadata={'item_handlers': {'items': 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler'}, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776297068976873575, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} to gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0415 23:51:30.606461 124763557070400 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0415 23:51:30.607253 124763557070400 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.583523s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0/items
I0415 23:51:30.607963 124763557070400 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0/items
I0415 23:51:30.836674 124763557070400 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0
I0415 23:51:31.496478 124763557070400 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0`.
I0415 23:51:31.497132 124763557070400 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 23.794004s. directory=gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/0
I0415 23:51:31.497213 124763557070400 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 22.428250s.
I0415 23:51:31.497416 124763634665024 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0415 23:51:31.497527 124763634665024 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0415 23:51:31.497631 124763634665024 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0415 23:51:31.497688 124763634665024 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0415 23:51:31.723001 124871028235392 metric_logger.py:185] completed step: 4, seconds: 0.110, TFLOP/s/device: 123.907, Tokens/s/device: 18676.589, total_weights: 16384, loss: 7.982
I0415 23:51:31.739969 124871028235392 metric_logger.py:185] completed step: 5, seconds: 0.110, TFLOP/s/device: 123.813, Tokens/s/device: 18662.463, total_weights: 16384, loss: 7.248
I0415 23:51:31.840463 124871028235392 metric_logger.py:185] completed step: 6, seconds: 22.019, TFLOP/s/device: 0.617, Tokens/s/device: 93.012, total_weights: 16384, loss: 6.690
I0415 23:51:31.950145 124871028235392 metric_logger.py:185] completed step: 7, seconds: 0.012, TFLOP/s/device: 1091.950, Tokens/s/device: 164590.533, total_weights: 16384, loss: 6.307
I0415 23:51:32.060393 124871028235392 metric_logger.py:185] completed step: 8, seconds: 0.102, TFLOP/s/device: 132.976, Tokens/s/device: 20043.650, total_weights: 16384, loss: 6.070
I0415 23:51:32.168973 124871028235392 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0415 23:51:32.174660 124871028235392 checkpointing.py:776] Waited 0.005708217620849609 seconds for step 9 to finish before starting checkpointing.
I0415 23:51:32.175397 124871028235392 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
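For completeness, a later run would discover and reload the saves finalized above roughly as follows; a minimal sketch under the same assumed orbax-checkpoint 0.11.x API (restoring sharded jax.Arrays in practice usually passes an abstract target, e.g. `args=ocp.args.StandardRestore(abstract_state)`):

```python
import orbax.checkpoint as ocp

mngr = ocp.CheckpointManager("gs://my-bucket/run/checkpoints")  # hypothetical path
latest = mngr.latest_step()  # e.g. 9 after this run's second save
if latest is not None:
    restored = mngr.restore(latest)
```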
I0415 23:51:32.175574 124871028235392 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 9
I0415 23:51:32.175868 124871028235392 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/9.
I0415 23:51:32.350088 124763141834304 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/9
I0415 23:51:32.393666 124871028235392 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
I0415 23:51:32.393824 124871028235392 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0415 23:51:32.735489 124871028235392 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.342554s
I0415 23:51:32.735899 124871028235392 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 26.315 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 468 milliseconds) (per-host)
I0415 23:51:32.735949 124871028235392 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.469037s (batch_requests_ready=0.099506s, total_serialization_initiated=0.369218s, others=0.000312s)
I0415 23:51:32.736060 124871028235392 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.469848s (all_items=0.000022s, per_item={'items': '0.00002241'}, temp_paths=0.469826)
I0415 23:51:32.736654 124763131348544 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0415 23:51:32.736749 124871028235392 async_checkpointer.py:561] Finished blocking save. Time taken: 0.561127s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/9.
I0415 23:51:32.736949 124871028235392 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0415 23:51:32.737111 124763393492544 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0415 23:51:32.737227 124871028235392 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776297092.1753488, 'wait_for_prev_duration_secs': 7.724761962890625e-05, 'time_between_consecutive_saves_sec': 0.6776444911956787, 'checkpointer_blocking_start_time': 1776297092.1756015, 'checkpointer_blocking_duration_secs': 0.5612533092498779, 'get_old_steps_start_time': 1776297092.736873, 'get_old_steps_duration_secs': 3.6716461181640625e-05, 'checkpoint_manager_blocking_start_time': 1776297092.175303, 'checkpoint_manager_blocking_duration_secs': 0.5618913173675537}
I0415 23:51:32.737421 124871028235392 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0415 23:51:32.737460 124871028235392 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0415 23:51:33.057790 124762114229824 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/9/items
I0415 23:51:34.315546 124761631884864 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_0
I0415 23:52:02.032594 124763351549504 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 28.508103s (commit=28.073869s, array_metadata_write=0.434234s)
I0415 23:52:02.034397 124763131348544 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 424.517 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 29 seconds) (per-host)
I0415 23:52:02.034452 124763131348544 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 29.297646s.
I0415 23:52:02.459198 124763131348544 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0415 23:52:02.895912 124763131348544 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0415 23:52:02.896790 124763131348544 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.585893s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/9/items
I0415 23:52:02.897425 124763131348544 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/9/items
I0415 23:52:03.153970 124763131348544 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/9
I0415 23:52:03.811510 124763131348544 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/9`.
I0415 23:52:03.812168 124763131348544 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 31.636552s. directory=gs://wanglance-maxtext/linen_ckpt_feat_migrate_nnx_utils_20260415_230805/linen_feat_migrate_nnx_utils_20260415_230805_13_scan_layers_false/checkpoints/9
I0415 23:52:03.812242 124763131348544 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 31.075437s.
I0415 23:52:03.812444 124763393492544 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0415 23:52:03.812554 124763393492544 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0415 23:52:03.812623 124763393492544 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0415 23:52:03.812664 124763393492544 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0415 23:52:03.813713 124871028235392 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0415 23:52:03.813848 124871028235392 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0415 23:52:03.815371 124871028235392 metric_logger.py:185] completed step: 9, seconds: 0.110, TFLOP/s/device: 123.910, Tokens/s/device: 18677.100, total_weights: 16384, loss: 5.930
Per train step: Total TFLOPs: 13.59 split as 93.93% learnable weight flops and 6.07% attention flops
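The per-step summary ties the metric lines together: with 13.59 TFLOPs per step per device (the ratios only work out if the figure is per device) and 16384 tokens per step across 8 devices, the logged step times reproduce the TFLOP/s and tokens/s columns:

```python
tflops_per_step_per_device = 13.59
tokens_per_step, devices = 16384, 8

for step, seconds in [(2, 0.023), (9, 0.110)]:
    print(step,
          tflops_per_step_per_device / seconds,  # ~590.9 and ~123.5 TFLOP/s/device
          tokens_per_step / devices / seconds)   # ~89043 and ~18618 tokens/s/device
```

Both agree with the logged 600.749/123.910 TFLOP/s/device and 90551/18677 tokens/s/device to within the rounding of the printed step times.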