XPK Start: Tue Apr 21 06:58:15 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
`rope_parameters`'s factor field must be a float >= 1, got 40
`rope_parameters`'s beta_fast field must be a float, got 32
`rope_parameters`'s beta_slow field must be a float, got 1
DeepseekV32Config got `key=rope_scaling` in kwargs but hasn't set it as attribute. For RoPE standardization you need to set `self.rope_parameters` in model's config.
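The three validation messages above complain about integer values where the RoPE standardization expects floats, and the last message says the config passed `rope_scaling` as a kwarg instead of setting `self.rope_parameters`. A minimal sketch of a corrected dict, assuming the rope_parameters schema implied by the messages (the "rope_type" value is an assumption; only factor, beta_fast, and beta_slow appear in the log):

    # Hypothetical fix for the rope_parameters warnings above: pass floats, not ints.
    rope_parameters = {
        "rope_type": "yarn",  # assumption: the scaling variant is not shown in the log
        "factor": 40.0,       # was 40 (int); must be a float >= 1
        "beta_fast": 32.0,    # was 32 (int); must be a float
        "beta_slow": 1.0,     # was 1 (int); must be a float
    }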
2026-04-21 06:58:40.212944: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0421 06:58:40.421265 132263941023552 max_utils.py:273] Attempting to initialize the jax distributed system...
I0421 06:58:49.462120 132263941023552 distributed.py:149] Starting JAX distributed service on [::]:8482
I0421 06:58:49.464508 132263941023552 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-oag78-slice-job-0-0.mt-13-scan-layers-false-oag78:8482
I0421 06:58:50.569955 132263941023552 max_utils.py:284] Jax distributed system initialized!
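The "initialize the jax distributed system" lines correspond to the standard multi-host setup call. A minimal sketch, assuming the stock jax.distributed API (num_processes and process_id below are inferred from the memstats later in the log, which show 4 chips on process=4 of a 32-chip job):

    import jax

    # One call per host, before any other JAX operation touches the devices.
    jax.distributed.initialize(
        coordinator_address=(
            "mt-13-scan-layers-false-oag78-slice-job-0-0"
            ".mt-13-scan-layers-false-oag78:8482"),  # service shown in the log
        num_processes=8,  # assumption: 32 chips / 4 chips per host
        process_id=4,     # this host logs [process=4]
    )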
I0421 06:58:55.781687 132263941023552 max_utils.py:800] System Information: Jax Version: 0.9.2
I0421 06:58:55.781790 132263941023552 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0421 06:58:55.781830 132263941023552 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0421 06:58:55.781865 132263941023552 train_utils.py:361] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0421 06:58:56.476445 132263941023552 maxtext_utils.py:1565] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
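The mesh line above shows MaxText's 13-axis mesh with all 32 devices on a single axis (the later "fsdp: 32" line names it). A minimal single-axis equivalent with plain JAX:

    from jax.sharding import Mesh
    from jax.experimental import mesh_utils

    # Sketch: all 32 devices on one 'fsdp' axis, matching the (1,...,32,...,1)
    # mesh shape in the log. Requires 32 visible devices, as in this run.
    devices = mesh_utils.create_device_mesh((32,))
    mesh = Mesh(devices, axis_names=("fsdp",))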
I0421 06:58:56.476739 132263941023552 checkpointing.py:677] Setting up checkpoint logger...
I0421 06:58:56.476795 132263941023552 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0421 06:58:56.476839 132263941023552 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0421 06:58:56.477180 132263941023552 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x784a8310a6c0>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0421 06:58:59.352684 132263941023552 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0421 06:58:59.352922 132263941023552 checkpoint_manager.py:708] [process=4][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7836d053fcb0>}, handler_registry=None
I0421 06:58:59.353159 132263941023552 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7836d053fcb0>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0421 06:58:59.353209 132263941023552 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7836d0544f20>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0421 06:58:59.353244 132263941023552 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7836d053fcb0>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7836d053fcb0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7836d0544f20>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7836d0544f20>}).
I0421 06:58:59.353630 132263941023552 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0421 06:58:59.353722 132263941023552 async_checkpointer.py:192] [process=4][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7836d02e5800> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0421 06:59:01.046559 132263941023552 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_main_20260421_061359/linen_xpk_main_20260421_061359_13_scan_layers_false/checkpoints
I0421 06:59:01.048771 132263941023552 checkpoint_manager.py:929] [process=4][thread=MainThread] CheckpointManager created, primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260421_061359/linen_xpk_main_20260421_061359_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7836d0541f40>
I0421 06:59:01.048883 132263941023552 checkpointing.py:301] Checkpoint manager created!
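The manager above saves asynchronously on a fixed interval to GCS. A minimal sketch with the orbax.checkpoint API, using option values visible in the log (handler registration and the ocdbt/zarr3 storage settings are left to Orbax defaults here):

    import orbax.checkpoint as ocp

    options = ocp.CheckpointManagerOptions(
        save_interval_steps=1,            # value shown in the log
        enable_async_checkpointing=True,  # writes continue on a background thread
    )
    mngr = ocp.CheckpointManager(
        "gs://lance-maxtext/linen_ckpt_xpk_main_20260421_061359/"
        "linen_xpk_main_20260421_061359_13_scan_layers_false/checkpoints",
        options=options,
    )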
I0421 06:59:02.015985 132263941023552 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0421 06:59:02.016091 132263941023552 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0421 06:59:02.399770 132263941023552 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0421 06:59:02.399860 132263941023552 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0421 06:59:02.416100 132263941023552 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0421 06:59:02.416162 132263941023552 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0421 06:59:02.446075 132263941023552 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0421 06:59:02.446147 132263941023552 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 06:59:02.462528 132263941023552 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0421 06:59:02.462598 132263941023552 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 06:59:02.478831 132263941023552 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0421 06:59:02.478891 132263941023552 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 06:59:02.512967 132263941023552 attentions.py:1198] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0421 06:59:02.513086 132263941023552 attentions.py:1198] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0421 06:59:02.536332 132263941023552 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0421 06:59:02.536408 132263941023552 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
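Each Logical/Physical pair above is a logical sharding annotation resolved against axis rules: only batch-like axes map to 'fsdp', everything else stays replicated (None). A minimal sketch of the same mapping with Flax, assuming rules of the form MaxText uses:

    import jax
    import jax.numpy as jnp
    import flax.linen as nn
    from jax.sharding import Mesh
    from jax.experimental import mesh_utils

    mesh = Mesh(mesh_utils.create_device_mesh((jax.device_count(),)), ("fsdp",))
    rules = (("activation_batch", "fsdp"),      # batch axis sharded over fsdp
             ("activation_norm_length", None),  # sequence axis replicated
             ("activation_embed", None))        # embed axis replicated

    @jax.jit
    def constrain(x):
        # Annotate with logical names; the active rules map them to mesh axes.
        return nn.with_logical_constraint(
            x, ("activation_batch", "activation_norm_length", "activation_embed"))

    with mesh, nn.logical_axis_rules(rules):
        y = constrain(jnp.zeros((32, 2048, 2048), jnp.bfloat16))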
I0421 06:59:05.423803 132263941023552 checkpointing.py:577] checkpoint manager exists so trying to load this run's existing checkpoint
I0421 06:59:05.423928 132263941023552 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0421 06:59:12.613374 132263941023552 maxtext_utils.py:1668] params/params/decoder/decoder_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0421 06:59:12.613509 132263941023552 maxtext_utils.py:1668] params/params/decoder/layers_0/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0421 06:59:12.613567 132263941023552 maxtext_utils.py:1668] params/params/decoder/layers_0/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0421 06:59:12.613626 132263941023552 maxtext_utils.py:1668] params/params/decoder/layers_0/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0421 06:59:12.613683 132263941023552 maxtext_utils.py:1668] params/params/decoder/layers_0/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0421 06:59:12.613721 132263941023552 maxtext_utils.py:1668] params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0421 06:59:12.613774 132263941023552 maxtext_utils.py:1668] params/params/decoder/layers_0/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0421 06:59:12.613824 132263941023552 maxtext_utils.py:1668] params/params/decoder/layers_0/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0421 06:59:12.613863 132263941023552 maxtext_utils.py:1668] params/params/decoder/layers_0/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0421 06:59:12.613898 132263941023552 maxtext_utils.py:1668] params/params/decoder/layers_0/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
(layers_1 through layers_15 repeat the layers_0 entries above with identical Shape, Logical, and Physical values for mlp/wi_0, mlp/wi_1, mlp/wo, the pre/post attention norm scales, and self_attention query/key/value/out; only the layer index differs, so the 540 near-identical log lines are elided.)
I0421 06:59:12.618047 132263941023552 maxtext_utils.py:1668] params/params/decoder/logits_dense/kernel
Shape: float32[2048,32000]
Logical: P('embed_vocab', 'vocab')
Physical: ('fsdp', None)
I0421 06:59:12.618091 132263941023552 maxtext_utils.py:1668] params/params/token_embedder/embedding
Shape: float32[32000,2048]
Logical: P('vocab', 'embed_vocab')
Physical: (None, 'fsdp')
I0421 06:59:17.119037 132263941023552 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0421 06:59:17.119132 132263941023552 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0421 06:59:17.134703 132263941023552 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0421 06:59:17.134763 132263941023552 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0421 07:00:13.693442 132263941023552 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0421 07:00:13.694705 132263941023552 metric_logger.py:301] number parameters: 1.104 billion
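The 1.104 billion figure is consistent with the shapes logged above. A quick check (16 identical decoder layers plus the embeddings and final norm):

    embed     = 32_000 * 2_048              # token_embedder/embedding
    unembed   = 2_048 * 32_000              # logits_dense/kernel
    per_layer = (3 * 2_048 * 7_168          # mlp wi_0, wi_1, wo
                 + 4 * 2_048 * 16 * 128     # attention query/key/value/out
                 + 2 * 2_048)               # pre/post attention norm scales
    total = embed + unembed + 16 * per_layer + 2_048  # + decoder_norm/scale
    print(total / 1e9)  # 1.104218112, matching "1.104 billion"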
I0421 07:01:15.431502 132263941023552 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0421 07:01:15.658585 132263941023552 checkpointing.py:776] Waited 0.22706866264343262 seconds for step 0 to finish before starting checkpointing.
I0421 07:01:15.662553 132263941023552 checkpoint_manager.py:2009] [process=4][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0421 07:01:15.664432 132263941023552 checkpoint_manager.py:1512] [process=4] Saving checkpoint at step 0
I0421 07:01:15.665830 132263941023552 event_tracking.py:70] [process=4] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260421_061359/linen_xpk_main_20260421_061359_13_scan_layers_false/checkpoints/0.
I0421 07:01:16.427058 132263941023552 signaling_client.py:364] Using JaxDistributedSignalingClient
I0421 07:01:16.428050 132263941023552 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0421 07:01:16.428210 132263941023552 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0421 07:01:16.731359 132263941023552 base_pytree_checkpoint_handler.py:154] [process=4][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.304835s
I0421 07:01:16.731528 132263941023552 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/blocking_gbytes_per_sec: 4.565 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.3379485607147217 s) (per-host)
I0421 07:01:16.731581 132263941023552 base_pytree_checkpoint_handler.py:768] [process=4][thread=MainThread] Initiated Pytree async_save. Time taken: 0.338012s (batch_requests_ready=0.016195s, total_serialization_initiated=0.321745s, others=0.000071s)
I0421 07:01:16.731861 132263941023552 composite_checkpoint_handler.py:715] [process=4][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.342288s (all_items=0.000017s, per_item={'items': '0.00001693'}, temp_paths=0.342271)
I0421 07:01:16.732710 132263941023552 event_tracking.py:125] [process=4] [async] Finished blocking save in 1.07 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260421_061359/linen_xpk_main_20260421_061359_13_scan_layers_false/checkpoints/0.
I0421 07:01:16.733067 132137688758016 async_checkpointer.py:76] [process=4][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-21 07:21:16.733029
I0421 07:01:16.771225 132263941023552 checkpoint_manager.py:1560] [process=4][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0421 07:01:16.771540 132141432719104 async_checkpointer.py:280] [process=4][thread=save_finalize] Waiting for background save thread=async_save.
I0421 07:01:16.771714 132263941023552 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260421_061359/linen_xpk_main_20260421_061359_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776754875.6625342, 'wait_for_prev_duration_secs': 6.4849853515625e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776754875.6644723, 'checkpointer_blocking_duration_secs': 1.0687568187713623, 'get_old_steps_start_time': 1776754876.7332535, 'get_old_steps_duration_secs': 3.0517578125e-05, 'checkpoint_manager_blocking_start_time': 1776754875.6607823, 'checkpoint_manager_blocking_duration_secs': 1.1108930110931396}
I0421 07:01:16.771899 132263941023552 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0421 07:01:16.771953 132263941023552 max_utils.py:750]
Memstats: After params initialized:
I0421 07:01:16.772005 132263941023552 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_16(process=4,(0,4,0,0))
I0421 07:01:16.772039 132263941023552 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_17(process=4,(1,4,0,0))
I0421 07:01:16.772063 132263941023552 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_20(process=4,(0,5,0,0))
I0421 07:01:16.772090 132263941023552 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_21(process=4,(1,5,0,0))
I0421 07:01:17.125569 132263941023552 metric_logger.py:196] completed step: 0, seconds: 61.737, TFLOP/s/device: 0.220, Tokens/s/device: 33.173, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0421 07:01:17.269070 132263941023552 metric_logger.py:196] completed step: 1, seconds: 1.685, TFLOP/s/device: 8.061, Tokens/s/device: 1215.070, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0421 07:01:17.677236 132263941023552 metric_logger.py:196] completed step: 2, seconds: 0.015, TFLOP/s/device: 885.155, Tokens/s/device: 133420.195, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28796.832
I0421 07:01:17.813230 132263941023552 metric_logger.py:196] completed step: 3, seconds: 0.409, TFLOP/s/device: 33.248, Tokens/s/device: 5011.575, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 16999.453
I0421 07:01:18.093954 132263941023552 metric_logger.py:196] completed step: 4, seconds: 0.143, TFLOP/s/device: 95.034, Tokens/s/device: 14324.583, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10779.230
I0421 07:01:18.105237 132263941023552 metric_logger.py:196] completed step: 5, seconds: 0.136, TFLOP/s/device: 99.989, Tokens/s/device: 15071.457, total_weights: 65536, loss: 8.901, lm_loss: 8.901, perplexity: 7336.187
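The metric fields above can be cross-checked directly: total_weights is the global token count per step, so Tokens/s/device = total_weights / (devices * step seconds), and perplexity = exp(loss). For step 0:

    import math

    print(65_536 / (32 * 61.737))  # ~33.17 Tokens/s/device, matches step 0
    print(math.exp(10.877))        # ~5.29e4, matches the logged perplexity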
I0421 07:01:20.497805 2786 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0421 07:01:23.964567 132141441111808 array_metadata_store.py:203] [process=4][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260421_061359/linen_xpk_main_20260421_061359_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_4
I0421 07:01:42.013517 132263941023552 metric_logger.py:196] completed step: 6, seconds: 0.282, TFLOP/s/device: 48.230, Tokens/s/device: 7269.758, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.308
I0421 07:01:42.151301 132263941023552 metric_logger.py:196] completed step: 7, seconds: 23.777, TFLOP/s/device: 0.571, Tokens/s/device: 86.135, total_weights: 65536, loss: 8.393, lm_loss: 8.393, perplexity: 4418.175
I0421 07:01:42.287521 132263941023552 metric_logger.py:196] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.599, Tokens/s/device: 14409.648, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.095
I0421 07:01:42.425349 132263941023552 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0421 07:01:42.428825 132263941023552 checkpointing.py:776] Waited 0.0034952163696289062 seconds for step 9 to finish before starting checkpointing.
I0421 07:01:42.431715 132263941023552 checkpoint_manager.py:2020] [process=4][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0421 07:01:49.848477 132137688758016 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/gbytes_per_sec: 47.216 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 33.45483708381653 s) (per-host)
I0421 07:01:49.848624 132137688758016 async_checkpointer.py:90] [process=4][thread=async_save] 3 Handler Commit operations completed. Time taken: 33.115441s.
I0421 07:01:59.701435 132137688758016 async_checkpointer.py:160] [process=4][thread=async_save] Background save thread done. Time taken: 42.968237s.
I0421 07:01:59.701751 132141432719104 async_checkpointer.py:288] [process=4][thread=save_finalize] Done with waiting for background save thread=async_save.
I0421 07:01:59.701883 132141432719104 async_checkpointer.py:298] [process=4][thread=save_finalize] No errors found in background save thread=async_save.
I0421 07:01:59.701936 132141432719104 checkpoint_manager.py:2137] [process=4][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0421 07:01:59.705568 132141432719104 checkpoint_manager.py:2146] [process=4][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0421 07:01:59.705794 132263941023552 checkpoint_manager.py:2032] [process=4][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0421 07:01:59.705932 132263941023552 checkpoint_manager.py:1452] Waiting for previous save to complete took 17.274217 seconds. If this number is high, consider checkpointing less frequently.
I0421 07:01:59.708531 132263941023552 checkpoint_manager.py:1512] [process=4] Saving checkpoint at step 9
I0421 07:01:59.710557 132263941023552 event_tracking.py:70] [process=4] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260421_061359/linen_xpk_main_20260421_061359_13_scan_layers_false/checkpoints/9.
I0421 07:02:00.888045 132263941023552 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0421 07:02:00.888213 132263941023552 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0421 07:02:01.009742 132263941023552 base_pytree_checkpoint_handler.py:154] [process=4][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.123262s
I0421 07:02:01.009896 132263941023552 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/blocking_gbytes_per_sec: 9.970 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.15471816062927246 s) (per-host)
I0421 07:02:01.009946 132263941023552 base_pytree_checkpoint_handler.py:768] [process=4][thread=MainThread] Initiated Pytree async_save. Time taken: 0.154777s (batch_requests_ready=0.015873s, total_serialization_initiated=0.138838s, others=0.000066s)
I0421 07:02:01.010238 132263941023552 composite_checkpoint_handler.py:715] [process=4][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.158958s (all_items=0.000016s, per_item={'items': '0.00001574'}, temp_paths=0.158942)
I0421 07:02:01.011015 132263941023552 event_tracking.py:125] [process=4] [async] Finished blocking save in 1.30 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260421_061359/linen_xpk_main_20260421_061359_13_scan_layers_false/checkpoints/9.
I0421 07:02:01.011403 132141432719104 async_checkpointer.py:76] [process=4][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-21 07:22:01.011364
I0421 07:02:01.031365 132263941023552 checkpoint_manager.py:1560] [process=4][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0421 07:02:01.031621 132137655187200 async_checkpointer.py:280] [process=4][thread=save_finalize] Waiting for background save thread=async_save.
I0421 07:02:01.031770 132263941023552 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260421_061359/linen_xpk_main_20260421_061359_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776754902.431685, 'wait_for_prev_duration_secs': 17.274216890335083, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776754919.70857, 'checkpointer_blocking_duration_secs': 1.3029780387878418, 'get_old_steps_start_time': 1776754921.0115721, 'get_old_steps_duration_secs': 3.0994415283203125e-05, 'checkpoint_manager_blocking_start_time': 1776754902.4297187, 'checkpoint_manager_blocking_duration_secs': 18.60201644897461}
I0421 07:02:01.031977 132263941023552 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0421 07:02:01.032023 132263941023552 checkpoint_manager.py:2020] [process=4][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0421 07:02:07.766698 132141443356416 array_metadata_store.py:203] [process=4][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260421_061359/linen_xpk_main_20260421_061359_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_4
I0421 07:02:44.206477 132141432719104 base_pytree_checkpoint_handler.py:130] [process=4] /jax/orbax/write/gbytes_per_sec: 36.437 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 43.351277351379395 s) (per-host)
I0421 07:02:44.206597 132141432719104 async_checkpointer.py:90] [process=4][thread=async_save] 3 Handler Commit operations completed. Time taken: 43.195080s.
I0421 07:02:54.331454 132141432719104 async_checkpointer.py:160] [process=4][thread=async_save] Background save thread done. Time taken: 53.319922s.
I0421 07:02:54.331746 132137655187200 async_checkpointer.py:288] [process=4][thread=save_finalize] Done with waiting for background save thread=async_save.
I0421 07:02:54.331868 132137655187200 async_checkpointer.py:298] [process=4][thread=save_finalize] No errors found in background save thread=async_save.
I0421 07:02:54.331916 132137655187200 checkpoint_manager.py:2137] [process=4][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0421 07:02:54.334235 132137655187200 checkpoint_manager.py:2146] [process=4][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0421 07:02:54.334432 132263941023552 checkpoint_manager.py:2032] [process=4][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0421 07:02:54.334585 132263941023552 checkpoint_manager.py:2009] [process=4][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
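The sequence above is Orbax's async-save pattern: the foreground save blocks only for the device-to-host transfer (about a second here) while the GCS write runs on the async_save thread, and the next save first waits for the previous one to finalize (hence the 17.27-second warning before step 9). A minimal sketch reusing the manager from the earlier sketch (train_state is a placeholder pytree):

    import orbax.checkpoint as ocp

    # Returns once arrays are copied to host; the write continues in background.
    mngr.save(step, args=ocp.args.StandardSave(train_state))
    # Blocks until the background write and finalize complete.
    mngr.wait_until_finished()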
I0421 07:02:54.335555 132263941023552 metric_logger.py:196] completed step: 9, seconds: 0.138, TFLOP/s/device: 98.192, Tokens/s/device: 14800.575, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3598.971
Per train step:
Total TFLOPs: 13.59
split as 93.93% learnable weight flops and 6.07% attention flops
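The split checks out against the usual matmul flop estimate, assuming MaxText counts 6 flops per parameter per token for the learnable weights (excluding the embedding lookup) and a causal-attention term of roughly 6 * layers * T^2 * heads * head_dim per device:

    n_weight  = 1.104e9 - 32_000 * 2_048              # params minus token embedding
    tokens    = 2_048                                  # tokens per device per step
    weight_tf = 6 * n_weight * tokens / 1e12           # ~12.76 TFLOPs (93.9%)
    attn_tf   = 6 * 16 * tokens**2 * 16 * 128 / 1e12   # ~0.82 TFLOPs (6.1%)
    print(weight_tf + attn_tf)                         # ~13.59, matching the log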
XPK End: Tue Apr 21 07:03:05 UTC 2026
EXIT_CODE=0