XPK Start: Sat Apr 18 13:48:13 UTC 2026
PyTorch was not found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
2026-04-18 13:48:37.969383: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0418 13:48:38.147798 135086343771968 max_utils.py:273] Attempting to initialize the jax distributed system...
I0418 13:48:47.188774 135086343771968 distributed.py:149] Starting JAX distributed service on [::]:8482
I0418 13:48:47.191061 135086343771968 distributed.py:172] Connecting to JAX distributed service on mt-13-scan-layers-false-tef63-slice-job-0-0.mt-13-scan-layers-false-tef63:8482
I0418 13:48:48.020365 135086343771968 max_utils.py:284] Jax distributed system initialized!
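The lines above show MaxText starting the JAX distributed service on port 8482 and connecting this worker (process 5) to the slice coordinator. A minimal sketch of the underlying call; on TPU slices jax.distributed.initialize() normally auto-detects all arguments from the environment, so the explicit values here are illustrative only (num_processes=8 is inferred from 32 chips at 4 local devices per host):

    import jax

    # Usually a bare jax.distributed.initialize() suffices on TPU slices;
    # explicit arguments shown for illustration.
    jax.distributed.initialize(
        coordinator_address="mt-13-scan-layers-false-tef63-slice-job-0-0."
                            "mt-13-scan-layers-false-tef63:8482",
        num_processes=8,  # inferred: 32 chips / 4 local devices per host
        process_id=5,     # this log comes from process 5
    )
    print(jax.process_index(), jax.device_count())  # 5 32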
I0418 13:48:53.158034 135086343771968 max_utils.py:800] System Information: Jax Version: 0.9.2
I0418 13:48:53.158133 135086343771968 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0418 13:48:53.158172 135086343771968 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Mar 4 2026 11:32:08 (1772652728) cl/878335365
I0418 13:48:53.158205 135086343771968 train_utils.py:348] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to enable sequence packing.
I0418 13:48:53.866006 135086343771968 maxtext_utils.py:1551] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
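The 13-axis mesh shape above places all 32 chips on a single non-trivial axis, the 'fsdp' axis (confirmed by "fsdp: 32" later in this log). A reduced sketch of building such a mesh, keeping only three of MaxText's axis names for illustration:

    from jax.experimental import mesh_utils
    from jax.sharding import Mesh

    # Illustrative 3-axis version of MaxText's 13-axis mesh: all 32 devices
    # land on 'fsdp'; the other axes have size 1 (assumes 32 devices present).
    devices = mesh_utils.create_device_mesh((1, 32, 1))
    mesh = Mesh(devices, axis_names=("data", "fsdp", "tensor"))
    print(mesh.shape)  # {'data': 1, 'fsdp': 32, 'tensor': 1}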
I0418 13:48:53.866279 135086343771968 checkpointing.py:677] Setting up checkpoint logger...
I0418 13:48:53.866331 135086343771968 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0418 13:48:53.866380 135086343771968 pytree_checkpoint_handler.py:592] save_device_host_concurrent_bytes=None
I0418 13:48:53.866751 135086343771968 base_pytree_checkpoint_handler.py:441] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7adba6fb1520>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0418 13:48:56.778483 135086343771968 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0418 13:48:56.778713 135086343771968 checkpoint_manager.py:708] [process=5][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7ac7e43f6c30>}, handler_registry=None
I0418 13:48:56.778948 135086343771968 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7ac7e43f6c30>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0418 13:48:56.779001 135086343771968 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7ac7e43f9040>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0418 13:48:56.779039 135086343771968 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7ac7e43f6c30>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7ac7e43f6c30>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7ac7e43f9040>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7ac7e43f9040>}).
I0418 13:48:56.779355 135086343771968 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.34
I0418 13:48:56.779426 135086343771968 async_checkpointer.py:192] [process=5][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7ac6dc784180> timeout: 1200 secs and primary_host=0 for async checkpoint writes
I0418 13:48:58.684015 135086343771968 checkpoint_manager.py:1812] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_main_20260418_125719/linen_xpk_main_20260418_125719_13_scan_layers_false/checkpoints
I0418 13:48:59.142845 135086343771968 checkpoint_manager.py:929] [process=5][thread=MainThread] CheckpointManager created, primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_main_20260418_125719/linen_xpk_main_20260418_125719_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7ac7e43f7860>
I0418 13:48:59.143025 135086343771968 checkpointing.py:301] Checkpoint manager created!
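The manager above saves asynchronously to GCS on a fixed 10-step interval (the FixedIntervalPolicy in the options dump). A stand-alone sketch of an equivalent Orbax setup; MaxText wires this up internally, and the OCDBT/Zarr3 options are enabled on the PyTree handler rather than in these manager options:

    import orbax.checkpoint as ocp

    options = ocp.CheckpointManagerOptions(
        save_interval_steps=10,
        enable_async_checkpointing=True,
    )
    mngr = ocp.CheckpointManager(
        "gs://lance-maxtext/linen_ckpt_xpk_main_20260418_125719/"
        "linen_xpk_main_20260418_125719_13_scan_layers_false/checkpoints",
        options=options,
    )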
I0418 13:49:00.085604 135086343771968 nnx_wrappers.py:437] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0418 13:49:00.085702 135086343771968 nnx_wrappers.py:437] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0418 13:49:00.462805 135086343771968 attentions.py:1088] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0418 13:49:00.462893 135086343771968 attentions.py:1088] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0418 13:49:00.478940 135086343771968 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0418 13:49:00.479000 135086343771968 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0418 13:49:00.508197 135086343771968 attentions.py:1154] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0418 13:49:00.508262 135086343771968 attentions.py:1154] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0418 13:49:00.524494 135086343771968 attentions.py:1155] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0418 13:49:00.524554 135086343771968 attentions.py:1155] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0418 13:49:00.540570 135086343771968 attentions.py:1156] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0418 13:49:00.540629 135086343771968 attentions.py:1156] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0418 13:49:00.576065 135086343771968 attentions.py:1197] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0418 13:49:00.576173 135086343771968 attentions.py:1197] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0418 13:49:00.604913 135086343771968 linears.py:525] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0418 13:49:00.605006 135086343771968 linears.py:525] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
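Each Logical/Physical pair above is a named logical sharding spec being resolved against the mesh: with only fsdp=32 populated, batch-like axes map to 'fsdp' and everything else is replicated, hence ('fsdp', None, None). A sketch of that resolution using Flax's logical axis rules (a deliberately tiny rule set; MaxText's full table is much larger):

    from flax import linen as nn
    from jax.experimental import mesh_utils
    from jax.sharding import Mesh, PartitionSpec

    # Simplified rules: batch-like logical axes shard over 'fsdp'; axes left
    # unmapped stay replicated, reproducing ('fsdp', None, None) from the log.
    rules = (("activation_batch", "fsdp"),)
    mesh = Mesh(mesh_utils.create_device_mesh((32,)), ("fsdp",))
    logical = PartitionSpec("activation_batch", "activation_norm_length",
                            "activation_embed")
    print(nn.logical_to_mesh_sharding(logical, mesh, rules))
    # -> NamedSharding over the mesh with spec roughly ('fsdp', None, None)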
I0418 13:49:03.456355 135086343771968 checkpointing.py:577] Checkpoint manager exists, so trying to load this run's existing checkpoint.
I0418 13:49:03.456492 135086343771968 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
fsdp: 32
I0418 13:49:10.627205 135086343771968 maxtext_utils.py:1654] params/params/decoder/decoder_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.627331 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_0/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.627383 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_0/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.627438 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_0/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.627493 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_0/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.627531 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.627580 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_0/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.627629 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_0/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.627679 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_0/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.627719 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_0/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.627754 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_1/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.627788 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_1/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.627819 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_1/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.627850 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_1/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.627879 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_1/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.627911 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_1/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.627946 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_1/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.627980 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_1/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.628011 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_1/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.628042 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_10/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.628073 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_10/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.628103 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_10/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.628132 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_10/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.628160 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_10/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.628191 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_10/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.628222 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_10/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.628252 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_10/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.628282 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_10/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.628311 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_11/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.628341 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_11/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.628371 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_11/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.628414 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_11/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.628447 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_11/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.628491 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_11/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.628523 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_11/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.628554 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_11/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.628584 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_11/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.628613 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_12/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.628643 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_12/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.628672 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_12/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.628704 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_12/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.628733 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_12/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.628763 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_12/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.628794 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_12/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.628825 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_12/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.628855 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_12/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.628885 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_13/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.628914 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_13/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.628943 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_13/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.628970 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_13/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.628998 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_13/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.629027 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_13/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.629056 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_13/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.629086 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_13/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.629114 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_13/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.629143 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_14/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.629173 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_14/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.629202 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_14/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.629230 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_14/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.629257 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_14/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.629286 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_14/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.629316 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_14/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.629345 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_14/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.629375 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_14/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.629404 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_15/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.629434 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_15/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.629476 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_15/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.629505 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_15/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.629533 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_15/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.629563 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_15/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.629593 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_15/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.629624 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_15/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.629653 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_15/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.629683 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_2/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.629718 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_2/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.629749 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_2/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.629776 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_2/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.629804 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_2/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.629833 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_2/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.629864 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_2/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.629896 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_2/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.629925 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_2/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.629954 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_3/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.629983 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_3/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.630012 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_3/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.630039 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_3/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.630067 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_3/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.630096 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_3/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.630125 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_3/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.630155 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_3/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.630183 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_3/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.630211 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_4/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.630240 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_4/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.630270 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_4/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.630296 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_4/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.630323 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_4/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.630352 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_4/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.630381 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_4/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.630410 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_4/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.630438 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_4/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.630480 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_5/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.630511 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_5/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.630540 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_5/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.630568 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_5/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.630594 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_5/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.630624 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_5/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.630652 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_5/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.630681 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_5/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.630715 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_5/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.630744 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_6/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.630773 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_6/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.630802 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_6/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.630829 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_6/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.630856 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_6/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.630887 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_6/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.630916 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_6/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.630945 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_6/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.630975 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_6/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.631003 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_7/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.631032 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_7/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.631061 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_7/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.631088 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_7/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.631115 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_7/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.631144 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_7/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.631172 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_7/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.631201 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_7/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.631230 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_7/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.631258 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_8/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.631287 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_8/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.631315 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_8/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.631342 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_8/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.631368 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_8/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.631397 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_8/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.631426 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_8/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.631464 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_8/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.631497 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_8/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.631526 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_9/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.631555 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_9/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0418 13:49:10.631584 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_9/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0418 13:49:10.631612 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_9/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.631639 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_9/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0418 13:49:10.631669 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_9/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.631701 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_9/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0418 13:49:10.631731 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_9/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0418 13:49:10.631762 135086343771968 maxtext_utils.py:1654] params/params/decoder/layers_9/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0418 13:49:10.631806 135086343771968 maxtext_utils.py:1654] params/params/decoder/logits_dense/kernel
Shape: float32[2048,32000]
Logical: P('embed_vocab', 'vocab')
Physical: ('fsdp', None)
I0418 13:49:10.631848 135086343771968 maxtext_utils.py:1654] params/params/token_embedder/embedding
Shape: float32[32000,2048]
Logical: P('vocab', 'embed_vocab')
Physical: (None, 'fsdp')
I0418 13:49:15.085653 135086343771968 train.py:155] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0418 13:49:15.085746 135086343771968 train.py:155] train/xent Physical: float32[32,2048]............................................ ('fsdp', None).
I0418 13:49:15.101133 135086343771968 train.py:162] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length').
I0418 13:49:15.101191 135086343771968 train.py:162] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
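train/xent and train/z_loss above are the two per-token loss terms: standard cross-entropy plus a small z-loss that penalizes the squared log-partition of the logits so they stay well-scaled. A sketch of the usual formulation (the coefficient here is illustrative; MaxText reads it from config):

    import jax.numpy as jnp
    from jax.scipy.special import logsumexp

    def xent_with_z_loss(logits, targets, z_loss_coeff=1e-4):
        # logits: [batch, seq, vocab], targets: [batch, seq] int32
        log_z = logsumexp(logits, axis=-1)            # [batch, seq]
        log_probs = logits - log_z[..., None]
        xent = -jnp.take_along_axis(
            log_probs, targets[..., None], axis=-1)[..., 0]
        z_loss = z_loss_coeff * jnp.square(log_z)     # keeps log Z near 0
        return xent + z_loss, xent, z_loss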
I0418 13:50:11.559489 135086343771968 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.5 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0418 13:50:11.560711 135086343771968 metric_logger.py:301] number parameters: 1.104 billion
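The 1.104 billion figure is consistent with the parameter shapes dumped above; a quick arithmetic check:

    # Per layer: 3 MLP kernels, 4 attention kernels, 2 norm scales.
    mlp = 3 * 2048 * 7168            # wi_0, wi_1, wo
    attn = 4 * 2048 * 16 * 128       # query, key, value, out
    per_layer = mlp + attn + 2 * 2048
    total = (16 * per_layer          # 16 decoder layers
             + 2048                  # final decoder_norm scale
             + 2 * 2048 * 32000)     # logits_dense + token_embedder
    print(total)  # 1,104,218,112 ~= 1.104 billion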
I0418 13:51:12.240567 135086343771968 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0418 13:51:12.476859 135086343771968 checkpointing.py:776] Waited 0.23627209663391113 seconds for step 0 to finish before starting checkpointing.
I0418 13:51:12.480198 135086343771968 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0418 13:51:12.482236 135086343771968 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 0
I0418 13:51:12.483686 135086343771968 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260418_125719/linen_xpk_main_20260418_125719_13_scan_layers_false/checkpoints/0.
I0418 13:51:13.702794 135086343771968 signaling_client.py:364] Using JaxDistributedSignalingClient
I0418 13:51:13.703898 135086343771968 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0418 13:51:13.704051 135086343771968 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0418 13:51:14.020795 135086343771968 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.318543s
I0418 13:51:14.020963 135086343771968 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 4.387 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.3516407012939453 s) (per-host)
I0418 13:51:14.021018 135086343771968 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.351706s (batch_requests_ready=0.016816s, total_serialization_initiated=0.334817s, others=0.000072s)
I0418 13:51:14.021284 135086343771968 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.356108s (all_items=0.000018s, per_item={'items': '0.00001836'}, temp_paths=0.356090)
I0418 13:51:14.022059 135086343771968 event_tracking.py:125] [process=5] [async] Finished blocking save in 1.54 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260418_125719/linen_xpk_main_20260418_125719_13_scan_layers_false/checkpoints/0.
I0418 13:51:14.022418 134960838104832 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-18 14:11:14.022385
I0418 13:51:14.052897 135086343771968 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0418 13:51:14.053203 134964063725312 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0418 13:51:14.053357 135086343771968 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260418_125719/linen_xpk_main_20260418_125719_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776520272.4801793, 'wait_for_prev_duration_secs': 6.103515625e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776520272.4822738, 'checkpointer_blocking_duration_secs': 1.5402607917785645, 'get_old_steps_start_time': 1776520274.0225568, 'get_old_steps_duration_secs': 2.5272369384765625e-05, 'checkpoint_manager_blocking_start_time': 1776520272.4784749, 'checkpoint_manager_blocking_duration_secs': 1.574847936630249}
I0418 13:51:14.053542 135086343771968 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0418 13:51:14.053593 135086343771968 max_utils.py:750]
Memstats: After params initialized:
I0418 13:51:14.053643 135086343771968 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_18(process=5,(2,4,0,0))
I0418 13:51:14.053676 135086343771968 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_19(process=5,(3,4,0,0))
I0418 13:51:14.053703 135086343771968 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_22(process=5,(2,5,0,0))
I0418 13:51:14.053727 135086343771968 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_23(process=5,(3,5,0,0))
I0418 13:51:14.403320 135086343771968 metric_logger.py:196] completed step: 0, seconds: 60.680, TFLOP/s/device: 0.224, Tokens/s/device: 33.751, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0418 13:51:14.548415 135086343771968 metric_logger.py:196] completed step: 1, seconds: 2.154, TFLOP/s/device: 6.309, Tokens/s/device: 950.897, total_weights: 65536, loss: 10.877, lm_loss: 10.877, perplexity: 52938.617
I0418 13:51:14.973775 135086343771968 metric_logger.py:196] completed step: 2, seconds: 0.018, TFLOP/s/device: 762.337, Tokens/s/device: 114907.704, total_weights: 65536, loss: 10.268, lm_loss: 10.268, perplexity: 28789.904
I0418 13:51:15.110116 135086343771968 metric_logger.py:196] completed step: 3, seconds: 0.426, TFLOP/s/device: 31.898, Tokens/s/device: 4807.986, total_weights: 65536, loss: 9.741, lm_loss: 9.741, perplexity: 16998.150
I0418 13:51:15.389220 135086343771968 metric_logger.py:196] completed step: 4, seconds: 0.143, TFLOP/s/device: 95.179, Tokens/s/device: 14346.358, total_weights: 65536, loss: 9.285, lm_loss: 9.285, perplexity: 10777.912
I0418 13:51:15.400831 135086343771968 metric_logger.py:196] completed step: 5, seconds: 0.137, TFLOP/s/device: 99.444, Tokens/s/device: 14989.278, total_weights: 65536, loss: 8.901, lm_loss: 8.901, perplexity: 7336.293
I0418 13:51:18.351915 2764 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0418 13:51:20.899430 134964072118016 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260418_125719/linen_xpk_main_20260418_125719_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_5
I0418 13:51:39.987125 135086343771968 metric_logger.py:196] completed step: 6, seconds: 0.280, TFLOP/s/device: 48.582, Tokens/s/device: 7322.759, total_weights: 65536, loss: 8.602, lm_loss: 8.602, perplexity: 5440.174
I0418 13:51:40.124882 135086343771968 metric_logger.py:196] completed step: 7, seconds: 24.455, TFLOP/s/device: 0.556, Tokens/s/device: 83.745, total_weights: 65536, loss: 8.393, lm_loss: 8.393, perplexity: 4418.041
I0418 13:51:40.261074 135086343771968 metric_logger.py:196] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.802, Tokens/s/device: 14440.331, total_weights: 65536, loss: 8.264, lm_loss: 8.264, perplexity: 3882.552
I0418 13:51:40.399011 135086343771968 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0418 13:51:40.402363 135086343771968 checkpointing.py:776] Waited 0.003378152847290039 seconds for step 9 to finish before starting checkpointing.
I0418 13:51:40.405472 135086343771968 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0418 13:51:48.248216 134960838104832 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 45.681 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 34.57885932922363 s) (per-host)
I0418 13:51:48.248338 134960838104832 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 34.225265s.
I0418 13:51:57.659425 134960838104832 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 43.636337s.
I0418 13:51:57.659729 134964063725312 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0418 13:51:57.659845 134964063725312 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0418 13:51:57.659895 134964063725312 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0418 13:51:57.662445 134964063725312 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0418 13:51:57.662631 135086343771968 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0418 13:51:57.662760 135086343771968 checkpoint_manager.py:1452] Waiting for previous save to complete took 17.257305 seconds. If this number is high, consider checkpointing less frequently.
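The warning fires because the step-9 save had to block until the step-0 background write to GCS finished (the 17.3 s wait also explains the 24.5 s step-7 stall in the metrics above). The async pattern in use, sketched against Orbax's API with a toy state (MaxText drives this through its checkpointing helpers):

    import jax.numpy as jnp
    import orbax.checkpoint as ocp

    mngr = ocp.CheckpointManager(
        "/tmp/ckpt_demo",  # local path for the sketch
        options=ocp.CheckpointManagerOptions(enable_async_checkpointing=True),
    )
    state = {"w": jnp.zeros((4, 4))}

    # save() returns after device-to-host transfer; the write continues on a
    # background thread. Mirrors the ('items', PyTreeSaveArgs) registration
    # earlier in the log.
    mngr.save(0, args=ocp.args.Composite(items=ocp.args.PyTreeSave(state)))
    mngr.wait_until_finished()  # what the "Waiting for ..." lines above do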
I0418 13:51:57.665398 135086343771968 checkpoint_manager.py:1512] [process=5] Saving checkpoint at step 9
I0418 13:51:57.667436 135086343771968 event_tracking.py:70] [process=5] [async] Started save checkpoint @ gs://lance-maxtext/linen_ckpt_xpk_main_20260418_125719/linen_xpk_main_20260418_125719_13_scan_layers_false/checkpoints/9.
I0418 13:51:58.419292 135086343771968 jax_array_handlers.py:360] Scheduling D2H of 444 prioritized jax.Array.
I0418 13:51:58.419465 135086343771968 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0418 13:51:58.546051 135086343771968 base_pytree_checkpoint_handler.py:154] [process=5][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.128099s
I0418 13:51:58.546212 135086343771968 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/blocking_gbytes_per_sec: 9.635 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.16009998321533203 s) (per-host)
I0418 13:51:58.546260 135086343771968 base_pytree_checkpoint_handler.py:768] [process=5][thread=MainThread] Initiated Pytree async_save. Time taken: 0.160157s (batch_requests_ready=0.016508s, total_serialization_initiated=0.143584s, others=0.000066s)
I0418 13:51:58.546561 135086343771968 composite_checkpoint_handler.py:715] [process=5][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.164485s (all_items=0.000015s, per_item={'items': '0.00001550'}, temp_paths=0.164469)
I0418 13:51:58.547327 135086343771968 event_tracking.py:125] [process=5] [async] Finished blocking save in 0.88 seconds. Continuing save @ gs://lance-maxtext/linen_ckpt_xpk_main_20260418_125719/linen_xpk_main_20260418_125719_13_scan_layers_false/checkpoints/9.
I0418 13:51:58.547656 134964063725312 async_checkpointer.py:76] [process=5][thread=async_save] Background save thread started. Deadline for this save operation is 2026-04-18 14:11:58.547622
I0418 13:51:58.552970 135086343771968 checkpoint_manager.py:1560] [process=5][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0418 13:51:58.553268 134961376974592 async_checkpointer.py:280] [process=5][thread=save_finalize] Waiting for background save thread=async_save.
I0418 13:51:58.553431 135086343771968 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_main_20260418_125719/linen_xpk_main_20260418_125719_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776520300.4054263, 'wait_for_prev_duration_secs': 17.257305145263672, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776520317.6654377, 'checkpointer_blocking_duration_secs': 0.8823788166046143, 'get_old_steps_start_time': 1776520318.547842, 'get_old_steps_duration_secs': 3.147125244140625e-05, 'checkpoint_manager_blocking_start_time': 1776520300.40322, 'checkpoint_manager_blocking_duration_secs': 18.15017580986023}
I0418 13:51:58.553644 135086343771968 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0418 13:51:58.553695 135086343771968 checkpoint_manager.py:2020] [process=5][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0418 13:52:04.924393 134964072118016 array_metadata_store.py:203] [process=5][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_main_20260418_125719/linen_xpk_main_20260418_125719_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_5
I0418 13:52:40.464184 134964063725312 base_pytree_checkpoint_handler.py:130] [process=5] /jax/orbax/write/gbytes_per_sec: 37.540 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.0780291557312 s) (per-host)
I0418 13:52:40.464312 134964063725312 async_checkpointer.py:90] [process=5][thread=async_save] 3 Handler Commit operations completed. Time taken: 41.916524s.
I0418 13:52:51.185055 134964063725312 async_checkpointer.py:160] [process=5][thread=async_save] Background save thread done. Time taken: 52.637254s.
I0418 13:52:51.185330 134961376974592 async_checkpointer.py:288] [process=5][thread=save_finalize] Done with waiting for background save thread=async_save.
I0418 13:52:51.185447 134961376974592 async_checkpointer.py:298] [process=5][thread=save_finalize] No errors found in background save thread=async_save.
I0418 13:52:51.185515 134961376974592 checkpoint_manager.py:2137] [process=5][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0418 13:52:51.187413 134961376974592 checkpoint_manager.py:2146] [process=5][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0418 13:52:51.187588 135086343771968 checkpoint_manager.py:2032] [process=5][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0418 13:52:51.187753 135086343771968 checkpoint_manager.py:2009] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0418 13:52:51.188685 135086343771968 metric_logger.py:196] completed step: 9, seconds: 0.138, TFLOP/s/device: 98.593, Tokens/s/device: 14861.041, total_weights: 65536, loss: 8.188, lm_loss: 8.188, perplexity: 3598.678
Per train step:
Total TFLOPs: 13.59
split as 93.93% learnable weight flops and 6.07% attention flops
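The 13.59 TFLOPs per device per step reconcile with the logged shapes and batch. A back-of-the-envelope check, assuming the standard 6ND estimate for matmul flops (the embedding lookup contributes none) and causal attention at half the full quadratic cost:

    tokens_per_device = 65536 // 32               # total_weights / num_devices
    matmul_params = 1_104_218_112 - 32000 * 2048  # exclude token_embedder
    weight = 6 * matmul_params * tokens_per_device / 1e12   # ~12.76 TFLOPs
    attn = 12 * 16 * 2048**2 * 2048 * 0.5 / 1e12            # ~0.82 TFLOPs
    print(weight + attn, weight / (weight + attn))  # ~13.59, ~0.9393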
XPK End: Sat Apr 18 13:53:04 UTC 2026
EXIT_CODE=0