XPK Start: Thu Apr 23 16:37:17 UTC 2026
2026-04-23 16:37:21.785828: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1776962241.798706 10 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1776962241.802672 10 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1776962241.814060 10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776962241.814080 10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776962241.814083 10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776962241.814085 10 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2026-04-23 16:37:40.974530: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0423 16:37:41.497771 138716912764736 max_utils.py:273] Attempting to initialize the jax distributed system...
INFO:2026-04-23 16:37:50,540:jax._src.distributed:140: Starting JAX distributed service on [::]:8482
I0423 16:37:50.540167 138716912764736 distributed.py:140] Starting JAX distributed service on [::]:8482
INFO:2026-04-23 16:37:50,542:jax._src.distributed:157: Connecting to JAX distributed service on mt-13-scan-layers-false-3w13e-slice-job-0-0.mt-13-scan-layers-false-3w13e:8482
I0423 16:37:50.542410 138716912764736 distributed.py:157] Connecting to JAX distributed service on mt-13-scan-layers-false-3w13e-slice-job-0-0.mt-13-scan-layers-false-3w13e:8482
I0423 16:37:51.671136 138716912764736 max_utils.py:284] Jax distributed system initialized!
I0423 16:37:58.535441 138716912764736 max_utils.py:800] System Information: Jax Version: 0.8.1
I0423 16:37:58.535551 138716912764736 max_utils.py:801] System Information: Jaxlib Version: 0.8.1
I0423 16:37:58.535594 138716912764736 max_utils.py:802] System Information: Jax Backend: PJRT C API TFRT TPU v6 lite Built on Nov 12 2025 14:16:36 (1762985796) cl/831091709
I0423 16:37:58.535628 138716912764736 train_utils.py:377] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
I0423 16:37:59.154981 138716912764736 maxtext_utils.py:1631] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0423 16:37:59.155588 138716912764736 maxtext_utils.py:1631] Num_devices: 32, shape (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0423 16:37:59.155775 138716912764736 checkpointing.py:688] Setting up checkpoint logger...
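The mesh shape logged above has 13 axes whose sizes must multiply to the device count; per the "fsdp: 32" line later in this log, only the fsdp axis is larger than 1 here. A minimal sketch (plain Python, illustrative only) of that invariant:

```python
import math

# Mesh axis sizes as logged: "Num_devices: 32, shape (1, 1, 1, 32, 1, ...)".
# In this run only one axis (fsdp) is >1; all others are trivial.
mesh_shape = (1, 1, 1, 32, 1, 1, 1, 1, 1, 1, 1, 1, 1)
num_devices = 32

# The device count must equal the product of all mesh axis sizes.
assert math.prod(mesh_shape) == num_devices
print(math.prod(mesh_shape))
```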
I0423 16:37:59.155825 138716912764736 checkpointing.py:234] Creating checkpoint manager with ocdbt=True and zarr3=True
I0423 16:37:59.155869 138716912764736 pytree_checkpoint_handler.py:589] save_device_host_concurrent_bytes=None
I0423 16:37:59.156245 138716912764736 base_pytree_checkpoint_handler.py:415] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7e290f74f9b0>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0423 16:38:02.120423 138716912764736 checkpointing.py:266] Enabling policy for fixed interval checkpointing.
I0423 16:38:02.120770 138716912764736 checkpoint_manager.py:709] [process=6][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e1691f717c0>}, handler_registry=None
I0423 16:38:02.121001 138716912764736 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e1691f717c0>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0423 16:38:02.121050 138716912764736 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7e1691f76ed0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0423 16:38:02.121086 138716912764736 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e1691f717c0>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7e1691f717c0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7e1691f76ed0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7e1691f76ed0>}).
I0423 16:38:02.121609 138716912764736 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.33
I0423 16:38:02.121688 138716912764736 async_checkpointer.py:177] [process=6][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7e1470440180> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0423 16:38:03.594804 138716912764736 checkpoint_manager.py:1818] Found 0 checkpoint steps in gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260423_155242/linen_xpk_feat_nnx_set_defaults_true_20260423_155242_13_scan_layers_false/checkpoints
I0423 16:38:03.678042 138716912764736 checkpoint_manager.py:929] [process=6][thread=MainThread] CheckpointManager created, primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False, lightweight_initialize=False), root_directory=gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260423_155242/linen_xpk_feat_nnx_set_defaults_true_20260423_155242_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7e1691f719d0>
I0423 16:38:03.678246 138716912764736 checkpointing.py:302] Checkpoint manager created!
I0423 16:38:04.619714 138716912764736 nnx_wrappers.py:455] Unknown Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0423 16:38:04.619896 138716912764736 nnx_wrappers.py:455] Unknown Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0423 16:38:05.390814 138716912764736 attentions.py:1084] attentions/inputs_q Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0423 16:38:05.390950 138716912764736 attentions.py:1084] attentions/inputs_q Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0423 16:38:05.532203 138716912764736 attentions.py:1085] attentions/inputs_kv Logical: bfloat16[32,2048,2048]...................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0423 16:38:05.532342 138716912764736 attentions.py:1085] attentions/inputs_kv Physical: bfloat16[32,2048,2048]...................................... ('fsdp', None, None).
I0423 16:38:05.682388 138716912764736 attentions.py:1150] attentions/query Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0423 16:38:05.682528 138716912764736 attentions.py:1150] attentions/query Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0423 16:38:05.821759 138716912764736 attentions.py:1151] attentions/key Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0423 16:38:05.821896 138716912764736 attentions.py:1151] attentions/key Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0423 16:38:05.964170 138716912764736 attentions.py:1152] attentions/value Logical: bfloat16[32,2048,16,128].................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0423 16:38:05.964306 138716912764736 attentions.py:1152] attentions/value Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0423 16:38:06.114753 138716912764736 attentions.py:1193] attentions/out Logical: bfloat16[32,2048,16,128].................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0423 16:38:06.114890 138716912764736 attentions.py:1193] attentions/out Physical: bfloat16[32,2048,16,128].................................... ('fsdp', None, None, None).
I0423 16:38:06.137970 138716912764736 linears.py:541] linears/x Logical: bfloat16[32,2048,7168]...................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0423 16:38:06.138068 138716912764736 linears.py:541] linears/x Physical: bfloat16[32,2048,7168]...................................... ('fsdp', None, None).
I0423 16:38:20.508308 138716912764736 checkpointing.py:578] checkpoint manager exists so trying to load this run's existing checkpoint
I0423 16:38:20.508436 138716912764736 checkpointing.py:676] No existing checkpoints found, not restoring checkpoint.
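Each Logical/Physical pair above is a tuple of logical axis names resolved to mesh axes through sharding rules: axes with a matching rule map to a mesh axis (here 'fsdp'), the rest map to None (replicated). A minimal sketch of that lookup in plain Python; the rule table is an assumption inferred from this log's output, not MaxText's actual configuration:

```python
# Hypothetical logical-axis -> mesh-axis rule table, inferred from the
# Logical/Physical pairs in the log; real rules live in the MaxText config.
RULES = {
    "activation_batch": "fsdp",
    "activation_kv_batch": "fsdp",
    "embed": "fsdp",
}

def logical_to_physical(logical_axes):
    """Map each logical axis name to its mesh axis; None means unsharded."""
    return tuple(RULES.get(axis) for axis in logical_axes)

# Mirrors e.g. "attentions/query Logical: ('activation_kv_batch', ...)
# Physical: ('fsdp', None, None, None)" from the log above.
print(logical_to_physical(
    ("activation_kv_batch", "activation_attn_length",
     "activation_kv_heads", "activation_kv_head_dim")))
# → ('fsdp', None, None, None)
```

Only the batch-like and embed axes hit a rule here, which is why every tensor in the log is sharded on exactly one dimension over the 32-way fsdp axis.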
fsdp: 32
I0423 16:38:29.350506 138716912764736 maxtext_utils.py:1740] params/params/decoder/decoder_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.350640 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_0/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.350695 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_0/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.350754 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_0/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.350794 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_0/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.350831 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_0/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.350884 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_0/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.350937 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_0/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.350977 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_0/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.351013 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_0/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.351050 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_1/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.351085 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_1/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.351144 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_1/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.351179 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_1/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.351212 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_1/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.351246 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_1/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.351283 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_1/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.351317 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_1/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.351368 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_1/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.351406 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_10/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.351439 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_10/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.351490 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_10/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.351545 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_10/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.351582 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_10/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.351617 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_10/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.351650 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_10/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.351683 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_10/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.351715 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_10/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.351746 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_11/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.351778 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_11/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.351809 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_11/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.351838 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_11/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.351867 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_11/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.351898 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_11/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.351929 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_11/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.351961 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_11/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.351992 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_11/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.352022 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_12/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.352053 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_12/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.352084 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_12/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.352129 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_12/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.352160 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_12/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.352193 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_12/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.352225 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_12/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.352260 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_12/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.352292 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_12/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.352323 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_13/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.352354 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_13/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.352387 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_13/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.352418 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_13/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.352448 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_13/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.352484 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_13/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.352517 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_13/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.352548 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_13/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.352579 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_13/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.352610 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_14/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.352640 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_14/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.352671 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_14/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.352699 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_14/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.352727 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_14/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.352758 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_14/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.352789 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_14/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.352820 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_14/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.352850 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_14/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.352881 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_15/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.352912 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_15/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.352942 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_15/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.352969 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_15/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.352997 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_15/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.353027 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_15/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.353058 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_15/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.353088 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_15/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.353136 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_15/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.353167 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_2/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.353198 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_2/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.353228 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_2/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.353256 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_2/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.353284 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_2/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.353315 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_2/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.353347 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_2/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.353377 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_2/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.353408 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_2/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.353437 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_3/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.353467 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_3/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.353502 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_3/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.353530 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_3/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.353558 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_3/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.353589 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_3/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.353619 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_3/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.353649 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_3/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.353680 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_3/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.353711 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_4/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.353741 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_4/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.353770 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_4/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.353798 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_4/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.353825 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_4/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.353854 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_4/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.353884 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_4/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.353914 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_4/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.353944 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_4/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.353974 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_5/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.354003 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_5/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.354033 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_5/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.354061 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_5/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.354087 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_5/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.354133 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_5/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.354165 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_5/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.354195 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_5/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.354225 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_5/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.354255 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_6/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.354287 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_6/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.354317 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_6/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.354345 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_6/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.354373 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_6/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,)
I0423 16:38:29.354403 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_6/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.354434 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_6/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp')
I0423 16:38:29.354465 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_6/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None)
I0423 16:38:29.354500 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_6/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None)
I0423 16:38:29.354532 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_7/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.354562 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_7/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None)
I0423 16:38:29.354592 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_7/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp')
I0423 16:38:29.354619 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_7/post_self_attention_layer_norm/scale Shape:
float32[2048] Logical: PartitionSpec('norm',) Physical: (None,) I0423 16:38:29.354647 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_7/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,) I0423 16:38:29.354677 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_7/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None) I0423 16:38:29.354707 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_7/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp') I0423 16:38:29.354738 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_7/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None) I0423 16:38:29.354768 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_7/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None) I0423 16:38:29.354798 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_8/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None) I0423 16:38:29.354828 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_8/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None) I0423 16:38:29.354858 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_8/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp') I0423 16:38:29.354885 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_8/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,) I0423 
16:38:29.354912 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_8/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,) I0423 16:38:29.354941 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_8/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None) I0423 16:38:29.354972 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_8/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp') I0423 16:38:29.355004 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_8/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None) I0423 16:38:29.355034 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_8/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None) I0423 16:38:29.355065 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_9/mlp/wi_0/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None) I0423 16:38:29.355112 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_9/mlp/wi_1/kernel Shape: float32[2048,7168] Logical: PartitionSpec('embed', 'mlp') Physical: ('fsdp', None) I0423 16:38:29.355148 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_9/mlp/wo/kernel Shape: float32[7168,2048] Logical: PartitionSpec('mlp', 'embed') Physical: (None, 'fsdp') I0423 16:38:29.355177 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_9/post_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,) I0423 16:38:29.355205 138716912764736 maxtext_utils.py:1740] 
params/params/decoder/layers_9/pre_self_attention_layer_norm/scale Shape: float32[2048] Logical: PartitionSpec('norm',) Physical: (None,) I0423 16:38:29.355236 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_9/self_attention/key/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None) I0423 16:38:29.355266 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_9/self_attention/out/kernel Shape: float32[16,128,2048] Logical: PartitionSpec('heads', 'kv', 'embed') Physical: (None, None, 'fsdp') I0423 16:38:29.355297 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_9/self_attention/query/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'q_heads', 'kv') Physical: ('fsdp', None, None) I0423 16:38:29.355327 138716912764736 maxtext_utils.py:1740] params/params/decoder/layers_9/self_attention/value/kernel Shape: float32[2048,16,128] Logical: PartitionSpec('embed', 'kv_heads', 'kv_head_dim') Physical: ('fsdp', None, None) I0423 16:38:29.355376 138716912764736 maxtext_utils.py:1740] params/params/decoder/logits_dense/kernel Shape: float32[2048,32000] Logical: PartitionSpec('embed', 'vocab') Physical: ('fsdp', None) I0423 16:38:29.355421 138716912764736 maxtext_utils.py:1740] params/params/token_embedder/embedding Shape: float32[32000,2048] Logical: PartitionSpec('vocab', 'embed') Physical: (None, 'fsdp') I0423 16:38:33.271581 138716912764736 train.py:158] train/xent Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length'). I0423 16:38:33.271681 138716912764736 train.py:158] train/xent Physical: float32[32,2048]............................................ ('fsdp', None). I0423 16:38:33.287384 138716912764736 train.py:165] train/z_loss Logical: float32[32,2048]............................................ ('activation_embed_and_logits_batch', 'activation_length'). 
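Editor's note: the dump above resolves each logical axis name in a parameter's `PartitionSpec` to a physical mesh axis. In this run only `'embed'` lands on the `'fsdp'` mesh axis; every other logical axis (`'mlp'`, `'norm'`, `'heads'`, `'vocab'`, ...) resolves to `None`, i.e. replicated. A pure-Python sketch of that resolution rule (not MaxText's actual resolver, just the mapping these log lines imply):

```python
def to_physical(logical_spec):
    """Map a logical PartitionSpec (tuple of axis names) to physical mesh axes.

    Rule implied by the log: 'embed' shards over the 'fsdp' mesh axis,
    every other logical axis is replicated (None).
    """
    rule = {"embed": "fsdp"}
    return tuple(rule.get(axis) for axis in logical_spec)

# Reproduces the Logical -> Physical pairs printed above:
print(to_physical(("embed", "mlp")))                      # ('fsdp', None)        wi_0 / wi_1
print(to_physical(("mlp", "embed")))                      # (None, 'fsdp')        wo
print(to_physical(("embed", "kv_heads", "kv_head_dim")))  # ('fsdp', None, None)  key / value
print(to_physical(("norm",)))                             # (None,)               layer-norm scale
```

This matches the mesh shape logged at startup, `(1, 1, 1, 32, 1, ...)`: all 32 devices sit on the `'fsdp'` axis, so sharding any `'embed'` dimension 32 ways is the only partitioning in play.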
I0423 16:38:33.287445 138716912764736 train.py:165] train/z_loss Physical: float32[32,2048]............................................ ('fsdp', None).
I0423 16:39:30.113789 138716912764736 max_utils.py:791] Total memory size: 1.8 GB, Output size: 0.4 GB, Temp size: 1.4 GB, Argument size: 0.4 GB, Host temp size: 0.0 GB.
I0423 16:39:30.117411 138716912764736 metric_logger.py:289] number parameters: 1.104 billion
I0423 16:40:32.206298 138716912764736 checkpointing.py:794] Waiting for step 0 to finish before checkpoint...
I0423 16:40:32.493767 138716912764736 checkpointing.py:798] Waited 0.28745174407958984 seconds for step 0 to finish before starting checkpointing.
I0423 16:40:32.500065 138716912764736 checkpoint_manager.py:2013] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0423 16:40:32.501898 138716912764736 checkpoint_manager.py:1518] [process=6] Saving checkpoint at step 0
I0423 16:40:32.504818 138716912764736 async_checkpointer.py:452] [process=6] Started async saving checkpoint to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260423_155242/linen_xpk_feat_nnx_set_defaults_true_20260423_155242_13_scan_layers_false/checkpoints/0.
I0423 16:40:33.131926 138716912764736 signaling_client.py:364] Using JaxDistributedSignalingClient
I0423 16:40:33.133022 138716912764736 jax_array_handlers.py:358] Scheduling D2H of 444 prioritized jax.Array.
I0423 16:40:33.133675 138716912764736 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0423 16:40:33.453086 138716912764736 base_pytree_checkpoint_handler.py:153] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.321646s
I0423 16:40:33.453289 138716912764736 base_pytree_checkpoint_handler.py:129] [process=6] /jax/checkpoint/write/blocking_gbytes_per_sec: 2.506 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.615494966506958 s) (per-host)
I0423 16:40:33.453345 138716912764736 base_pytree_checkpoint_handler.py:737] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.615555s (batch_requests_ready=0.275051s, total_serialization_initiated=0.340436s, others=0.000068s)
I0423 16:40:33.453604 138716912764736 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.620838s (all_items=0.000021s, per_item={'items': '0.00002050'}, temp_paths=0.620818)
I0423 16:40:33.454612 138590997284608 async_checkpointer.py:79] [process=6][thread=async_save] Background save thread started.
I0423 16:40:33.454785 138716912764736 async_checkpointer.py:561] Finished blocking save. Time taken: 0.952815s. Continuing background save to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260423_155242/linen_xpk_feat_nnx_set_defaults_true_20260423_155242_13_scan_layers_false/checkpoints/0.
I0423 16:40:33.466459 138716912764736 checkpoint_manager.py:1566] [process=6][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0423 16:40:33.466717 138589929203456 async_checkpointer.py:265] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
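Editor's note: the logged "number parameters: 1.104 billion" is consistent with the shapes in the sharding dump. Only layers 3-9 appear in this chunk, so the decoder depth is an assumption here; a 16-layer decoder with the dumped per-layer shapes reproduces the total:

```python
# Back-of-the-envelope check of "number parameters: 1.104 billion" using the
# shapes from the sharding dump: embed=2048, mlp=7168, 16 heads x 128 head_dim,
# vocab=32000. num_layers=16 is an ASSUMPTION (not shown in this log chunk).
embed, mlp = 2048, 7168
heads, head_dim = 16, 128
vocab = 32000
num_layers = 16  # assumed; chosen because it reproduces the logged total

per_layer = (
    3 * embed * mlp                 # mlp: wi_0, wi_1, wo kernels
    + 4 * embed * heads * head_dim  # attention: query, key, value, out kernels
    + 2 * embed                     # pre/post self-attention layer-norm scales
)
# Separate token_embedder embedding and logits_dense kernel
total_params = num_layers * per_layer + 2 * vocab * embed
print(f"{total_params / 1e9:.3f} billion")  # 1.104 billion
```

The estimate omits a handful of small scale vectors (e.g. a final decoder norm), which do not move the rounded total.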
I0423 16:40:33.466875 138716912764736 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260423_155242/linen_xpk_feat_nnx_set_defaults_true_20260423_155242_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776962432.5000458, 'wait_for_prev_duration_secs': 8.058547973632812e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776962432.5019362, 'checkpointer_blocking_duration_secs': 0.9529526233673096, 'get_old_steps_start_time': 1776962433.4549088, 'get_old_steps_duration_secs': 2.6941299438476562e-05, 'checkpoint_manager_blocking_start_time': 1776962432.497288, 'checkpoint_manager_blocking_duration_secs': 0.9695479869842529}
I0423 16:40:33.467140 138716912764736 checkpointing.py:409] Started an asynchronous checkpoint save for step 0
I0423 16:40:33.467195 138716912764736 max_utils.py:750] Memstats: After params initialized:
I0423 16:40:33.467249 138716912764736 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_24(process=6,(0,6,0,0))
I0423 16:40:33.467281 138716912764736 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_25(process=6,(1,6,0,0))
I0423 16:40:33.467307 138716912764736 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_28(process=6,(0,7,0,0))
I0423 16:40:33.467331 138716912764736 max_utils.py:756] Using (GB) 0.82 / 31.25 (2.624000%) on TPU_29(process=6,(1,7,0,0))
I0423 16:40:33.783136 138716912764736 metric_logger.py:185] completed step: 0, seconds: 62.089, TFLOP/s/device: 0.219, Tokens/s/device: 32.985, total_weights: 65536, loss: 10.877
I0423 16:40:33.932912 138716912764736 metric_logger.py:185] completed step: 1, seconds: 1.565, TFLOP/s/device: 8.680, Tokens/s/device: 1308.279, total_weights: 65536, loss: 10.877
I0423 16:40:34.370527 138716912764736 metric_logger.py:185] completed step: 2, seconds: 0.025, TFLOP/s/device: 538.915, Tokens/s/device: 81231.160, total_weights: 65536, loss: 10.268
I0423 16:40:34.506787 138716912764736 metric_logger.py:185] completed step: 3, seconds: 0.438, TFLOP/s/device: 31.003, Tokens/s/device: 4673.174, total_weights: 65536, loss: 9.741
I0423 16:40:34.785197 138716912764736 metric_logger.py:185] completed step: 4, seconds: 0.143, TFLOP/s/device: 95.068, Tokens/s/device: 14329.695, total_weights: 65536, loss: 9.285
I0423 16:40:34.798471 138716912764736 metric_logger.py:185] completed step: 5, seconds: 0.136, TFLOP/s/device: 99.789, Tokens/s/device: 15041.239, total_weights: 65536, loss: 8.900
I0423 16:40:57.476741 138716912764736 metric_logger.py:185] completed step: 6, seconds: 0.280, TFLOP/s/device: 48.511, Tokens/s/device: 7312.170, total_weights: 65536, loss: 8.602
I0423 16:40:57.613154 138716912764736 metric_logger.py:185] completed step: 7, seconds: 22.548, TFLOP/s/device: 0.603, Tokens/s/device: 90.828, total_weights: 65536, loss: 8.394
I0423 16:40:57.749363 138716912764736 metric_logger.py:185] completed step: 8, seconds: 0.142, TFLOP/s/device: 95.793, Tokens/s/device: 14438.906, total_weights: 65536, loss: 8.264
I0423 16:40:57.885456 138716912764736 checkpointing.py:794] Waiting for step 9 to finish before checkpoint...
I0423 16:40:57.889228 138716912764736 checkpointing.py:798] Waited 0.0037887096405029297 seconds for step 9 to finish before starting checkpointing.
I0423 16:40:57.891850 138716912764736 checkpoint_manager.py:2024] [process=6][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
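Editor's note: the per-step TFLOP/s/device numbers above are consistent with the standard 6*N*T dense-transformer FLOP estimate (forward plus backward). A sketch of that cross-check, using only values taken from this log (1.104e9 params, 65536 tokens/step, 32 devices); the run's own "Total TFLOPs: 13.59" counts weight and attention flops slightly differently, so the match is approximate:

```python
# Standard 6*N*T training-FLOP estimate (forward + backward pass), with
# N = parameter count and T = tokens per step, both taken from the log.
n_params = 1.104e9        # "number parameters: 1.104 billion"
tokens_per_step = 65536   # "total_weights: 65536" in the metric lines
n_devices = 32

flops_per_step = 6 * n_params * tokens_per_step
tflops_per_device_per_step = flops_per_step / n_devices / 1e12
print(f"{tflops_per_device_per_step:.2f} TFLOP/device/step")  # ~13.6, vs logged 13.59

# TFLOP/s/device divides by the wall-clock step time, e.g. step 5 took
# 0.136 s and logged 99.789 TFLOP/s/device:
tflops_per_sec_per_device = tflops_per_device_per_step / 0.136
print(f"{tflops_per_sec_per_device:.1f} TFLOP/s/device")
```

The same arithmetic explains the outliers: step 0 amortizes 62 s of compilation, and step 2's implausible 538.9 TFLOP/s/device is an artifact of async dispatch crediting a 0.025 s step, with the real cost surfacing in the 22.5 s attributed to step 7.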
I0423 16:40:58.659193 2826 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0423 16:41:00.664639 138591005677312 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260423_155242/linen_xpk_feat_nnx_set_defaults_true_20260423_155242_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_6
I0423 16:41:28.849737 138590997284608 base_pytree_checkpoint_handler.py:129] [process=6] /jax/checkpoint/write/gbytes_per_sec: 28.201 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 56.011903524398804 s) (per-host)
I0423 16:41:28.849849 138590997284608 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 55.395126s.
I0423 16:41:38.123908 138590997284608 async_checkpointer.py:144] [process=6][thread=async_save] Background save thread done. Time taken: 64.669169s.
I0423 16:41:38.124217 138589929203456 async_checkpointer.py:273] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0423 16:41:38.124349 138589929203456 async_checkpointer.py:283] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0423 16:41:38.124402 138589929203456 checkpoint_manager.py:2133] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0423 16:41:38.128834 138589929203456 checkpoint_manager.py:2142] [process=6][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0423 16:41:38.129028 138716912764736 checkpoint_manager.py:2036] [process=6][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0423 16:41:38.129194 138716912764736 checkpoint_manager.py:1458] Waiting for previous save to complete took 40.237348 seconds. If this number is high, consider checkpointing less frequently.
I0423 16:41:38.131464 138716912764736 checkpoint_manager.py:1518] [process=6] Saving checkpoint at step 9
I0423 16:41:38.134816 138716912764736 async_checkpointer.py:452] [process=6] Started async saving checkpoint to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260423_155242/linen_xpk_feat_nnx_set_defaults_true_20260423_155242_13_scan_layers_false/checkpoints/9.
I0423 16:41:38.707069 138716912764736 jax_array_handlers.py:358] Scheduling D2H of 444 prioritized jax.Array.
I0423 16:41:38.707780 138716912764736 replica_slices.py:424] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0423 16:41:38.858449 138716912764736 base_pytree_checkpoint_handler.py:153] [process=6][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.152798s
I0423 16:41:38.858614 138716912764736 base_pytree_checkpoint_handler.py:129] [process=6] /jax/checkpoint/write/blocking_gbytes_per_sec: 3.433 GiB/s (total gbytes: 1.5 GiB) (time elapsed: 0.44927382469177246 s) (per-host)
I0423 16:41:38.858664 138716912764736 base_pytree_checkpoint_handler.py:737] [process=6][thread=MainThread] Initiated Pytree async_save. Time taken: 0.449333s (batch_requests_ready=0.279269s, total_serialization_initiated=0.169998s, others=0.000066s)
I0423 16:41:38.858946 138716912764736 composite_checkpoint_handler.py:715] [process=6][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.453565s (all_items=0.000017s, per_item={'items': '0.00001693'}, temp_paths=0.453548)
I0423 16:41:38.859912 138591005677312 async_checkpointer.py:79] [process=6][thread=async_save] Background save thread started.
I0423 16:41:38.860044 138716912764736 async_checkpointer.py:561] Finished blocking save. Time taken: 0.728505s. Continuing background save to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260423_155242/linen_xpk_feat_nnx_set_defaults_true_20260423_155242_13_scan_layers_false/checkpoints/9.
I0423 16:41:38.862155 138716912764736 checkpoint_manager.py:1566] [process=6][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0423 16:41:38.862383 138589929203456 async_checkpointer.py:265] [process=6][thread=save_finalize] Waiting for background save thread=async_save.
I0423 16:41:38.862483 138716912764736 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260423_155242/linen_xpk_feat_nnx_set_defaults_true_20260423_155242_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776962457.8918211, 'wait_for_prev_duration_secs': 40.237348318099976, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776962498.131504, 'checkpointer_blocking_duration_secs': 0.7287194728851318, 'get_old_steps_start_time': 1776962498.8602443, 'get_old_steps_duration_secs': 3.2901763916015625e-05, 'checkpoint_manager_blocking_start_time': 1776962457.8901894, 'checkpoint_manager_blocking_duration_secs': 40.97226119041443}
I0423 16:41:38.862734 138716912764736 checkpointing.py:409] Started an asynchronous checkpoint save for step 9
I0423 16:41:38.862782 138716912764736 checkpoint_manager.py:2024] [process=6][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0423 16:41:43.737768 138629287966464 array_metadata_store.py:203] [process=6][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://lance-maxtext/linen_ckpt_xpk_feat_nnx_set_defaults_true_20260423_155242/linen_xpk_feat_nnx_set_defaults_true_20260423_155242_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_6
I0423 16:42:20.415452 138591005677312 base_pytree_checkpoint_handler.py:129] [process=6] /jax/checkpoint/write/gbytes_per_sec: 37.604 MiB/s (total gbytes: 1.5 GiB) (time elapsed: 42.0060670375824 s) (per-host)
I0423 16:42:20.415576 138591005677312 async_checkpointer.py:90] [process=6][thread=async_save] 3 Handler Commit operations completed. Time taken: 41.555444s.
I0423 16:42:29.483399 138591005677312 async_checkpointer.py:144] [process=6][thread=async_save] Background save thread done. Time taken: 50.623251s.
I0423 16:42:29.483696 138589929203456 async_checkpointer.py:273] [process=6][thread=save_finalize] Done with waiting for background save thread=async_save.
I0423 16:42:29.483823 138589929203456 async_checkpointer.py:283] [process=6][thread=save_finalize] No errors found in background save thread=async_save.
I0423 16:42:29.483873 138589929203456 checkpoint_manager.py:2133] [process=6][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0423 16:42:29.486645 138589929203456 checkpoint_manager.py:2142] [process=6][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0423 16:42:29.486838 138716912764736 checkpoint_manager.py:2036] [process=6][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0423 16:42:29.486981 138716912764736 checkpoint_manager.py:2013] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
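Editor's note: the save logs above show the async-checkpoint split: a short blocking phase on the main thread (the ~0.7-1.0 s device-to-host snapshot, "Finished blocking save"), a slow commit on a background thread (the 40-65 s GCS write, "Continuing background save"), and a wait for the previous save before the next one starts (the 40.2 s warning). A toy threading sketch of that pattern (hypothetical class, not Orbax's real API):

```python
import threading
import time

class AsyncCheckpointer:
    """Toy model of the blocking-save / background-commit split in the log.

    save() blocks only for a fast in-memory snapshot (the 'D2H' phase), then
    hands the slow write to a background thread so training can continue.
    """

    def __init__(self):
        self._thread = None
        self.saved = {}  # stands in for checkpoint storage (e.g. a GCS bucket)

    def save(self, step, params):
        # "Waiting for previous save to complete..." - one save in flight at a time.
        self.wait_until_finished()
        snapshot = dict(params)  # fast blocking copy; train loop pauses only here

        def commit():
            time.sleep(0.01)  # stands in for the slow storage write
            self.saved[step] = snapshot

        # "Continuing background save to gs://..." - commit off the main thread.
        self._thread = threading.Thread(target=commit, name="async_save")
        self._thread.start()

    def wait_until_finished(self):
        if self._thread is not None:
            self._thread.join()
            self._thread = None

ckpt = AsyncCheckpointer()
params = {"w": [1.0]}
ckpt.save(0, params)
params["w"] = [2.0]  # training keeps mutating params while step 0 commits
ckpt.save(9, params)
ckpt.wait_until_finished()
print(ckpt.saved[0]["w"], ckpt.saved[9]["w"])  # [1.0] [2.0]
```

The snapshot taken before the background thread starts is what makes continued training safe: step 0's checkpoint still holds the step-0 weights even though the live `params` moved on, which is exactly why the log's blocking phase (the host copy) must finish before the step metrics resume.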
I0423 16:42:29.487637 138716912764736 metric_logger.py:185] completed step: 9, seconds: 0.136, TFLOP/s/device: 99.686, Tokens/s/device: 15025.789, total_weights: 65536, loss: 8.188
Per train step: Total TFLOPs: 13.59 split as 93.93% learnable weight flops and 6.07% attention flops
XPK End: Thu Apr 23 16:42:43 UTC 2026
EXIT_CODE=0