2026-04-15 21:48:55.642447: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0415 21:48:55.759625 124894992567424 max_utils.py:238] Skipping jax distributed system due to skip_jax_distributed_system=True flag.
I0415 21:49:44.978551 124894992567424 max_utils.py:800] System Information: Jax Version: 0.9.2
I0415 21:49:44.978664 124894992567424 max_utils.py:801] System Information: Jaxlib Version: 0.9.2
I0415 21:49:44.978698 124894992567424 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Apr 6 2026 20:48:10 (1775533690) cl/895581894
I0415 21:49:44.978725 124894992567424 train_utils.py:334] WARNING: 'dataset_path' might be pointing to your local file system
I0415 21:49:44.978746 124894992567424 train_utils.py:347] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to enable sequence packing.
I0415 21:49:44.978820 124894992567424 train.py:683] [DECOUPLED NO-OP] skipping cloud diagnostics wrapper.
W0415 21:49:45.069010 3267316 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0415 21:49:45.474431 124894992567424 maxtext_utils.py:1517] Num_devices: 8, shape (1, 1, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
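The 13-element shape above is the full MaxText device mesh; with 8 chips and only FSDP sharding enabled, every axis collapses to 1 except the fsdp axis (the `fsdp: 8` line further down confirms this). A minimal JAX sketch of the equivalent one-axis mesh — the axis name comes from this log, everything else is illustrative:

    # Minimal sketch (not MaxText's actual mesh construction): the 8-way
    # 'fsdp' mesh that the 13-axis shape above effectively reduces to.
    import jax
    from jax.experimental import mesh_utils
    from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

    devices = mesh_utils.create_device_mesh((8,))  # the 8 TPU chips logged
    mesh = Mesh(devices, axis_names=("fsdp",))

    # Arrays placed with P('fsdp', None, ...) are split 8 ways on dim 0,
    # which is exactly the Physical specs printed throughout this log.
    sharding = NamedSharding(mesh, P("fsdp", None))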
I0415 21:49:45.474729 124894992567424 checkpointing.py:677] Setting up checkpoint logger...
I0415 21:49:45.474779 124894992567424 checkpointing.py:233] Creating checkpoint manager with ocdbt=True and zarr3=True
I0415 21:49:45.474821 124894992567424 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0415 21:49:45.475289 124894992567424 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7196caffbb00>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0415 21:49:47.850526 124894992567424 checkpointing.py:265] Enabling policy for fixed interval checkpointing.
I0415 21:49:47.850708 124894992567424 checkpoint_manager.py:702] [process=0][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=('items',), item_handlers={'items': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x71904f139c70>}, handler_registry=None
I0415 21:49:47.850972 124894992567424 composite_checkpoint_handler.py:237] Deferred registration for item: "items". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x71904f139c70>` for item "items" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0415 21:49:47.851011 124894992567424 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x71904f139fd0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0415 21:49:47.851057 124894992567424 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x71904f139c70>, ('items', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x71904f139c70>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x71904f139fd0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x71904f139fd0>}).
I0415 21:49:47.851359 124894992567424 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28
I0415 21:49:47.851413 124894992567424 async_checkpointer.py:177] [process=0][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>.<lambda> at 0x71904f12d300> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0415 21:49:47.981633 124894992567424 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints
I0415 21:49:47.981855 124894992567424 checkpoint_manager.py:921] [process=0][thread=MainThread] CheckpointManager created, primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=1, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=FixedIntervalPolicy(interval=10), preservation_policy=LatestN(n=None), prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x71904f139f40>
I0415 21:49:47.981938 124894992567424 checkpointing.py:301] Checkpoint manager created!
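The options dump above amounts to an async Orbax CheckpointManager writing OCDBT/Zarr3 checkpoints on a fixed 10-step interval. A minimal standalone sketch under those assumptions (the directory is the one from this log; `state` would be the train-state pytree, and MaxText registers more item handlers and policies than shown):

    # Rough sketch of an equivalent Orbax setup; MaxText's real wiring in
    # checkpointing.py adds handlers, loggers, and policies beyond this.
    import orbax.checkpoint as ocp

    options = ocp.CheckpointManagerOptions(
        save_interval_steps=10,           # ~ FixedIntervalPolicy(interval=10)
        enable_async_checkpointing=True,  # async saves, as logged
    )
    mngr = ocp.CheckpointManager(
        "gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/"
        "linen_main_20260415_211126_13_scan_layers_false/checkpoints",
        options=options,
    )
    # mngr.save(step, args=ocp.args.StandardSave(state))  # returns quickly
    # mngr.wait_until_finished()                          # join background save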
W0415 21:49:47.998241 3267316 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0415 21:49:48.189872 124894992567424 nnx_wrappers.py:437] Unknown Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_norm_length', 'activation_embed').
I0415 21:49:48.189957 124894992567424 nnx_wrappers.py:437] Unknown Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0415 21:49:48.278702 124894992567424 attentions.py:1088] attentions/inputs_q Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0415 21:49:48.278778 124894992567424 attentions.py:1088] attentions/inputs_q Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0415 21:49:48.292844 124894992567424 attentions.py:1089] attentions/inputs_kv Logical: bfloat16[8,2048,2048]....................................... ('activation_batch', 'activation_attn_length', 'activation_attn_embed').
I0415 21:49:48.292895 124894992567424 attentions.py:1089] attentions/inputs_kv Physical: bfloat16[8,2048,2048]....................................... ('fsdp', None, None).
I0415 21:49:48.319498 124894992567424 attentions.py:1154] attentions/query Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0415 21:49:48.319554 124894992567424 attentions.py:1154] attentions/query Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0415 21:49:48.333798 124894992567424 attentions.py:1155] attentions/key Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0415 21:49:48.333846 124894992567424 attentions.py:1155] attentions/key Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0415 21:49:48.347950 124894992567424 attentions.py:1156] attentions/value Logical: bfloat16[8,2048,16,128]..................................... ('activation_kv_batch', 'activation_attn_length', 'activation_kv_heads', 'activation_kv_head_dim').
I0415 21:49:48.347999 124894992567424 attentions.py:1156] attentions/value Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0415 21:49:48.374417 124894992567424 attentions.py:1197] attentions/out Logical: bfloat16[8,2048,16,128]..................................... ('activation_batch', 'activation_attn_length', 'activation_heads', 'activation_kv').
I0415 21:49:48.374473 124894992567424 attentions.py:1197] attentions/out Physical: bfloat16[8,2048,16,128]..................................... ('fsdp', None, None, None).
I0415 21:49:48.393578 124894992567424 linears.py:525] linears/x Logical: bfloat16[8,2048,7168]....................................... ('activation_batch', 'activation_length', 'activation_mlp').
I0415 21:49:48.393628 124894992567424 linears.py:525] linears/x Physical: bfloat16[8,2048,7168]....................................... ('fsdp', None, None).
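Each pair of lines above prints the same activation twice: first under its logical axis names, then under the physical mesh axes those names resolve to. A minimal sketch of that resolution using Flax's logical-axis rules — the rule list is an assumption covering only the names in this log, with 'activation_batch' mapped to 'fsdp' and the rest unsharded:

    # Sketch of the logical -> physical resolution shown above.
    import flax.linen as nn

    rules = (
        ("activation_batch", "fsdp"),      # batch dim sharded over fsdp
        ("activation_norm_length", None),  # sequence dim replicated
        ("activation_embed", None),        # embed dim replicated
    )
    logical = ("activation_batch", "activation_norm_length", "activation_embed")
    print(nn.logical_to_mesh_axes(logical, rules))
    # -> PartitionSpec('fsdp', None, None), matching the Physical lines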
I0415 21:49:50.935160 124894992567424 checkpointing.py:577] checkpoint manager exists, so trying to load this run's existing checkpoint
I0415 21:49:50.935352 124894992567424 checkpointing.py:665] No existing checkpoints found, not restoring checkpoint.
W0415 21:49:51.502747 3267316 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
[DECOUPLED NO-OP] gcs_storage: using stubs.
[DECOUPLED NO-OP] mldiagnostics: using stub. (repeated 3×)
[DECOUPLED NO-OP] workload_monitor: using stub.
[DECOUPLED NO-OP] vertex_tensorboard: using stub.
fsdp: 8
I0415 21:49:54.275115 124894992567424 maxtext_utils.py:1620] params/params/decoder/decoder_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0415 21:49:54.275229 124894992567424 maxtext_utils.py:1620] params/params/decoder/layers_0/mlp/wi_0/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0415 21:49:54.275269 124894992567424 maxtext_utils.py:1620] params/params/decoder/layers_0/mlp/wi_1/kernel
Shape: float32[2048,7168]
Logical: P('embed', 'mlp')
Physical: ('fsdp', None)
I0415 21:49:54.275312 124894992567424 maxtext_utils.py:1620] params/params/decoder/layers_0/mlp/wo/kernel
Shape: float32[7168,2048]
Logical: P('mlp', 'embed')
Physical: (None, 'fsdp')
I0415 21:49:54.275341 124894992567424 maxtext_utils.py:1620] params/params/decoder/layers_0/post_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0415 21:49:54.275365 124894992567424 maxtext_utils.py:1620] params/params/decoder/layers_0/pre_self_attention_layer_norm/scale
Shape: float32[2048]
Logical: P('norm',)
Physical: (None,)
I0415 21:49:54.275404 124894992567424 maxtext_utils.py:1620] params/params/decoder/layers_0/self_attention/key/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
I0415 21:49:54.275443 124894992567424 maxtext_utils.py:1620] params/params/decoder/layers_0/self_attention/out/kernel
Shape: float32[16,128,2048]
Logical: P('heads', 'kv', 'embed')
Physical: (None, None, 'fsdp')
I0415 21:49:54.275470 124894992567424 maxtext_utils.py:1620] params/params/decoder/layers_0/self_attention/query/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'q_heads', 'kv')
Physical: ('fsdp', None, None)
I0415 21:49:54.275495 124894992567424 maxtext_utils.py:1620] params/params/decoder/layers_0/self_attention/value/kernel
Shape: float32[2048,16,128]
Logical: P('embed', 'kv_heads', 'kv_head_dim')
Physical: ('fsdp', None, None)
[... identical Shape / Logical / Physical entries repeated for layers_1 through layers_15, logged in lexicographic order (layers_1, layers_10 ... layers_15, layers_2 ... layers_9); every layer's shapes, logical specs, and physical shardings match layers_0 above ...]
I0415 21:49:54.278527 124894992567424 maxtext_utils.py:1620] params/params/decoder/logits_dense/kernel
Shape: float32[2048,32000]
Logical: P('embed_vocab', 'vocab')
Physical: ('fsdp', None)
I0415 21:49:54.278560 124894992567424 maxtext_utils.py:1620] params/params/token_embedder/embedding
Shape: float32[32000,2048]
Logical: P('vocab', 'embed_vocab')
Physical: (None, 'fsdp')
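The shapes in this dump pin down the model size, and they reproduce the "1.104 billion" figure logged a few lines below. As a quick check, pure arithmetic from the logged shapes (16 decoder layers):

    # Parameter count from the logged shapes.
    embed, mlp, heads, head_dim, vocab = 2048, 7168, 16, 128, 32000
    per_layer = (3 * embed * mlp                  # wi_0, wi_1, wo
                 + 4 * embed * heads * head_dim   # query, key, value, out
                 + 2 * embed)                     # pre/post attention norms
    total = (16 * per_layer
             + 2 * vocab * embed                  # token_embedder + logits_dense
             + embed)                             # decoder_norm
    print(total / 1e9)                            # ~1.104 billion, as logged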
I0415 21:49:57.923521 124894992567424 train.py:155] train/xent Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0415 21:49:57.923613 124894992567424 train.py:155] train/xent Physical: float32[8,2048]............................................. ('fsdp', None).
I0415 21:49:57.937280 124894992567424 train.py:162] train/z_loss Logical: float32[8,2048]............................................. ('activation_embed_and_logits_batch', 'activation_length').
I0415 21:49:57.937329 124894992567424 train.py:162] train/z_loss Physical: float32[8,2048]............................................. ('fsdp', None).
W0415 21:50:00.575251 3267316 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0415 21:50:02.838402 124894992567424 max_utils.py:791] Total memory size: 3.3 GB, Output size: 1.5 GB, Temp size: 1.8 GB, Argument size: 1.5 GB, Host temp size: 0.0 GB.
I0415 21:50:02.839082 124894992567424 max_utils.py:194] tensorboardX not available; using no-op SummaryWriter.
I0415 21:50:02.839543 124894992567424 metric_logger.py:289] number parameters: 1.104 billion
W0415 21:50:08.533270 3267316 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0415 21:50:10.245214 124894992567424 checkpointing.py:772] Waiting for step 0 to finish before checkpoint...
I0415 21:50:10.349863 124894992567424 checkpointing.py:776] Waited 0.1046302318572998 seconds for step 0 to finish before starting checkpointing.
I0415 21:50:10.350810 124894992567424 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0415 21:50:10.350992 124894992567424 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 0
I0415 21:50:10.351374 124894992567424 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0.
I0415 21:50:10.463138 124894992567424 signaling_client.py:373] Using ThreadSafeKeyValueSignalingClient
I0415 21:50:10.556849 124787563169344 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0
I0415 21:50:10.596805 124894992567424 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
I0415 21:50:10.596909 124894992567424 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0415 21:50:11.805311 124894992567424 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 1.209297s
I0415 21:50:11.805716 124894992567424 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 9.200 GiB/s (total gbytes: 12.3 GiB) (time elapsed: a second) (per-host)
I0415 21:50:11.805768 124894992567424 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 1.341420s (batch_requests_ready=0.106280s, total_serialization_initiated=1.234820s, others=0.000319s)
I0415 21:50:11.805864 124894992567424 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 1.342268s (all_items=0.000019s, per_item={'items': '0.00001907'}, temp_paths=1.342249)
I0415 21:50:11.806503 124787491866176 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0415 21:50:11.806602 124894992567424 async_checkpointer.py:561] Finished blocking save. Time taken: 1.455565s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0.
I0415 21:50:11.806781 124894992567424 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=0] Starting CheckpointManager Save Finalize thread=save_finalize
I0415 21:50:11.806954 124894992567424 standard_logger.py:34] {'step': 0, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776289810.3507974, 'wait_for_prev_duration_secs': 4.38690185546875e-05, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776289810.3510141, 'checkpointer_blocking_duration_secs': 1.4556972980499268, 'get_old_steps_start_time': 1776289811.8067272, 'get_old_steps_duration_secs': 1.9788742065429688e-05, 'checkpoint_manager_blocking_start_time': 1776289810.3507144, 'checkpoint_manager_blocking_duration_secs': 1.4562180042266846}
I0415 21:50:11.807083 124894992567424 checkpointing.py:408] Started an asynchronous checkpoint save for step 0
I0415 21:50:11.807123 124894992567424 max_utils.py:750]
Memstats: After params initialized:
I0415 21:50:11.807399 124894992567424 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_0(process=0,(0,0,0,0))
I0415 21:50:11.807428 124894992567424 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_1(process=0,(1,0,0,0))
I0415 21:50:11.807446 124894992567424 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_2(process=0,(0,1,0,0))
I0415 21:50:11.807464 124894992567424 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_3(process=0,(1,1,0,0))
I0415 21:50:11.807480 124894992567424 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_4(process=0,(0,2,0,0))
I0415 21:50:11.807497 124894992567424 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_5(process=0,(1,2,0,0))
I0415 21:50:11.807512 124894992567424 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_6(process=0,(0,3,0,0))
I0415 21:50:11.807528 124894992567424 max_utils.py:756] Using (GB) 2.16 / 31.25 (6.912000%) on TPU_7(process=0,(1,3,0,0))
W0415 21:50:11.812810 3267316 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0415 21:50:11.817603 3267316 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
W0415 21:50:11.821767 3267316 pjrt_executable.cc:642] Assume version compatibility. PjRt-IFRT does not track XLA executable versions.
I0415 21:50:11.828190 124787552683584 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0/items
I0415 21:50:11.970125 124787512837696 checkpoint.py:188] Wrote Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776289811727979071, 'commit_timestamp_nsecs': None, 'custom_metadata': {}}, json={"item_handlers": null, "metrics": {}, "performance_metrics": {}, "init_timestamp_nsecs": 1776289811727979071, "commit_timestamp_nsecs": null, "custom_metadata": {}} to gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0415 21:50:11.970329 124787481380416 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0415 21:50:12.130422 124894992567424 metric_logger.py:185] completed step: 0, seconds: 7.405, TFLOP/s/device: 1.835, Tokens/s/device: 276.564, total_weights: 16384, loss: 10.887
I0415 21:50:12.131294 124894992567424 metric_logger.py:269] To see full metrics 'tensorboard --logdir=gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/tensorboard/'
I0415 21:50:12.220360 124894992567424 metric_logger.py:185] completed step: 1, seconds: 1.869, TFLOP/s/device: 7.269, Tokens/s/device: 1095.627, total_weights: 16384, loss: 10.887
I0415 21:50:12.336530 124894992567424 metric_logger.py:185] completed step: 2, seconds: 0.024, TFLOP/s/device: 572.837, Tokens/s/device: 86344.281, total_weights: 16384, loss: 9.787
I0415 21:50:12.348707 3269611 google_auth_provider.cc:149] Using credentials at ~/.config/gcloud/application_default_credentials.json
I0415 21:50:12.348757 3269611 google_auth_provider.cc:156] Using OAuth2 AuthProvider
I0415 21:50:12.439584 124894992567424 metric_logger.py:185] completed step: 3, seconds: 0.092, TFLOP/s/device: 148.355, Tokens/s/device: 22361.740, total_weights: 16384, loss: 8.853
I0415 21:50:13.119715 124787563169344 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0/items/array_metadatas/process_0
I0415 21:50:32.925480 124894992567424 metric_logger.py:185] completed step: 4, seconds: 0.156, TFLOP/s/device: 86.834, Tokens/s/device: 13088.604, total_weights: 16384, loss: 7.982
I0415 21:50:32.940330 124894992567424 metric_logger.py:185] completed step: 5, seconds: 0.075, TFLOP/s/device: 180.964, Tokens/s/device: 27276.844, total_weights: 16384, loss: 7.248
I0415 21:50:33.041479 124894992567424 metric_logger.py:185] completed step: 6, seconds: 20.474, TFLOP/s/device: 0.664, Tokens/s/device: 100.030, total_weights: 16384, loss: 6.690
I0415 21:50:33.151652 124894992567424 metric_logger.py:185] completed step: 7, seconds: 0.013, TFLOP/s/device: 1085.494, Tokens/s/device: 163617.480, total_weights: 16384, loss: 6.307
I0415 21:50:33.261787 124894992567424 metric_logger.py:185] completed step: 8, seconds: 0.102, TFLOP/s/device: 133.454, Tokens/s/device: 20115.705, total_weights: 16384, loss: 6.070
I0415 21:50:33.372144 124894992567424 checkpointing.py:772] Waiting for step 9 to finish before checkpoint...
I0415 21:50:33.377636 124894992567424 checkpointing.py:776] Waited 0.005510807037353516 seconds for step 9 to finish before starting checkpointing.
I0415 21:50:33.378263 124894992567424 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=0][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0415 21:50:43.421751 124787502351936 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 31.103040s (commit=30.659632s, array_metadata_write=0.443408s)
I0415 21:50:43.423428 124787491866176 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 383.409 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 32 seconds) (per-host)
I0415 21:50:43.423484 124787491866176 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 31.616837s.
I0415 21:50:43.654640 124787491866176 checkpoint.py:228] Read Metadata={'item_handlers': None, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776289811727979071, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} from gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0415 21:50:43.836718 124787491866176 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0415 21:50:44.034751 124787512837696 checkpoint.py:247] Updated Metadata={'item_handlers': {'items': 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler'}, 'metrics': {}, 'performance_metrics': {}, 'init_timestamp_nsecs': 1776289811727979071, 'commit_timestamp_nsecs': None, 'custom_metadata': {}} to gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0/_CHECKPOINT_METADATA
I0415 21:50:44.279023 124787491866176 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0415 21:50:44.279905 124787491866176 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.583613s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0/items
I0415 21:50:44.280686 124787491866176 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0/items
I0415 21:50:44.516966 124787491866176 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0
I0415 21:50:45.192487 124787491866176 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0`.
I0415 21:50:45.193150 124787491866176 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 34.842123s. directory=gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/0
I0415 21:50:45.193228 124787491866176 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 33.386582s.
I0415 21:50:45.193338 124787481380416 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0415 21:50:45.193389 124787481380416 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0415 21:50:45.193436 124787481380416 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is syncing with other hosts...
I0415 21:50:45.193480 124787481380416 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=0] CheckpointManager Save Finalize is done on all hosts.
I0415 21:50:45.193603 124894992567424 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=0][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=0.
W0415 21:50:45.193720 124894992567424 checkpoint_manager.py:1441] Waiting for previous save to complete took 11.815462 seconds. If this number is high, consider checkpointing less frequently.
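The save-timing numbers above are internally consistent: the blocking portion of each save is only the device-to-host transfer (12.3 GiB at 9.2 GiB/s, about 1.3 s), the ~33 s GCS commit runs in the background at ~383 MiB/s, and the 11.8 s warning is step 9's save arriving before step 0's background save had finalized. As arithmetic:

    # Consistency check on the logged step-0 checkpoint numbers.
    total_gib = 12.3
    print(total_gib / 9.200)           # ~1.34 s blocking D2H at 9.2 GiB/s
    print(total_gib * 1024 / 383.409)  # ~32.9 s background write at 383 MiB/s
    # Training only stalls for the ~1.5 s blocking phase per save; the rest
    # overlaps training, which is why step 9 waited ~11.8 s for step 0.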
I0415 21:50:45.194370 124894992567424 checkpoint_manager.py:1501] [process=0] Saving checkpoint at step 9
I0415 21:50:45.194654 124894992567424 async_checkpointer.py:452] [process=0] Started async saving checkpoint to gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/9.
I0415 21:50:45.410083 124787481380416 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/9
I0415 21:50:45.410484 124894992567424 jax_array_handlers.py:347] Scheduling D2H of 444 prioritized jax.Array.
I0415 21:50:45.411151 124894992567424 replica_slices.py:410] Transferring arrays to host memory with options: use_replica_parallel=True, min_slice_bytes_for_replica_parallel=None, max_replicas_for_replica_parallel=None, enable_pinned_host_transfer=False
I0415 21:50:45.634973 124894992567424 base_pytree_checkpoint_handler.py:153] [process=0][thread=MainThread] Initiated "orbax.checkpoint._src.serialization.jax_array_handlers.ArrayHandler".serialize. Time taken: 0.225471s
I0415 21:50:45.635418 124894992567424 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/blocking_gbytes_per_sec: 35.735 GiB/s (total gbytes: 12.3 GiB) (time elapsed: 345 milliseconds) (per-host)
I0415 21:50:45.635476 124894992567424 base_pytree_checkpoint_handler.py:732] [process=0][thread=MainThread] Initiated Pytree async_save. Time taken: 0.345435s (batch_requests_ready=0.091360s, total_serialization_initiated=0.253730s, others=0.000345s)
I0415 21:50:45.635582 124894992567424 composite_checkpoint_handler.py:715] [process=0][thread=MainThread] Initiated CompositeCheckpointHandler.async_save. Time taken: 0.345999s (all_items=0.000015s, per_item={'items': '0.00001478'}, temp_paths=0.345984)
I0415 21:50:45.636321 124787439437376 async_checkpointer.py:79] [process=0][thread=async_save] Background save thread started.
I0415 21:50:45.636418 124894992567424 async_checkpointer.py:561] Finished blocking save. Time taken: 0.442002s. Continuing background save to gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/9.
I0415 21:50:45.636586 124894992567424 checkpoint_manager.py:1549] [process=0][thread=MainThread][step=9] Starting CheckpointManager Save Finalize thread=save_finalize
I0415 21:50:45.636850 124787428951616 async_checkpointer.py:265] [process=0][thread=save_finalize] Waiting for background save thread=async_save.
I0415 21:50:45.636977 124894992567424 standard_logger.py:34] {'step': 9, 'event_type': 'save', 'directory': 'gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints', 'reached_preemption': False, 'preemption_received_at': None, 'synchronous': False, 'wait_for_prev_start_time': 1776289833.3782353, 'wait_for_prev_duration_secs': 11.815462350845337, 'time_between_consecutive_saves_sec': None, 'checkpointer_blocking_start_time': 1776289845.1943972, 'checkpointer_blocking_duration_secs': 0.44211530685424805, 'get_old_steps_start_time': 1776289845.636527, 'get_old_steps_duration_secs': 2.1696090698242188e-05, 'checkpoint_manager_blocking_start_time': 1776289833.3781884, 'checkpoint_manager_blocking_duration_secs': 12.258760213851929}
I0415 21:50:45.637431 124894992567424 checkpointing.py:408] Started an asynchronous checkpoint save for step 9
I0415 21:50:45.637468 124894992567424 checkpoint_manager.py:1994] [process=0][thread=MainThread][step=9][wait_until_finished] Waiting for Save Finalize thread (save_finalize) to complete.
I0415 21:50:46.127546 124787552683584 atomicity.py:137] Creating tmp directory gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/9/items
I0415 21:50:47.389734 124787466700352 array_metadata_store.py:203] [process=0][thread=array_type_handler] Wrote 444 array_metadata.ArrayMetadata to gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/9/items/array_metadatas/process_0
I0415 21:51:19.609436 124787449923136 base_pytree_checkpoint_handler.py:1217] [process=0][thread=write_metadata_after_commits] Commit + Array metadata written. Time taken: 33.025169s (commit=32.562136s, array_metadata_write=0.463033s)
I0415 21:51:19.611169 124787439437376 base_pytree_checkpoint_handler.py:128] [process=0] /jax/checkpoint/write/gbytes_per_sec: 368.193 MiB/s (total gbytes: 12.3 GiB) (time elapsed: 34 seconds) (per-host)
I0415 21:51:19.611273 124787439437376 async_checkpointer.py:90] [process=0][thread=async_save] 3 Handler Commit operations completed. Time taken: 33.974811s.
I0415 21:51:20.007341 124787439437376 array_metadata_store.py:367] [process=0][thread=async_save] Skipped cross-host ArrayMetadata validation because only one process is found: process_index=0.
I0415 21:51:20.469007 124787439437376 ocdbt_utils.py:56] Param validation support for Zarr3 will be added later (b/362328389).
I0415 21:51:20.469691 124787439437376 base_pytree_checkpoint_handler.py:1342] [process=0][thread=async_save] Pytree save finalize (merge_ocdbt + ArrayMetadata validation) completed. Time taken: 0.605511s. use_zarr3=True, enable_post_merge_validation=True, directory=gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/9/items
I0415 21:51:20.470317 124787439437376 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/9/items
I0415 21:51:20.715115 124787439437376 atomicity.py:608] Finalizing gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/9
I0415 21:51:21.491407 124787439437376 atomicity.py:794] [process=0][thread=async_save] Finished saving checkpoint (finalized tmp dir) to `gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/9`.
I0415 21:51:21.492067 124787439437376 async_checkpointer.py:420] Finished async_save (blocking + background). Time taken: 36.297657s. directory=gs://wanglance-maxtext/linen_ckpt_main_20260415_211126/linen_main_20260415_211126_13_scan_layers_false/checkpoints/9
I0415 21:51:21.492140 124787439437376 async_checkpointer.py:144] [process=0][thread=async_save] Background save thread done. Time taken: 35.855679s.
I0415 21:51:21.492291 124787428951616 async_checkpointer.py:273] [process=0][thread=save_finalize] Done with waiting for background save thread=async_save.
I0415 21:51:21.492390 124787428951616 async_checkpointer.py:283] [process=0][thread=save_finalize] No errors found in background save thread=async_save.
I0415 21:51:21.492443 124787428951616 checkpoint_manager.py:2103] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is syncing with other hosts...
I0415 21:51:21.492482 124787428951616 checkpoint_manager.py:2112] [process=0][thread=save_finalize][step=9] CheckpointManager Save Finalize is done on all hosts.
I0415 21:51:21.492616 124894992567424 checkpoint_manager.py:2006] [process=0][thread=MainThread][step=9][wait_until_finished] Done waiting for Save Finalize thread (save_finalize) running at step=9.
I0415 21:51:21.492738 124894992567424 checkpoint_manager.py:1983] [process=0][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0415 21:51:21.493260 124894992567424 metric_logger.py:185] completed step: 9, seconds: 0.110, TFLOP/s/device: 123.249, Tokens/s/device: 18577.480, total_weights: 16384, loss: 5.930
Per train step:
Total TFLOPs: 13.59
split as 93.93% learnable weight flops and 6.07% attention flops
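The per-step metrics follow directly from this FLOP figure and the logged step times: dividing the per-step TFLOPs by wall time reproduces the TFLOP/s/device column, and tokens/s/device is the 16384-token global batch spread over 8 devices. For step 0 (7.405 s):

    # How metric_logger's step-0 numbers derive from the per-step totals.
    tflops_per_step, step_s = 13.59, 7.405
    tokens, devices = 16384, 8
    print(tflops_per_step / step_s)   # ~1.835 TFLOP/s/device, as logged
    print(tokens / devices / step_s)  # ~276.6 tokens/s/device, as logged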