
Log Summary

XPK Start: Fri Apr 24 09:21:07 UTC 2026
2026-04-24 09:21:24.536987: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
Unrecognized keys in `rope_scaling` for 'rope_type'='yarn': {'rope_theta'}
`rope_scaling`'s factor field must be a float >= 1, got 40
`rope_scaling`'s beta_fast field must be a float, got 32
`rope_scaling`'s beta_slow field must be a float, got 1
I0424 09:21:28.514599 133324340254528 max_utils.py:273] Attempting to initialize the jax distributed system...
INFO:2026-04-24 09:21:37,553:jax._src.distributed:149: Starting JAX distributed service on [::]:8482
I0424 09:21:37.553454 133324340254528 distributed.py:149] Starting JAX distributed service on [::]:8482
INFO:2026-04-24 09:21:37,555:jax._src.distributed:166: Connecting to JAX distributed service on mt-07-distill-smoke-d724i-slice-job-0-0.mt-07-distill-smoke-d724i:8482
I0424 09:21:37.555848 133324340254528 distributed.py:166] Connecting to JAX distributed service on mt-07-distill-smoke-d724i-slice-job-0-0.mt-07-distill-smoke-d724i:8482
I0424 09:21:39.239712 133324340254528 max_utils.py:284] Jax distributed system initialized!
I0424 09:21:45.998522 133324340254528 max_utils.py:244] Jax distributed system is already initialized.
W0424 09:21:46.130823 133324340254528 pyconfig.py:111] base_output_directory is not provided; Using local directory called maxtext_output
I0424 09:21:46.191020 133324340254528 max_utils.py:244] Jax distributed system is already initialized.
I0424 09:21:46.192203 133324340254528 pyconfig.py:471] Config param abort_on_inf_loss: True
I0424 09:21:46.192251 133324340254528 pyconfig.py:471] Config param abort_on_nan_loss: True
I0424 09:21:46.192280 133324340254528 pyconfig.py:471] Config param act_quantization_calibration_method: absmax
I0424 09:21:46.192304 133324340254528 pyconfig.py:471] Config param activation_dropout_for_audio: 0.0
I0424 09:21:46.192325 133324340254528 pyconfig.py:471] Config param activation_function_for_audio: gelu
I0424 09:21:46.192345 133324340254528 pyconfig.py:471] Config param activations_in_float32: False
I0424 09:21:46.192365 133324340254528 pyconfig.py:471] Config param adam_b1: 0.9
I0424 09:21:46.192389 133324340254528 pyconfig.py:471] Config param adam_b2: 0.95
I0424 09:21:46.192405 133324340254528 pyconfig.py:471] Config param adam_eps: 1e-08
I0424 09:21:46.192428 133324340254528 pyconfig.py:471] Config param adam_eps_root: 0.0
I0424 09:21:46.192443 133324340254528 pyconfig.py:471] Config param adam_weight_decay: 0.1
I0424 09:21:46.192461 133324340254528 pyconfig.py:471] Config param adamw_mask: []
I0424 09:21:46.192477 133324340254528 pyconfig.py:471] Config param add_bos: True
I0424 09:21:46.192495 133324340254528 pyconfig.py:471] Config param add_eos: True
I0424 09:21:46.192510 133324340254528 pyconfig.py:471] Config param allow_split_physical_axes: False
I0424 09:21:46.192527 133324340254528 pyconfig.py:471] Config param ar_cache_axis_order: 1,2,0,3
I0424 09:21:46.192544 133324340254528 pyconfig.py:471] Config param async_checkpointing: True
I0424 09:21:46.192560 133324340254528 pyconfig.py:471] Config param async_scheduling: False
I0424 09:21:46.192575 133324340254528 pyconfig.py:471] Config param attention: dot_product
I0424 09:21:46.192591 133324340254528 pyconfig.py:471] Config param attention_bias: False
I0424 09:21:46.192607 133324340254528 pyconfig.py:471] Config param attention_dropout_for_audio: 0.0
I0424 09:21:46.192622 133324340254528 pyconfig.py:471] Config param attention_out: RematLocation.REMAT
I0424 09:21:46.192643 133324340254528 pyconfig.py:471] Config param attention_output_dim: -1
I0424 09:21:46.192669 133324340254528 pyconfig.py:471] Config param attention_sink: False
I0424 09:21:46.192685 133324340254528 pyconfig.py:471] Config param attention_type: global
I0424 09:21:46.192700 133324340254528 pyconfig.py:471] Config param attn_logits_soft_cap: None
I0424 09:21:46.192717 133324340254528 pyconfig.py:471] Config param audio_path: 
I0424 09:21:46.192732 133324340254528 pyconfig.py:471] Config param audio_placeholder: <|audio|>
I0424 09:21:46.192749 133324340254528 pyconfig.py:471] Config param autoregressive_decode_assert: 
I0424 09:21:46.192765 133324340254528 pyconfig.py:471] Config param base_config: base.yml
I0424 09:21:46.192780 133324340254528 pyconfig.py:471] Config param base_emb_dim: 16
I0424 09:21:46.192796 133324340254528 pyconfig.py:471] Config param base_mlp_dim: 64
I0424 09:21:46.192813 133324340254528 pyconfig.py:471] Config param base_moe_mlp_dim: -1
I0424 09:21:46.192828 133324340254528 pyconfig.py:471] Config param base_num_decoder_layers: 1
I0424 09:21:46.192845 133324340254528 pyconfig.py:471] Config param base_num_kv_heads: 2
I0424 09:21:46.192861 133324340254528 pyconfig.py:471] Config param base_num_query_heads: 2
I0424 09:21:46.192876 133324340254528 pyconfig.py:471] Config param base_output_directory: /deps/maxtext_output
I0424 09:21:46.192892 133324340254528 pyconfig.py:471] Config param batch_size: 1
I0424 09:21:46.192909 133324340254528 pyconfig.py:471] Config param batch_split_factor: 1
I0424 09:21:46.192924 133324340254528 pyconfig.py:471] Config param beta_fast: 32
I0424 09:21:46.192944 133324340254528 pyconfig.py:471] Config param beta_slow: 1
I0424 09:21:46.192960 133324340254528 pyconfig.py:471] Config param bwd_quantization_calibration_method: absmax
I0424 09:21:46.192977 133324340254528 pyconfig.py:471] Config param capacity_factor: -1.0
I0424 09:21:46.192994 133324340254528 pyconfig.py:471] Config param cast_logits_to_fp32: True
I0424 09:21:46.193011 133324340254528 pyconfig.py:471] Config param chat_template: 
I0424 09:21:46.193026 133324340254528 pyconfig.py:471] Config param chat_template_path: 
I0424 09:21:46.193042 133324340254528 pyconfig.py:471] Config param checkpoint_conversion_fn: None
I0424 09:21:46.193060 133324340254528 pyconfig.py:471] Config param checkpoint_dir: /deps/maxtext_output/gpt3-52k_2026-04-24-09-21/checkpoints/
I0424 09:21:46.193075 133324340254528 pyconfig.py:471] Config param checkpoint_is_quantized: False
I0424 09:21:46.193092 133324340254528 pyconfig.py:471] Config param checkpoint_period: 2000
I0424 09:21:46.193109 133324340254528 pyconfig.py:471] Config param checkpoint_storage_concurrent_gb: 96
I0424 09:21:46.193124 133324340254528 pyconfig.py:471] Config param checkpoint_storage_target_data_file_size_bytes: 2147483648
I0424 09:21:46.193143 133324340254528 pyconfig.py:471] Config param checkpoint_storage_use_ocdbt: True
I0424 09:21:46.193158 133324340254528 pyconfig.py:471] Config param checkpoint_storage_use_zarr3: True
I0424 09:21:46.193174 133324340254528 pyconfig.py:471] Config param checkpoint_todelete_full_path: None
I0424 09:21:46.193190 133324340254528 pyconfig.py:471] Config param checkpoint_todelete_subdir: None
I0424 09:21:46.193207 133324340254528 pyconfig.py:471] Config param chips_per_vm: 4
I0424 09:21:46.193222 133324340254528 pyconfig.py:471] Config param chunk_attn_window_size: 0
I0424 09:21:46.193238 133324340254528 pyconfig.py:471] Config param collect_stack_trace: False
I0424 09:21:46.193254 133324340254528 pyconfig.py:471] Config param colocated_python_checkpointing: False
I0424 09:21:46.193270 133324340254528 pyconfig.py:471] Config param colocated_python_data_input: False
I0424 09:21:46.193285 133324340254528 pyconfig.py:471] Config param compile_topology: 
I0424 09:21:46.193301 133324340254528 pyconfig.py:471] Config param compile_topology_num_slices: -1
I0424 09:21:46.193317 133324340254528 pyconfig.py:471] Config param compile_xla_flags: 
I0424 09:21:46.193331 133324340254528 pyconfig.py:471] Config param compiled_trainstep_file: 
I0424 09:21:46.193347 133324340254528 pyconfig.py:471] Config param compute_axis_order: 0,1,2,3
I0424 09:21:46.193363 133324340254528 pyconfig.py:471] Config param constant_bound_config: []
I0424 09:21:46.193379 133324340254528 pyconfig.py:471] Config param context: RematLocation.REMAT
I0424 09:21:46.193396 133324340254528 pyconfig.py:471] Config param context_parallel_load_balance: True
I0424 09:21:46.193411 133324340254528 pyconfig.py:471] Config param context_parallel_reorder_strategy: ReorderStrategy.AUTO
I0424 09:21:46.193429 133324340254528 pyconfig.py:471] Config param context_parallel_size: 1
I0424 09:21:46.193445 133324340254528 pyconfig.py:471] Config param context_parallel_strategy: all_gather
I0424 09:21:46.193460 133324340254528 pyconfig.py:471] Config param context_sharding: context
I0424 09:21:46.193476 133324340254528 pyconfig.py:471] Config param conv_chunksize_for_audio: 500
I0424 09:21:46.193492 133324340254528 pyconfig.py:471] Config param conv_stride_for_vit: 14
I0424 09:21:46.193506 133324340254528 pyconfig.py:471] Config param convert_checkpoint_if_possible: False
I0424 09:21:46.193522 133324340254528 pyconfig.py:471] Config param cost_estimate_flops_bwd: -1
I0424 09:21:46.193538 133324340254528 pyconfig.py:471] Config param cost_estimate_flops_fwd: -1
I0424 09:21:46.193554 133324340254528 pyconfig.py:471] Config param custom_mesh: 
I0424 09:21:46.193569 133324340254528 pyconfig.py:471] Config param custom_mesh_and_rule: 
I0424 09:21:46.193585 133324340254528 pyconfig.py:471] Config param d_model_for_audio: 256
I0424 09:21:46.193599 133324340254528 pyconfig.py:471] Config param data_sharding: (('data', 'stage', 'fsdp', 'fsdp_transpose', 'sequence', 'context', 'context_autoregressive', 'tensor', 'tensor_transpose', 'tensor_sequence', 'expert', 'autoregressive'),)
I0424 09:21:46.193619 133324340254528 pyconfig.py:471] Config param data_shuffle_seed: 0
I0424 09:21:46.193635 133324340254528 pyconfig.py:471] Config param dataset_name: c4/en:3.0.1
I0424 09:21:46.193662 133324340254528 pyconfig.py:471] Config param dataset_path: 
I0424 09:21:46.193678 133324340254528 pyconfig.py:471] Config param dataset_type: DatasetType.HF
I0424 09:21:46.193695 133324340254528 pyconfig.py:471] Config param dcn_autoregressive_parallelism: 1
I0424 09:21:46.193710 133324340254528 pyconfig.py:471] Config param dcn_context_autoregressive_parallelism: 1
I0424 09:21:46.193725 133324340254528 pyconfig.py:471] Config param dcn_context_parallelism: 1
I0424 09:21:46.193741 133324340254528 pyconfig.py:471] Config param dcn_data_parallelism: -1
I0424 09:21:46.193756 133324340254528 pyconfig.py:471] Config param dcn_diloco_parallelism: 1
I0424 09:21:46.193772 133324340254528 pyconfig.py:471] Config param dcn_expert_parallelism: 1
I0424 09:21:46.193789 133324340254528 pyconfig.py:471] Config param dcn_fsdp_parallelism: 1
I0424 09:21:46.193804 133324340254528 pyconfig.py:471] Config param dcn_fsdp_transpose_parallelism: 1
I0424 09:21:46.193821 133324340254528 pyconfig.py:471] Config param dcn_parallelism: [1, -1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
I0424 09:21:46.193838 133324340254528 pyconfig.py:471] Config param dcn_pipeline_parallelism: 1
I0424 09:21:46.193855 133324340254528 pyconfig.py:471] Config param dcn_sequence_parallelism: 1
I0424 09:21:46.193871 133324340254528 pyconfig.py:471] Config param dcn_tensor_parallelism: 1
I0424 09:21:46.193885 133324340254528 pyconfig.py:471] Config param dcn_tensor_sequence_parallelism: 1
I0424 09:21:46.193901 133324340254528 pyconfig.py:471] Config param dcn_tensor_transpose_parallelism: 1
I0424 09:21:46.193917 133324340254528 pyconfig.py:471] Config param debug: {'rl': False}
I0424 09:21:46.193937 133324340254528 pyconfig.py:471] Config param debug_sharding: False
I0424 09:21:46.193954 133324340254528 pyconfig.py:471] Config param decode_sampling_nucleus_p: -1
I0424 09:21:46.193970 133324340254528 pyconfig.py:471] Config param decode_sampling_strategy: SamplingStrategy.GREEDY
I0424 09:21:46.193987 133324340254528 pyconfig.py:471] Config param decode_sampling_temperature: 1.0
I0424 09:21:46.194003 133324340254528 pyconfig.py:471] Config param decode_sampling_top_k: 0
I0424 09:21:46.194019 133324340254528 pyconfig.py:471] Config param decoder_block: DecoderBlockType.GPT3
I0424 09:21:46.194036 133324340254528 pyconfig.py:471] Config param decoder_layer_input: RematLocation.DEVICE
I0424 09:21:46.194053 133324340254528 pyconfig.py:471] Config param deepstack_visual_indexes_for_vit: []
I0424 09:21:46.194086 133324340254528 pyconfig.py:471] Config param degenerate_group_masking: True
I0424 09:21:46.194101 133324340254528 pyconfig.py:471] Config param dense_init_scale: 1.0
I0424 09:21:46.194117 133324340254528 pyconfig.py:471] Config param diloco_outer_lr: 0.3
I0424 09:21:46.194133 133324340254528 pyconfig.py:471] Config param diloco_outer_momentum: 0.9
I0424 09:21:46.194149 133324340254528 pyconfig.py:471] Config param diloco_sync_period: 36
I0424 09:21:46.194165 133324340254528 pyconfig.py:471] Config param distill_alpha: 0.5
I0424 09:21:46.194180 133324340254528 pyconfig.py:471] Config param distill_alpha_end: None
I0424 09:21:46.194197 133324340254528 pyconfig.py:471] Config param distill_alpha_schedule: constant
I0424 09:21:46.194214 133324340254528 pyconfig.py:471] Config param distill_beta: 0.0
I0424 09:21:46.194229 133324340254528 pyconfig.py:471] Config param distill_beta_end: None
I0424 09:21:46.194245 133324340254528 pyconfig.py:471] Config param distill_beta_schedule: constant
I0424 09:21:46.194261 133324340254528 pyconfig.py:471] Config param distill_feature_loss_type: cosine
I0424 09:21:46.194277 133324340254528 pyconfig.py:471] Config param distill_layer_indices: None
I0424 09:21:46.194292 133324340254528 pyconfig.py:471] Config param distill_temperature: 1.0
I0424 09:21:46.194308 133324340254528 pyconfig.py:471] Config param distill_temperature_end: None
I0424 09:21:46.194323 133324340254528 pyconfig.py:471] Config param distill_temperature_schedule: constant
I0424 09:21:46.194337 133324340254528 pyconfig.py:471] Config param downsample_hidden_size_for_audio: 256
I0424 09:21:46.194353 133324340254528 pyconfig.py:471] Config param dpo_beta: 0.1
I0424 09:21:46.194369 133324340254528 pyconfig.py:471] Config param dpo_label_smoothing: 0.0
I0424 09:21:46.194384 133324340254528 pyconfig.py:471] Config param dq_reduction_steps: 0
I0424 09:21:46.194403 133324340254528 pyconfig.py:471] Config param dropout_rate: 0.0
I0424 09:21:46.194421 133324340254528 pyconfig.py:471] Config param dtype: bfloat16
I0424 09:21:46.194452 133324340254528 pyconfig.py:471] Config param dtype_mm: float32
I0424 09:21:46.194469 133324340254528 pyconfig.py:471] Config param dump_hlo: False
I0424 09:21:46.194484 133324340254528 pyconfig.py:471] Config param dump_hlo_delete_local_after: True
I0424 09:21:46.194501 133324340254528 pyconfig.py:471] Config param dump_hlo_gcs_dir: /deps/maxtext_output/gpt3-52k_2026-04-24-09-21/xla_dump
I0424 09:21:46.194516 133324340254528 pyconfig.py:471] Config param dump_hlo_local_dir: /tmp/xla_dump/
I0424 09:21:46.194532 133324340254528 pyconfig.py:471] Config param dump_hlo_local_module_name: jit_train_step
I0424 09:21:46.194548 133324340254528 pyconfig.py:471] Config param dump_hlo_module_name: jit_train_step
I0424 09:21:46.194564 133324340254528 pyconfig.py:471] Config param dump_hlo_upload_all: False
I0424 09:21:46.194580 133324340254528 pyconfig.py:471] Config param dump_hlo_xla_flags: 
I0424 09:21:46.194595 133324340254528 pyconfig.py:471] Config param dump_jaxpr: False
I0424 09:21:46.194611 133324340254528 pyconfig.py:471] Config param dump_jaxpr_delete_local_after: True
I0424 09:21:46.194626 133324340254528 pyconfig.py:471] Config param dump_jaxpr_gcs_dir: /deps/maxtext_output/gpt3-52k_2026-04-24-09-21/jaxpr_dump
I0424 09:21:46.194642 133324340254528 pyconfig.py:471] Config param dump_jaxpr_local_dir: /tmp/jaxpr_dump/
I0424 09:21:46.194670 133324340254528 pyconfig.py:471] Config param dump_step: -1
I0424 09:21:46.194687 133324340254528 pyconfig.py:471] Config param elastic_enabled: False
I0424 09:21:46.194702 133324340254528 pyconfig.py:471] Config param elastic_max_retries: 10
I0424 09:21:46.194718 133324340254528 pyconfig.py:471] Config param elastic_timeout_seconds: 300
I0424 09:21:46.194734 133324340254528 pyconfig.py:471] Config param emb_dim: 16
I0424 09:21:46.194750 133324340254528 pyconfig.py:471] Config param enable_autocheckpoint: False
I0424 09:21:46.194766 133324340254528 pyconfig.py:471] Config param enable_checkpoint_cloud_logger: False
I0424 09:21:46.194780 133324340254528 pyconfig.py:471] Config param enable_checkpointing: True
I0424 09:21:46.194796 133324340254528 pyconfig.py:471] Config param enable_continuous_checkpointing: False
I0424 09:21:46.194811 133324340254528 pyconfig.py:471] Config param enable_data_shuffling: True
I0424 09:21:46.194827 133324340254528 pyconfig.py:471] Config param enable_diloco: False
I0424 09:21:46.194843 133324340254528 pyconfig.py:471] Config param enable_dp_attention: False
I0424 09:21:46.194859 133324340254528 pyconfig.py:471] Config param enable_dropout: False
I0424 09:21:46.194874 133324340254528 pyconfig.py:471] Config param enable_emergency_checkpoint: False
I0424 09:21:46.194889 133324340254528 pyconfig.py:471] Config param enable_expert_parallel: False
I0424 09:21:46.194905 133324340254528 pyconfig.py:471] Config param enable_gcp_goodput_metrics: True
I0424 09:21:46.194937 133324340254528 pyconfig.py:471] Config param enable_gcp_step_deviation_metrics: True
I0424 09:21:46.194952 133324340254528 pyconfig.py:471] Config param enable_goodput_recording: False
I0424 09:21:46.194967 133324340254528 pyconfig.py:471] Config param enable_jax_profiler: False
I0424 09:21:46.194983 133324340254528 pyconfig.py:471] Config param enable_llm_inference_pool: False
I0424 09:21:46.194997 133324340254528 pyconfig.py:471] Config param enable_model_warmup: False
I0424 09:21:46.195013 133324340254528 pyconfig.py:471] Config param enable_multi_tier_checkpointing: False
I0424 09:21:46.195029 133324340254528 pyconfig.py:471] Config param enable_nnx: False
I0424 09:21:46.195045 133324340254528 pyconfig.py:471] Config param enable_orbax_v1: False
I0424 09:21:46.195060 133324340254528 pyconfig.py:471] Config param enable_padding_causal_mask: True
I0424 09:21:46.195075 133324340254528 pyconfig.py:471] Config param enable_pathways_goodput: False
I0424 09:21:46.195092 133324340254528 pyconfig.py:471] Config param enable_prefix_caching: False
I0424 09:21:46.195110 133324340254528 pyconfig.py:471] Config param enable_rampup_batch_size: False
I0424 09:21:46.195126 133324340254528 pyconfig.py:471] Config param enable_single_controller: False
I0424 09:21:46.195141 133324340254528 pyconfig.py:471] Config param enable_single_replica_ckpt_restoring: False
I0424 09:21:46.195156 133324340254528 pyconfig.py:471] Config param enable_tensorboard: True
I0424 09:21:46.195171 133324340254528 pyconfig.py:471] Config param enable_tunix_perf_metrics: False
I0424 09:21:46.195187 133324340254528 pyconfig.py:471] Config param encoder_attention_heads_for_audio: 4
I0424 09:21:46.195203 133324340254528 pyconfig.py:471] Config param encoder_ffn_dim_for_audio: 512
I0424 09:21:46.195219 133324340254528 pyconfig.py:471] Config param encoder_layers_for_audio: 2
I0424 09:21:46.195235 133324340254528 pyconfig.py:471] Config param engram: RematLocation.REMAT
I0424 09:21:46.195250 133324340254528 pyconfig.py:471] Config param engram_head_dim: 1280
I0424 09:21:46.195266 133324340254528 pyconfig.py:471] Config param engram_kernel_size: 4
I0424 09:21:46.195282 133324340254528 pyconfig.py:471] Config param engram_layers: []
I0424 09:21:46.195298 133324340254528 pyconfig.py:471] Config param engram_max_ngram_size: 3
I0424 09:21:46.195313 133324340254528 pyconfig.py:471] Config param engram_num_heads: 8
I0424 09:21:46.195329 133324340254528 pyconfig.py:471] Config param engram_seed: 0
I0424 09:21:46.195347 133324340254528 pyconfig.py:471] Config param engram_vocab_bases: []
I0424 09:21:46.195363 133324340254528 pyconfig.py:471] Config param epsilon_high: None
I0424 09:21:46.195378 133324340254528 pyconfig.py:471] Config param eval_corr_lst: False
I0424 09:21:46.195393 133324340254528 pyconfig.py:471] Config param eval_data_columns: ['text']
I0424 09:21:46.195409 133324340254528 pyconfig.py:471] Config param eval_dataset_name: c4/en:3.0.1
I0424 09:21:46.195425 133324340254528 pyconfig.py:471] Config param eval_image_column: image
I0424 09:21:46.195441 133324340254528 pyconfig.py:471] Config param eval_interval: -1
I0424 09:21:46.195456 133324340254528 pyconfig.py:471] Config param eval_make_lst: False
I0424 09:21:46.195472 133324340254528 pyconfig.py:471] Config param eval_per_device_batch_size: 2
I0424 09:21:46.195488 133324340254528 pyconfig.py:471] Config param eval_sampling_strategy: greedy
I0424 09:21:46.195504 133324340254528 pyconfig.py:471] Config param eval_split: validation
I0424 09:21:46.195520 133324340254528 pyconfig.py:471] Config param eval_steps: -1
I0424 09:21:46.195536 133324340254528 pyconfig.py:471] Config param expansion_factor_real_data: -1.0
I0424 09:21:46.195553 133324340254528 pyconfig.py:471] Config param final_logits_soft_cap: None
I0424 09:21:46.195569 133324340254528 pyconfig.py:471] Config param first_num_dense_layers: 0
I0424 09:21:46.195585 133324340254528 pyconfig.py:471] Config param float32_gate_logits: False
I0424 09:21:46.195601 133324340254528 pyconfig.py:471] Config param float32_logits: False
I0424 09:21:46.195616 133324340254528 pyconfig.py:471] Config param float32_qk_product: False
I0424 09:21:46.195631 133324340254528 pyconfig.py:471] Config param float32_weight_sum: True
I0424 09:21:46.195647 133324340254528 pyconfig.py:471] Config param force_q_layout: False
I0424 09:21:46.195670 133324340254528 pyconfig.py:471] Config param force_unroll: False
I0424 09:21:46.195686 133324340254528 pyconfig.py:471] Config param freeze_audio_encoder_params: True
I0424 09:21:46.195702 133324340254528 pyconfig.py:471] Config param freeze_vision_encoder_params: True
I0424 09:21:46.195718 133324340254528 pyconfig.py:471] Config param fused_mlp: False
I0424 09:21:46.195734 133324340254528 pyconfig.py:471] Config param fused_qkv: True
I0424 09:21:46.195750 133324340254528 pyconfig.py:471] Config param gcs_metrics: False
I0424 09:21:46.195766 133324340254528 pyconfig.py:471] Config param gdn_chunk_size: 64
I0424 09:21:46.195782 133324340254528 pyconfig.py:471] Config param gdn_conv_kernel_dim: 4
I0424 09:21:46.195797 133324340254528 pyconfig.py:471] Config param gdn_key_head_dim: 128
I0424 09:21:46.195813 133324340254528 pyconfig.py:471] Config param gdn_num_key_heads: 16
I0424 09:21:46.195829 133324340254528 pyconfig.py:471] Config param gdn_num_value_heads: 32
I0424 09:21:46.195845 133324340254528 pyconfig.py:471] Config param gdn_value_head_dim: 128
I0424 09:21:46.195861 133324340254528 pyconfig.py:471] Config param generate_padding_batch_eval: False
I0424 09:21:46.195877 133324340254528 pyconfig.py:471] Config param generate_padding_batch_train: False
I0424 09:21:46.195894 133324340254528 pyconfig.py:471] Config param generate_slice: v5e-16
I0424 09:21:46.195909 133324340254528 pyconfig.py:471] Config param generation_configs: {}
I0424 09:21:46.195925 133324340254528 pyconfig.py:471] Config param global_batch_size_to_eval_on: 64
I0424 09:21:46.195945 133324340254528 pyconfig.py:471] Config param global_batch_size_to_load: 512
I0424 09:21:46.195960 133324340254528 pyconfig.py:471] Config param global_batch_size_to_load_eval: 64
I0424 09:21:46.195975 133324340254528 pyconfig.py:471] Config param global_batch_size_to_load_increment: None
I0424 09:21:46.195991 133324340254528 pyconfig.py:471] Config param global_batch_size_to_load_start: None
I0424 09:21:46.196007 133324340254528 pyconfig.py:471] Config param global_batch_size_to_train_on: 512
I0424 09:21:46.196022 133324340254528 pyconfig.py:471] Config param global_head_dim: 0
I0424 09:21:46.196038 133324340254528 pyconfig.py:471] Config param global_num_kv_heads: 0
I0424 09:21:46.196054 133324340254528 pyconfig.py:471] Config param global_parameter_scale: 1
I0424 09:21:46.196070 133324340254528 pyconfig.py:471] Config param global_rampup_samples: 500
I0424 09:21:46.196086 133324340254528 pyconfig.py:471] Config param global_rope_max_timescale: -1
I0424 09:21:46.196102 133324340254528 pyconfig.py:471] Config param global_rope_proportion: 0.25
I0424 09:21:46.196120 133324340254528 pyconfig.py:471] Config param goodput_upload_interval_seconds: 30
I0424 09:21:46.196137 133324340254528 pyconfig.py:471] Config param grad_dtype: float32
I0424 09:21:46.196171 133324340254528 pyconfig.py:471] Config param gradient_accumulation_steps: 8
I0424 09:21:46.196188 133324340254528 pyconfig.py:471] Config param gradient_clipping_threshold: 1.0
I0424 09:21:46.196204 133324340254528 pyconfig.py:471] Config param grain_data_source_max_workers: 16
I0424 09:21:46.196219 133324340254528 pyconfig.py:471] Config param grain_eval_files: 
I0424 09:21:46.196235 133324340254528 pyconfig.py:471] Config param grain_file_type: arrayrecord
I0424 09:21:46.196251 133324340254528 pyconfig.py:471] Config param grain_num_threads: 16
I0424 09:21:46.196267 133324340254528 pyconfig.py:471] Config param grain_num_threads_eval: 16
I0424 09:21:46.196283 133324340254528 pyconfig.py:471] Config param grain_packing_type: first_fit
I0424 09:21:46.196299 133324340254528 pyconfig.py:471] Config param grain_per_worker_buffer_size: 1
I0424 09:21:46.196315 133324340254528 pyconfig.py:471] Config param grain_per_worker_buffer_size_eval: 1
I0424 09:21:46.196331 133324340254528 pyconfig.py:471] Config param grain_prefetch_buffer_size: 500
I0424 09:21:46.196347 133324340254528 pyconfig.py:471] Config param grain_prefetch_buffer_size_eval: 500
I0424 09:21:46.196363 133324340254528 pyconfig.py:471] Config param grain_ram_budget_mb: 1024
I0424 09:21:46.196379 133324340254528 pyconfig.py:471] Config param grain_shuffle_buffer_size: 100
I0424 09:21:46.196395 133324340254528 pyconfig.py:471] Config param grain_train_files: 
I0424 09:21:46.196411 133324340254528 pyconfig.py:471] Config param grain_train_mixture_config_path: 
I0424 09:21:46.196427 133324340254528 pyconfig.py:471] Config param grain_worker_count: 1
I0424 09:21:46.196443 133324340254528 pyconfig.py:471] Config param grain_worker_count_eval: 1
I0424 09:21:46.196458 133324340254528 pyconfig.py:471] Config param grpo_beta: 0.08
I0424 09:21:46.196474 133324340254528 pyconfig.py:471] Config param grpo_epsilon: 0.2
I0424 09:21:46.196491 133324340254528 pyconfig.py:471] Config param hardware: tpu
I0424 09:21:46.196507 133324340254528 pyconfig.py:471] Config param hbm_utilization_vllm: 0.72
I0424 09:21:46.196523 133324340254528 pyconfig.py:471] Config param head_dim: 8
I0424 09:21:46.196538 133324340254528 pyconfig.py:471] Config param heartbeat_reporting_interval_in_seconds: 5
I0424 09:21:46.196554 133324340254528 pyconfig.py:471] Config param hf_data_dir: None
I0424 09:21:46.196571 133324340254528 pyconfig.py:471] Config param hf_eval_files: None
I0424 09:21:46.196587 133324340254528 pyconfig.py:471] Config param hf_eval_split: None
I0424 09:21:46.196603 133324340254528 pyconfig.py:471] Config param hf_name: None
I0424 09:21:46.196620 133324340254528 pyconfig.py:471] Config param hf_path: OptimalScale/ClimbMix
I0424 09:21:46.196637 133324340254528 pyconfig.py:471] Config param hf_train_files: None
I0424 09:21:46.196661 133324340254528 pyconfig.py:471] Config param hidden_size_for_vit: 1408
I0424 09:21:46.196678 133324340254528 pyconfig.py:471] Config param hide_profiler_step_metric: False
I0424 09:21:46.196692 133324340254528 pyconfig.py:471] Config param ici_autoregressive_parallelism: 1
I0424 09:21:46.196708 133324340254528 pyconfig.py:471] Config param ici_context_autoregressive_parallelism: 1
I0424 09:21:46.196723 133324340254528 pyconfig.py:471] Config param ici_context_parallelism: 1
I0424 09:21:46.196738 133324340254528 pyconfig.py:471] Config param ici_data_parallelism: 1
I0424 09:21:46.196754 133324340254528 pyconfig.py:471] Config param ici_diloco_parallelism: 1
I0424 09:21:46.196770 133324340254528 pyconfig.py:471] Config param ici_expert_parallelism: 1
I0424 09:21:46.196786 133324340254528 pyconfig.py:471] Config param ici_fsdp_parallelism: -1
I0424 09:21:46.196801 133324340254528 pyconfig.py:471] Config param ici_fsdp_transpose_parallelism: 1
I0424 09:21:46.196816 133324340254528 pyconfig.py:471] Config param ici_parallelism: [1, 1, 1, -1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
I0424 09:21:46.196834 133324340254528 pyconfig.py:471] Config param ici_pipeline_parallelism: 1
I0424 09:21:46.196849 133324340254528 pyconfig.py:471] Config param ici_sequence_parallelism: 1
I0424 09:21:46.196864 133324340254528 pyconfig.py:471] Config param ici_tensor_parallelism: 1
I0424 09:21:46.196880 133324340254528 pyconfig.py:471] Config param ici_tensor_sequence_parallelism: 1
I0424 09:21:46.196894 133324340254528 pyconfig.py:471] Config param ici_tensor_transpose_parallelism: 1
I0424 09:21:46.196910 133324340254528 pyconfig.py:471] Config param image_path: 
I0424 09:21:46.196926 133324340254528 pyconfig.py:471] Config param image_placeholder: <|image|>
I0424 09:21:46.196945 133324340254528 pyconfig.py:471] Config param image_size_for_vit: 896
I0424 09:21:46.196961 133324340254528 pyconfig.py:471] Config param indexer_head_dim: 128
I0424 09:21:46.196975 133324340254528 pyconfig.py:471] Config param indexer_loss_scaling_factor: 0.0
I0424 09:21:46.196991 133324340254528 pyconfig.py:471] Config param indexer_n_heads: 64
I0424 09:21:46.197006 133324340254528 pyconfig.py:471] Config param indexer_sparse_training: False
I0424 09:21:46.197022 133324340254528 pyconfig.py:471] Config param indexer_topk: 2048
I0424 09:21:46.197038 133324340254528 pyconfig.py:471] Config param inference_benchmark_test: False
I0424 09:21:46.197054 133324340254528 pyconfig.py:471] Config param inference_metadata_file: 
I0424 09:21:46.197068 133324340254528 pyconfig.py:471] Config param inference_microbenchmark_log_file_path: 
I0424 09:21:46.197084 133324340254528 pyconfig.py:471] Config param inference_microbenchmark_loop_iters: 10
I0424 09:21:46.197099 133324340254528 pyconfig.py:471] Config param inference_microbenchmark_num_samples: [1, 2, 3, 4, 5]
I0424 09:21:46.197116 133324340254528 pyconfig.py:471] Config param inference_microbenchmark_prefill_lengths: 64,128,256,512,1024
I0424 09:21:46.197132 133324340254528 pyconfig.py:471] Config param inference_microbenchmark_stages: prefill,generate
I0424 09:21:46.197146 133324340254528 pyconfig.py:471] Config param inference_server: MaxtextInterleavedServer
I0424 09:21:46.197162 133324340254528 pyconfig.py:471] Config param inhomogeneous_layer_cycle_interval: 1
I0424 09:21:46.197178 133324340254528 pyconfig.py:471] Config param init_weights_seed: 0
I0424 09:21:46.197194 133324340254528 pyconfig.py:471] Config param input_data_sharding_logical_axes: ['activation_embed_and_logits_batch', 'activation_norm_length']
I0424 09:21:46.197210 133324340254528 pyconfig.py:471] Config param interleave_moe_layer_step: 1
I0424 09:21:46.197226 133324340254528 pyconfig.py:471] Config param intermediate_size_for_vit: 5632
I0424 09:21:46.197241 133324340254528 pyconfig.py:471] Config param internal_compile: False
I0424 09:21:46.197257 133324340254528 pyconfig.py:471] Config param internal_compile_num_devices: -1
I0424 09:21:46.197272 133324340254528 pyconfig.py:471] Config param jax_cache_dir: ~/jax_cache
I0424 09:21:46.197288 133324340254528 pyconfig.py:471] Config param jax_debug_log_modules: 
I0424 09:21:46.197304 133324340254528 pyconfig.py:471] Config param jax_distributed_initialization_timeout: 300
I0424 09:21:46.197320 133324340254528 pyconfig.py:471] Config param jax_profiler_port: 9999
I0424 09:21:46.197335 133324340254528 pyconfig.py:471] Config param key_proj: RematLocation.REMAT
I0424 09:21:46.197353 133324340254528 pyconfig.py:471] Config param kv_cache_buffer: 256
I0424 09:21:46.197369 133324340254528 pyconfig.py:471] Config param kv_lora_rank: 512
I0424 09:21:46.197385 133324340254528 pyconfig.py:471] Config param kv_quant_axis: KvQuantAxis.HEADS_AND_DKV
I0424 09:21:46.197403 133324340254528 pyconfig.py:471] Config param kv_quant_dtype: int8
I0424 09:21:46.197420 133324340254528 pyconfig.py:471] Config param kv_wa_proj: RematLocation.REMAT
I0424 09:21:46.197436 133324340254528 pyconfig.py:471] Config param learning_rate: 0.0002
I0424 09:21:46.197453 133324340254528 pyconfig.py:471] Config param learning_rate_final_fraction: 0.1
I0424 09:21:46.197469 133324340254528 pyconfig.py:471] Config param learning_rate_schedule_steps: 200000
I0424 09:21:46.197486 133324340254528 pyconfig.py:471] Config param load_balance_loss_weight: 0.0
I0424 09:21:46.197501 133324340254528 pyconfig.py:471] Config param load_checkpoint_only_once: False
I0424 09:21:46.197518 133324340254528 pyconfig.py:471] Config param load_from_prefill_dir: False
I0424 09:21:46.197533 133324340254528 pyconfig.py:471] Config param load_full_state_path: 
I0424 09:21:46.197550 133324340254528 pyconfig.py:471] Config param load_parameters_path: gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items
I0424 09:21:46.197566 133324340254528 pyconfig.py:471] Config param local_checkpoint_directory: 
I0424 09:21:46.197582 133324340254528 pyconfig.py:471] Config param local_checkpoint_period: 0
I0424 09:21:46.197597 133324340254528 pyconfig.py:471] Config param local_rope_max_timescale: -1
I0424 09:21:46.197614 133324340254528 pyconfig.py:471] Config param local_rope_proportion: 1.0
I0424 09:21:46.197630 133324340254528 pyconfig.py:471] Config param log_config: True
I0424 09:21:46.197646 133324340254528 pyconfig.py:471] Config param log_period: 10
I0424 09:21:46.197684 133324340254528 pyconfig.py:471] Config param logical_axis_rules: (('activation_embed_and_logits_batch', ('data', 'stage', 'fsdp', 'fsdp_transpose', 'expert')), ('activation_embed_and_logits_batch_sequence', ('data', 'stage', 'fsdp', 'fsdp_transpose', 'sequence', 'context', 'expert')), ('activation_vocab', ('tensor', 'tensor_transpose', 'tensor_sequence')), ('activation_vocab', ('tensor', 'tensor_transpose')), ('activation_vocab', 'tensor_sequence'), ('activation_vocab', ('sequence', 'context')), ('vocab', ('tensor', 'tensor_transpose', 'tensor_sequence', 'autoregressive')), ('embed_vocab', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'expert')), ('activation_heads', ('tensor', 'tensor_transpose', 'sequence', 'tensor_sequence', 'autoregressive')), ('activation_kv_heads', ('tensor', 'tensor_transpose', 'sequence', 'tensor_sequence')), ('activation_attn_length', ('sequence', 'context')), ('activation_attn_length', ('context',)), ('activation_q_length', ('context',)), ('activation_kv_length', ()), ('activation_attn_embed', ('tensor', 'tensor_transpose')), ('activation_kv', ('tensor', 'tensor_transpose', 'tensor_sequence')), ('activation_kv_batch', ('data', 'fsdp', 'fsdp_transpose', 'expert')), ('activation_kv_head_dim', ('tensor', 'tensor_transpose', 'tensor_sequence')), ('heads', ('tensor', 'tensor_transpose', 'tensor_sequence', 'autoregressive')), ('q_heads', ('tensor', 'tensor_transpose', 'tensor_sequence', 'autoregressive')), ('kv_heads', ('tensor', 'tensor_transpose', 'tensor_sequence', 'autoregressive')), ('qkv', ()), ('kv', ()), ('kv_head_dim', ()), ('q_lora', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'tensor_transpose', 'expert')), ('q_lora', ('fsdp', 'sequence', 'context', 'tensor_transpose', 'expert')), ('q_lora', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'expert')), ('q_lora', ('fsdp', 'sequence', 'context', 'expert')), ('q_lora_up_proj', ()), ('kv_lora', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'tensor_transpose', 'expert')), ('kv_lora', ('fsdp', 'sequence', 'context', 'tensor_transpose', 'expert')), ('kv_lora', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'expert')), ('kv_lora', ('fsdp', 'sequence', 'context', 'expert')), ('kv_lora_up_proj', ()), ('activation_batch_moe', ('data', 'fsdp', 'fsdp_transpose')), ('activation_length_moe', ('sequence', 'context')), ('activation_length_moe', ('context',)), ('activation_norm_length_moe', ('tensor_sequence', 'context', 'sequence')), ('activation_embed_moe', ('tensor', 'tensor_transpose')), ('activation_mlp_moe', ('tensor', 'tensor_transpose', 'tensor_sequence')), ('activation_exp', ('expert',)), ('exp', 'expert'), ('mlp_moe', ('fsdp_transpose', 'tensor', 'tensor_sequence', 'autoregressive')), ('embed_moe', ('fsdp', 'fsdp_transpose', 'sequence', 'tensor_transpose', 'context')), ('embed_moe', ('fsdp', 'sequence', 'tensor_transpose', 'context')), ('embed_moe', ('fsdp', 'fsdp_transpose', 'sequence', 'context')), ('embed_moe', ('fsdp', 'sequence', 'context')), ('activation_mlp', ('tensor', 'tensor_transpose', 'tensor_sequence')), ('activation_batch', ('data', 'fsdp', 'fsdp_transpose', 'expert')), ('activation_length', ('sequence', 'context')), ('activation_length', ('context',)), ('activation_norm_length', ('tensor_sequence', 'context', 'sequence')), ('activation_embed', ('tensor', 'tensor_transpose')), ('activation_stage', 'stage'), ('mlp', ('fsdp_transpose', 'tensor', 'tensor_sequence', 'autoregressive')), ('embed', ('fsdp', 'fsdp_transpose', 'sequence', 'tensor_transpose', 'context', 'expert')), ('embed', ('fsdp', 'sequence', 'tensor_transpose', 'context', 'expert')), ('embed', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'expert')), ('embed', ('fsdp', 'sequence', 'context', 'expert')), ('norm', ('tensor', 'tensor_transpose')), ('layers', 'stage'), ('diloco', 'diloco'), ('engram_dim', ('tensor',)), ('dense_layers', ()), ('moe_layers', ()), ('mhc', ()), ('prefill_activation_length', ('sequence', 'context')), ('prefill_activation_norm_length', ('tensor_sequence', 'context', 'sequence')), ('activation_prefill_kv_batch', ('data', 'fsdp', 'fsdp_transpose', 'expert')), ('decode_batch', ('data', 'fsdp', 'fsdp_transpose', 'expert')), ('decode_length', ('sequence',)), ('cache_heads', ('autoregressive', 'tensor', 'tensor_transpose', 'tensor_sequence')), ('cache_heads', ('autoregressive', 'tensor', 'tensor_sequence')), ('paged_kv_heads', ('tensor',)), ('cache_batch_prefill', ()), ('cache_batch', ()), ('cache_heads_none', ()), ('cache_kv', ()), ('cache_sequence', ()), ('num_pages', ()), ('tokens_per_page', ()), ('paged_kv_head_dim_size', ()), ('mlp_no_fsdp', ('tensor', 'tensor_sequence', 'autoregressive')), ('embed_tensor_transpose', ('tensor_transpose',)), ('exp_with_fsdp', 'fsdp'))
I0424 09:21:46.197755 133324340254528 pyconfig.py:471] Config param logits_dot_in_fp32: False
I0424 09:21:46.197789 133324340254528 pyconfig.py:471] Config param logits_via_embedding: True
I0424 09:21:46.197806 133324340254528 pyconfig.py:471] Config param lora_input_adapters_path: 
I0424 09:21:46.197822 133324340254528 pyconfig.py:471] Config param loss_algo: grpo
I0424 09:21:46.197838 133324340254528 pyconfig.py:471] Config param lr_schedule_type: LearningRateScheduleType.COSINE
I0424 09:21:46.197857 133324340254528 pyconfig.py:471] Config param managed_mldiagnostics: False
I0424 09:21:46.197872 133324340254528 pyconfig.py:471] Config param managed_mldiagnostics_dir: /deps/maxtext_output/gpt3-52k_2026-04-24-09-21/managed-mldiagnostics
I0424 09:21:46.197888 133324340254528 pyconfig.py:471] Config param managed_mldiagnostics_run_group: 
I0424 09:21:46.197904 133324340254528 pyconfig.py:471] Config param matmul_precision: MatmulPrecision.DEFAULT
I0424 09:21:46.197922 133324340254528 pyconfig.py:471] Config param max_checkify: False
I0424 09:21:46.197942 133324340254528 pyconfig.py:471] Config param max_concurrency: 256
I0424 09:21:46.197958 133324340254528 pyconfig.py:471] Config param max_corpus_chars: 10000000
I0424 09:21:46.197973 133324340254528 pyconfig.py:471] Config param max_num_batched_tokens: None
I0424 09:21:46.197989 133324340254528 pyconfig.py:471] Config param max_num_checkpoints_to_keep: None
I0424 09:21:46.198005 133324340254528 pyconfig.py:471] Config param max_num_images_per_example: -1
I0424 09:21:46.198021 133324340254528 pyconfig.py:471] Config param max_num_seqs: None
I0424 09:21:46.198036 133324340254528 pyconfig.py:471] Config param max_position_embeddings: 163840
I0424 09:21:46.198052 133324340254528 pyconfig.py:471] Config param max_prefill_predict_length: 64
I0424 09:21:46.198067 133324340254528 pyconfig.py:471] Config param max_sample_len_for_audio: 10000
I0424 09:21:46.198082 133324340254528 pyconfig.py:471] Config param max_segments_per_seq: -1
I0424 09:21:46.198098 133324340254528 pyconfig.py:471] Config param max_source_positions_for_audio: 1500
I0424 09:21:46.198114 133324340254528 pyconfig.py:471] Config param max_target_length: 2048
I0424 09:21:46.198130 133324340254528 pyconfig.py:471] Config param max_timescale_for_audio: 10000.0
I0424 09:21:46.198147 133324340254528 pyconfig.py:471] Config param megablox: True
I0424 09:21:46.198161 133324340254528 pyconfig.py:471] Config param merge_gating_gmm: False
I0424 09:21:46.198177 133324340254528 pyconfig.py:471] Config param mesh_axes: ['diloco', 'data', 'stage', 'fsdp', 'fsdp_transpose', 'sequence', 'context', 'context_autoregressive', 'tensor', 'tensor_transpose', 'tensor_sequence', 'expert', 'autoregressive']
I0424 09:21:46.198196 133324340254528 pyconfig.py:471] Config param metrics_dir: /deps/maxtext_output/gpt3-52k_2026-04-24-09-21/metrics/
I0424 09:21:46.198212 133324340254528 pyconfig.py:471] Config param metrics_file: 
I0424 09:21:46.198228 133324340254528 pyconfig.py:471] Config param mhc_expansion_rate: 1
I0424 09:21:46.198244 133324340254528 pyconfig.py:471] Config param micro_batch_size_to_eval_on: 64
I0424 09:21:46.198259 133324340254528 pyconfig.py:471] Config param micro_batch_size_to_train_on: 64
I0424 09:21:46.198276 133324340254528 pyconfig.py:471] Config param mla_kv: RematLocation.REMAT
I0424 09:21:46.198292 133324340254528 pyconfig.py:471] Config param mla_naive_kvcache: True
I0424 09:21:46.198308 133324340254528 pyconfig.py:471] Config param mla_q: RematLocation.REMAT
I0424 09:21:46.198324 133324340254528 pyconfig.py:471] Config param mlp_activations: ['gelu']
I0424 09:21:46.198340 133324340254528 pyconfig.py:471] Config param mlp_activations_limit: -1.0
I0424 09:21:46.198355 133324340254528 pyconfig.py:471] Config param mlp_bias: False
I0424 09:21:46.198371 133324340254528 pyconfig.py:471] Config param mlp_dim: 64
I0424 09:21:46.198387 133324340254528 pyconfig.py:471] Config param mlpwi: RematLocation.REMAT
I0424 09:21:46.198403 133324340254528 pyconfig.py:471] Config param mlpwi_0: RematLocation.REMAT
I0424 09:21:46.198419 133324340254528 pyconfig.py:471] Config param mlpwi_1: RematLocation.REMAT
I0424 09:21:46.198435 133324340254528 pyconfig.py:471] Config param mlpwo: RematLocation.REMAT
I0424 09:21:46.198451 133324340254528 pyconfig.py:471] Config param moba: False
I0424 09:21:46.198467 133324340254528 pyconfig.py:471] Config param moba_chunk_size: 1024
I0424 09:21:46.198483 133324340254528 pyconfig.py:471] Config param moba_topk: 8
I0424 09:21:46.198499 133324340254528 pyconfig.py:471] Config param model_call_mode: 
I0424 09:21:46.198514 133324340254528 pyconfig.py:471] Config param model_name: gpt3-52k
I0424 09:21:46.198530 133324340254528 pyconfig.py:471] Config param moe_expert_input_dim: -1
I0424 09:21:46.198545 133324340254528 pyconfig.py:471] Config param moe_fsdp_use_two_stage_all_gather: False
I0424 09:21:46.198561 133324340254528 pyconfig.py:471] Config param moe_mlp_dim: -1
I0424 09:21:46.198577 133324340254528 pyconfig.py:471] Config param moe_mlpwi_0: RematLocation.REMAT
I0424 09:21:46.198593 133324340254528 pyconfig.py:471] Config param moe_mlpwi_1: RematLocation.REMAT
I0424 09:21:46.198609 133324340254528 pyconfig.py:471] Config param moe_mlpwo: RematLocation.REMAT
I0424 09:21:46.198625 133324340254528 pyconfig.py:471] Config param monitor_goodput: False
I0424 09:21:46.198640 133324340254528 pyconfig.py:471] Config param monitor_step_time_deviation: True
I0424 09:21:46.198665 133324340254528 pyconfig.py:471] Config param mrope_section: [24, 20, 20]
I0424 09:21:46.198682 133324340254528 pyconfig.py:471] Config param mscale: 1.0
I0424 09:21:46.198698 133324340254528 pyconfig.py:471] Config param mtc_data_parallelism: 0
I0424 09:21:46.198714 133324340254528 pyconfig.py:471] Config param mtp_eval_target_module: 0
I0424 09:21:46.198730 133324340254528 pyconfig.py:471] Config param mtp_loss_scaling_factor: 0.1
I0424 09:21:46.198746 133324340254528 pyconfig.py:471] Config param mtp_num_layers: 0
I0424 09:21:46.198762 133324340254528 pyconfig.py:471] Config param mu_dtype: float32
I0424 09:21:46.198786 133324340254528 pyconfig.py:471] Config param multi_sampling: False
I0424 09:21:46.198802 133324340254528 pyconfig.py:471] Config param multi_tier_checkpointing_backup_interval_minutes: 0
I0424 09:21:46.198819 133324340254528 pyconfig.py:471] Config param muon_beta: 0.95
I0424 09:21:46.198835 133324340254528 pyconfig.py:471] Config param muon_consistent_rms: None
I0424 09:21:46.198852 133324340254528 pyconfig.py:471] Config param muon_weight_decay: 0.0
I0424 09:21:46.198868 133324340254528 pyconfig.py:471] Config param n_routing_groups: -1
I0424 09:21:46.198883 133324340254528 pyconfig.py:471] Config param n_window_for_audio: 50
I0424 09:21:46.198899 133324340254528 pyconfig.py:471] Config param n_window_infer_for_audio: 800
I0424 09:21:46.198915 133324340254528 pyconfig.py:471] Config param nope_layer_interval: -1
I0424 09:21:46.198931 133324340254528 pyconfig.py:471] Config param norm_topk_prob: False
I0424 09:21:46.198951 133324340254528 pyconfig.py:471] Config param normalization_layer_epsilon: 1e-05
I0424 09:21:46.198968 133324340254528 pyconfig.py:471] Config param normalize_embedding_logits: False
I0424 09:21:46.198984 133324340254528 pyconfig.py:471] Config param num_attention_heads_for_vit: 16
I0424 09:21:46.198999 133324340254528 pyconfig.py:471] Config param num_batches: 4
I0424 09:21:46.199014 133324340254528 pyconfig.py:471] Config param num_channels_for_vit: 3
I0424 09:21:46.199029 133324340254528 pyconfig.py:471] Config param num_conv_layers_for_audio: 3
I0424 09:21:46.199045 133324340254528 pyconfig.py:471] Config param num_decoder_layers: 1
I0424 09:21:46.199061 133324340254528 pyconfig.py:471] Config param num_diloco_replicas: 1
I0424 09:21:46.199077 133324340254528 pyconfig.py:471] Config param num_epoch: 1
I0424 09:21:46.199093 133324340254528 pyconfig.py:471] Config param num_eval_passes: 1
I0424 09:21:46.199108 133324340254528 pyconfig.py:471] Config param num_experts: 1
I0424 09:21:46.199123 133324340254528 pyconfig.py:471] Config param num_experts_per_tok: 1
I0424 09:21:46.199140 133324340254528 pyconfig.py:471] Config param num_generations: 2
I0424 09:21:46.199155 133324340254528 pyconfig.py:471] Config param num_hidden_layers_for_vit: 34
I0424 09:21:46.199171 133324340254528 pyconfig.py:471] Config param num_iterations: 1
I0424 09:21:46.199187 133324340254528 pyconfig.py:471] Config param num_kv_heads: 2
I0424 09:21:46.199204 133324340254528 pyconfig.py:471] Config param num_layers_per_pipeline_stage: 1
I0424 09:21:46.199219 133324340254528 pyconfig.py:471] Config param num_mel_bins_for_audio: 128
I0424 09:21:46.199235 133324340254528 pyconfig.py:471] Config param num_pipeline_microbatches: -1
I0424 09:21:46.199251 133324340254528 pyconfig.py:471] Config param num_pipeline_repeats: -1
I0424 09:21:46.199268 133324340254528 pyconfig.py:471] Config param num_position_embeddings_for_vit: 1024
I0424 09:21:46.199284 133324340254528 pyconfig.py:471] Config param num_query_heads: 2
I0424 09:21:46.199298 133324340254528 pyconfig.py:471] Config param num_samplers_slices: -1
I0424 09:21:46.199314 133324340254528 pyconfig.py:471] Config param num_slices: 1
I0424 09:21:46.199330 133324340254528 pyconfig.py:471] Config param num_target_devices: 32
I0424 09:21:46.199345 133324340254528 pyconfig.py:471] Config param num_test_batches: 5
I0424 09:21:46.199361 133324340254528 pyconfig.py:471] Config param num_trainer_slices: -1
I0424 09:21:46.199377 133324340254528 pyconfig.py:471] Config param num_vocab_tiling: 1
I0424 09:21:46.199392 133324340254528 pyconfig.py:471] Config param off_policy_steps: 0
I0424 09:21:46.199408 133324340254528 pyconfig.py:471] Config param offline_data_dir: None
I0424 09:21:46.199423 133324340254528 pyconfig.py:471] Config param opt_type: OptimizerType.ADAM_PAX
I0424 09:21:46.199440 133324340254528 pyconfig.py:471] Config param optimize_mesh_for_tpu_v6e: False
I0424 09:21:46.199455 133324340254528 pyconfig.py:471] Config param optimizer_memory_host_offload: False
I0424 09:21:46.199471 133324340254528 pyconfig.py:471] Config param original_max_position_embeddings: 4096
I0424 09:21:46.199486 133324340254528 pyconfig.py:471] Config param out_hidden_size_for_vit: 512
I0424 09:21:46.199501 133324340254528 pyconfig.py:471] Config param out_proj: RematLocation.REMAT
I0424 09:21:46.199517 133324340254528 pyconfig.py:471] Config param output_dim_for_audio: 512
I0424 09:21:46.199533 133324340254528 pyconfig.py:471] Config param override_logical_axis_rules: False
I0424 09:21:46.199549 133324340254528 pyconfig.py:471] Config param override_model_config: True
I0424 09:21:46.199565 133324340254528 pyconfig.py:471] Config param packing: True
I0424 09:21:46.199580 133324340254528 pyconfig.py:471] Config param pagedattn_head_dim_alignment: 128
I0424 09:21:46.199596 133324340254528 pyconfig.py:471] Config param pagedattn_max_pages_per_group: -1
I0424 09:21:46.199611 133324340254528 pyconfig.py:471] Config param pagedattn_num_pages: 64
I0424 09:21:46.199627 133324340254528 pyconfig.py:471] Config param pagedattn_pages_per_compute_block: 4
I0424 09:21:46.199643 133324340254528 pyconfig.py:471] Config param pagedattn_tokens_per_page: 32
I0424 09:21:46.199668 133324340254528 pyconfig.py:471] Config param param_scan_axis: 1
I0424 09:21:46.199684 133324340254528 pyconfig.py:471] Config param parameter_memory_host_offload: False
I0424 09:21:46.199698 133324340254528 pyconfig.py:471] Config param partial_rotary_factor: 1.0
I0424 09:21:46.199715 133324340254528 pyconfig.py:471] Config param patch_size_for_vit: 14
I0424 09:21:46.199731 133324340254528 pyconfig.py:471] Config param penalty_incorrect_answer: -1.0
I0424 09:21:46.199747 133324340254528 pyconfig.py:471] Config param penalty_incorrect_format: -0.5
I0424 09:21:46.199764 133324340254528 pyconfig.py:471] Config param per_device_batch_size: 2
I0424 09:21:46.199779 133324340254528 pyconfig.py:471] Config param per_device_batch_size_increment: 2.0
I0424 09:21:46.199794 133324340254528 pyconfig.py:471] Config param per_device_batch_size_start: 4.0
I0424 09:21:46.199810 133324340254528 pyconfig.py:471] Config param pipeline_delay_activation_forwarding: False
I0424 09:21:46.199826 133324340254528 pyconfig.py:471] Config param pipeline_fsdp_ag_once: False
I0424 09:21:46.199841 133324340254528 pyconfig.py:471] Config param pipeline_fsdp_ag_per_repeat: False
I0424 09:21:46.199857 133324340254528 pyconfig.py:471] Config param pipeline_parallel_layers: 1
I0424 09:21:46.199872 133324340254528 pyconfig.py:471] Config param pixel_shuffle_ratio_for_vit: 0.5
I0424 09:21:46.199888 133324340254528 pyconfig.py:471] Config param posemb_type_for_vit: learn
I0424 09:21:46.199904 133324340254528 pyconfig.py:471] Config param position_id_per_seconds: 25
I0424 09:21:46.199920 133324340254528 pyconfig.py:471] Config param prefill_cache_axis_order: 1,2,0,3
I0424 09:21:46.199939 133324340254528 pyconfig.py:471] Config param prefill_cache_dir: 
I0424 09:21:46.199955 133324340254528 pyconfig.py:471] Config param prefill_chunk_size: 256
I0424 09:21:46.199971 133324340254528 pyconfig.py:471] Config param prefill_slice: v5e-16
I0424 09:21:46.199987 133324340254528 pyconfig.py:471] Config param prefix_caching_dram_byte: 100000000000
I0424 09:21:46.200003 133324340254528 pyconfig.py:471] Config param prefix_caching_hbm_byte: 10000000000
I0424 09:21:46.200019 133324340254528 pyconfig.py:471] Config param profile_cleanly: True
I0424 09:21:46.200034 133324340254528 pyconfig.py:471] Config param profile_periodically_period: -1
I0424 09:21:46.200051 133324340254528 pyconfig.py:471] Config param profile_power_events: False
I0424 09:21:46.200067 133324340254528 pyconfig.py:471] Config param profiler: ProfilerType.NONE
I0424 09:21:46.200084 133324340254528 pyconfig.py:471] Config param profiler_steps: 5
I0424 09:21:46.200099 133324340254528 pyconfig.py:471] Config param projector_dropout_for_vit: 0.0
I0424 09:21:46.200115 133324340254528 pyconfig.py:471] Config param projector_input_dim_for_vit: 4096
I0424 09:21:46.200130 133324340254528 pyconfig.py:471] Config param projector_output_dim_for_vit: 4096
I0424 09:21:46.200146 133324340254528 pyconfig.py:471] Config param prometheus_port: 0
I0424 09:21:46.200161 133324340254528 pyconfig.py:471] Config param prompt: I love to
I0424 09:21:46.200176 133324340254528 pyconfig.py:471] Config param pure_nnx: False
I0424 09:21:46.200192 133324340254528 pyconfig.py:471] Config param pure_nnx_decoder: False
I0424 09:21:46.200208 133324340254528 pyconfig.py:471] Config param q_lora_rank: 0
I0424 09:21:46.200224 133324340254528 pyconfig.py:471] Config param qk_clip_threshold: 100.0
I0424 09:21:46.200240 133324340254528 pyconfig.py:471] Config param qk_nope_head_dim: 128
I0424 09:21:46.200255 133324340254528 pyconfig.py:471] Config param qk_norm_with_scale: True
I0424 09:21:46.200269 133324340254528 pyconfig.py:471] Config param qk_rope_head_dim: 64
I0424 09:21:46.200286 133324340254528 pyconfig.py:471] Config param qkv_proj: RematLocation.REMAT
I0424 09:21:46.200303 133324340254528 pyconfig.py:471] Config param quant_cfg_path: 
I0424 09:21:46.200319 133324340254528 pyconfig.py:471] Config param quantization: QuantizationType.NONE
I0424 09:21:46.200337 133324340254528 pyconfig.py:471] Config param quantization_local_shard_count: 4
I0424 09:21:46.200352 133324340254528 pyconfig.py:471] Config param quantize_kvcache: False
I0424 09:21:46.200368 133324340254528 pyconfig.py:471] Config param query_proj: RematLocation.REMAT
I0424 09:21:46.200385 133324340254528 pyconfig.py:471] Config param query_wa_proj: RematLocation.REMAT
I0424 09:21:46.200401 133324340254528 pyconfig.py:471] Config param ragged_block_size: 256
I0424 09:21:46.200417 133324340254528 pyconfig.py:471] Config param ragged_buffer_factor: -1.0
I0424 09:21:46.200433 133324340254528 pyconfig.py:471] Config param rampup_end_step: 0
I0424 09:21:46.200449 133324340254528 pyconfig.py:471] Config param rampup_samples_per_increment_to_load: None
I0424 09:21:46.200465 133324340254528 pyconfig.py:471] Config param reasoning_end_token: </reasoning>
I0424 09:21:46.200481 133324340254528 pyconfig.py:471] Config param reasoning_start_token: <reasoning>
I0424 09:21:46.200497 133324340254528 pyconfig.py:471] Config param record_internal_nn_metrics: 0
I0424 09:21:46.200513 133324340254528 pyconfig.py:471] Config param remat_policy: full
I0424 09:21:46.200529 133324340254528 pyconfig.py:471] Config param remat_policy_for_vit: minimal
I0424 09:21:46.200543 133324340254528 pyconfig.py:471] Config param remove_size_one_mesh_axis_from_type: True
I0424 09:21:46.200559 133324340254528 pyconfig.py:471] Config param replicate_quant_scale: False
I0424 09:21:46.200575 133324340254528 pyconfig.py:471] Config param replicator_backup_interval_minutes: 0
I0424 09:21:46.200591 133324340254528 pyconfig.py:471] Config param report_heartbeat_metric_for_gcp_monitoring: False
I0424 09:21:46.200606 133324340254528 pyconfig.py:471] Config param report_performance_metric_for_gcp_monitoring: False
I0424 09:21:46.200622 133324340254528 pyconfig.py:471] Config param reshape_q: False
I0424 09:21:46.200637 133324340254528 pyconfig.py:471] Config param return_log_prob: False
I0424 09:21:46.200659 133324340254528 pyconfig.py:471] Config param reuse_example_batch: 0
I0424 09:21:46.200676 133324340254528 pyconfig.py:471] Config param reward_exact_answer: 5.0
I0424 09:21:46.200692 133324340254528 pyconfig.py:471] Config param reward_exact_format_match: 3.0
I0424 09:21:46.200708 133324340254528 pyconfig.py:471] Config param reward_partial_format_match: 0.5
I0424 09:21:46.200724 133324340254528 pyconfig.py:471] Config param reward_ratio_guess_to_answer_high: 0.5
I0424 09:21:46.200741 133324340254528 pyconfig.py:471] Config param reward_ratio_guess_to_answer_low: 0.25
I0424 09:21:46.200757 133324340254528 pyconfig.py:471] Config param reward_white_space_format_match: 1.5
I0424 09:21:46.200773 133324340254528 pyconfig.py:471] Config param rl: {'num_generations': 2, 'num_iterations': 1, 'grpo_beta': 0.08, 'grpo_epsilon': 0.2, 'loss_algo': 'grpo', 'use_agentic_rollout': False, 'max_concurrency': 256, 'off_policy_steps': 0, 'system_prompt': '', 'degenerate_group_masking': True, 'epsilon_high': None}
I0424 09:21:46.200794 133324340254528 pyconfig.py:471] Config param rollout_data_parallelism: -1
I0424 09:21:46.200809 133324340254528 pyconfig.py:471] Config param rollout_expert_parallelism: 1
I0424 09:21:46.200825 133324340254528 pyconfig.py:471] Config param rollout_micro_batch_size: -1
I0424 09:21:46.200841 133324340254528 pyconfig.py:471] Config param rollout_tensor_parallelism: -1
I0424 09:21:46.200857 133324340254528 pyconfig.py:471] Config param rope_attention_scaling: False
I0424 09:21:46.200874 133324340254528 pyconfig.py:471] Config param rope_factor: 40
I0424 09:21:46.200889 133324340254528 pyconfig.py:471] Config param rope_interleave: True
I0424 09:21:46.200904 133324340254528 pyconfig.py:471] Config param rope_linear_scaling_factor: 1.0
I0424 09:21:46.200920 133324340254528 pyconfig.py:471] Config param rope_max_timescale: 10000
I0424 09:21:46.200938 133324340254528 pyconfig.py:471] Config param rope_min_timescale: 1
I0424 09:21:46.200955 133324340254528 pyconfig.py:471] Config param rope_theta_for_vit: 10000
I0424 09:21:46.200971 133324340254528 pyconfig.py:471] Config param rope_truncate: True
I0424 09:21:46.200987 133324340254528 pyconfig.py:471] Config param rope_type: RopeType.DEFAULT
I0424 09:21:46.201005 133324340254528 pyconfig.py:471] Config param rope_use_scale: True
I0424 09:21:46.201021 133324340254528 pyconfig.py:471] Config param routed_bias: False
I0424 09:21:46.201037 133324340254528 pyconfig.py:471] Config param routed_bias_update_rate: 0.0
I0424 09:21:46.201052 133324340254528 pyconfig.py:471] Config param routed_scaling_factor: 1.0
I0424 09:21:46.201068 133324340254528 pyconfig.py:471] Config param routed_score_func: 
I0424 09:21:46.201084 133324340254528 pyconfig.py:471] Config param run_name: gpt3-52k_2026-04-24-09-21
I0424 09:21:46.201099 133324340254528 pyconfig.py:471] Config param sa_block_kv: 512
I0424 09:21:46.201115 133324340254528 pyconfig.py:471] Config param sa_block_kv_compute: 512
I0424 09:21:46.201131 133324340254528 pyconfig.py:471] Config param sa_block_kv_dkv: 512
I0424 09:21:46.201147 133324340254528 pyconfig.py:471] Config param sa_block_kv_dkv_compute: 512
I0424 09:21:46.201163 133324340254528 pyconfig.py:471] Config param sa_block_kv_dq: 512
I0424 09:21:46.201178 133324340254528 pyconfig.py:471] Config param sa_block_q: 512
I0424 09:21:46.201195 133324340254528 pyconfig.py:471] Config param sa_block_q_dkv: 512
I0424 09:21:46.201211 133324340254528 pyconfig.py:471] Config param sa_block_q_dq: 512
I0424 09:21:46.201228 133324340254528 pyconfig.py:471] Config param sa_k_layout: HEAD_DIM_MINOR
I0424 09:21:46.201244 133324340254528 pyconfig.py:471] Config param sa_q_layout: HEAD_DIM_MINOR
I0424 09:21:46.201260 133324340254528 pyconfig.py:471] Config param sa_use_fused_bwd_kernel: False
I0424 09:21:46.201276 133324340254528 pyconfig.py:471] Config param sa_v_layout: HEAD_DIM_MINOR
I0424 09:21:46.201291 133324340254528 pyconfig.py:471] Config param sampler_devices_fraction: 0.5
I0424 09:21:46.201308 133324340254528 pyconfig.py:471] Config param save_checkpoint_on_completion: True
I0424 09:21:46.201323 133324340254528 pyconfig.py:471] Config param save_config_to_gcs: False
I0424 09:21:46.201339 133324340254528 pyconfig.py:471] Config param save_quantized_params_path: 
I0424 09:21:46.201355 133324340254528 pyconfig.py:471] Config param scale_embedding_for_audio: True
I0424 09:21:46.201371 133324340254528 pyconfig.py:471] Config param scan_layers: True
I0424 09:21:46.201386 133324340254528 pyconfig.py:471] Config param scan_layers_per_stage: False
I0424 09:21:46.201401 133324340254528 pyconfig.py:471] Config param scan_pipeline_iterations: True
I0424 09:21:46.201417 133324340254528 pyconfig.py:471] Config param scan_pipeline_repeats: False
I0424 09:21:46.201433 133324340254528 pyconfig.py:471] Config param set_remat_policy_on_layers_per_stage: False
I0424 09:21:46.201449 133324340254528 pyconfig.py:471] Config param set_remat_policy_on_pipeline_iterations: True
I0424 09:21:46.201465 133324340254528 pyconfig.py:471] Config param sft_train_on_completion_only: False
I0424 09:21:46.201481 133324340254528 pyconfig.py:471] Config param shard_exp_on_fsdp: False
I0424 09:21:46.201496 133324340254528 pyconfig.py:471] Config param shard_mode: ShardMode.AUTO
I0424 09:21:46.201513 133324340254528 pyconfig.py:471] Config param shard_optimizer_over_data: False
I0424 09:21:46.201529 133324340254528 pyconfig.py:471] Config param sharding_strategy: None
I0424 09:21:46.201545 133324340254528 pyconfig.py:471] Config param sharding_tolerance: 0.02
I0424 09:21:46.201561 133324340254528 pyconfig.py:471] Config param shardy: True
I0424 09:21:46.201576 133324340254528 pyconfig.py:471] Config param share_kv_projections: False
I0424 09:21:46.201592 133324340254528 pyconfig.py:471] Config param shared_experts: 0
I0424 09:21:46.201608 133324340254528 pyconfig.py:471] Config param sinkhorn_iterations: 20
I0424 09:21:46.201624 133324340254528 pyconfig.py:471] Config param skip_first_n_steps_for_profiler: 1
I0424 09:21:46.201639 133324340254528 pyconfig.py:471] Config param skip_jax_distributed_system: False
I0424 09:21:46.201666 133324340254528 pyconfig.py:471] Config param skip_step_interval: 128
I0424 09:21:46.201683 133324340254528 pyconfig.py:471] Config param skip_step_on_spikes: False
I0424 09:21:46.201698 133324340254528 pyconfig.py:471] Config param skip_step_scaling_factor: 6.0
I0424 09:21:46.201714 133324340254528 pyconfig.py:471] Config param sliding_window_size: 0
I0424 09:21:46.201730 133324340254528 pyconfig.py:471] Config param solution_end_token: </answer>
I0424 09:21:46.201745 133324340254528 pyconfig.py:471] Config param solution_start_token: <answer>
I0424 09:21:46.201760 133324340254528 pyconfig.py:471] Config param source_checkpoint_layout: orbax
I0424 09:21:46.201776 133324340254528 pyconfig.py:471] Config param sparse_matmul: True
I0424 09:21:46.201792 133324340254528 pyconfig.py:471] Config param spatial_merge_size_for_vit: 2
I0424 09:21:46.201807 133324340254528 pyconfig.py:471] Config param stack_prefill_result_cache: False
I0424 09:21:46.201823 133324340254528 pyconfig.py:471] Config param stack_trace_interval_seconds: 600
I0424 09:21:46.201839 133324340254528 pyconfig.py:471] Config param stack_trace_to_cloud: False
I0424 09:21:46.201853 133324340254528 pyconfig.py:471] Config param step_deviation_interval_seconds: 30
I0424 09:21:46.201869 133324340254528 pyconfig.py:471] Config param steps: 200000
I0424 09:21:46.201884 133324340254528 pyconfig.py:471] Config param stop_strings: None
I0424 09:21:46.201900 133324340254528 pyconfig.py:471] Config param student_overrides: {'model_name': 'llama3.1-8b'}
I0424 09:21:46.201916 133324340254528 pyconfig.py:471] Config param student_params_to_update: None
I0424 09:21:46.201931 133324340254528 pyconfig.py:471] Config param subslice_shape: 
I0424 09:21:46.201951 133324340254528 pyconfig.py:471] Config param swap_space_vllm_gb: 2
I0424 09:21:46.201967 133324340254528 pyconfig.py:471] Config param system_prompt: 
I0424 09:21:46.201983 133324340254528 pyconfig.py:471] Config param target_eval_loss: 0.0
I0424 09:21:46.201999 133324340254528 pyconfig.py:471] Config param teacher_overrides: {'model_name': 'llama3.1-8b'}
I0424 09:21:46.202014 133324340254528 pyconfig.py:471] Config param temperature_tuning: False
I0424 09:21:46.202030 133324340254528 pyconfig.py:471] Config param temporal_patch_size_for_vit: 2
I0424 09:21:46.202046 133324340254528 pyconfig.py:471] Config param tensorboard_dir: /deps/maxtext_output/gpt3-52k_2026-04-24-09-21/tensorboard/
I0424 09:21:46.202062 133324340254528 pyconfig.py:471] Config param tensors_on_device: None
I0424 09:21:46.202077 133324340254528 pyconfig.py:471] Config param tensors_to_offload: None
I0424 09:21:46.202093 133324340254528 pyconfig.py:471] Config param test_batch_start_index: 0
I0424 09:21:46.202109 133324340254528 pyconfig.py:471] Config param tile_size_for_vit: 336
I0424 09:21:46.202123 133324340254528 pyconfig.py:471] Config param tokenize_eval_data: True
I0424 09:21:46.202139 133324340254528 pyconfig.py:471] Config param tokenize_train_data: True
I0424 09:21:46.202154 133324340254528 pyconfig.py:471] Config param tokenizer_path: meta-llama/Llama-3.1-8B
I0424 09:21:46.202170 133324340254528 pyconfig.py:471] Config param tokenizer_type: TokenizerType.HUGGINGFACE
I0424 09:21:46.202188 133324340254528 pyconfig.py:471] Config param topk_routing_group: -1
I0424 09:21:46.202202 133324340254528 pyconfig.py:471] Config param train_data_columns: ['text']
I0424 09:21:46.202218 133324340254528 pyconfig.py:471] Config param train_fraction: 1.0
I0424 09:21:46.202234 133324340254528 pyconfig.py:471] Config param train_image_column: image
I0424 09:21:46.202250 133324340254528 pyconfig.py:471] Config param train_micro_batch_size: -1
I0424 09:21:46.202266 133324340254528 pyconfig.py:471] Config param train_split: train
I0424 09:21:46.202281 133324340254528 pyconfig.py:471] Config param trainable_parameters_mask: []
I0424 09:21:46.202297 133324340254528 pyconfig.py:471] Config param trainable_position_size: 2048
I0424 09:21:46.202313 133324340254528 pyconfig.py:471] Config param trainer_devices_fraction: 0.5
I0424 09:21:46.202330 133324340254528 pyconfig.py:471] Config param upload_all_profiler_results: False
I0424 09:21:46.202346 133324340254528 pyconfig.py:471] Config param use_2d_fsdp_sharding: False
I0424 09:21:46.202361 133324340254528 pyconfig.py:471] Config param use_agentic_rollout: False
I0424 09:21:46.202377 133324340254528 pyconfig.py:471] Config param use_audio: False
I0424 09:21:46.202392 133324340254528 pyconfig.py:471] Config param use_audio_in_video: False
I0424 09:21:46.202407 133324340254528 pyconfig.py:471] Config param use_batch_split_schedule: False
I0424 09:21:46.202423 133324340254528 pyconfig.py:471] Config param use_chat_template: False
I0424 09:21:46.202440 133324340254528 pyconfig.py:471] Config param use_chunked_prefill: False
I0424 09:21:46.202455 133324340254528 pyconfig.py:471] Config param use_custom_sort_vjp: True
I0424 09:21:46.202471 133324340254528 pyconfig.py:471] Config param use_dpo: False
I0424 09:21:46.202486 133324340254528 pyconfig.py:471] Config param use_gather_mosaic_kernel: False
I0424 09:21:46.202502 133324340254528 pyconfig.py:471] Config param use_grpo: True
I0424 09:21:46.202518 133324340254528 pyconfig.py:471] Config param use_indexer: False
I0424 09:21:46.202534 133324340254528 pyconfig.py:471] Config param use_iota_embed: True
I0424 09:21:46.202550 133324340254528 pyconfig.py:471] Config param use_jax_splash: False
I0424 09:21:46.202565 133324340254528 pyconfig.py:471] Config param use_max_logit_estimate: -1
I0424 09:21:46.202580 133324340254528 pyconfig.py:471] Config param use_mrope: False
I0424 09:21:46.202596 133324340254528 pyconfig.py:471] Config param use_multimodal: False
I0424 09:21:46.202612 133324340254528 pyconfig.py:471] Config param use_pathways: True
I0424 09:21:46.202628 133324340254528 pyconfig.py:471] Config param use_post_attn_norm: False
I0424 09:21:46.202644 133324340254528 pyconfig.py:471] Config param use_post_ffw_norm: False
I0424 09:21:46.202670 133324340254528 pyconfig.py:471] Config param use_qk_clip: False
I0424 09:21:46.202684 133324340254528 pyconfig.py:471] Config param use_qk_norm: False
I0424 09:21:46.202700 133324340254528 pyconfig.py:471] Config param use_qk_norm_in_gdn: True
I0424 09:21:46.202716 133324340254528 pyconfig.py:471] Config param use_qwix_quantization: False
I0424 09:21:46.202732 133324340254528 pyconfig.py:471] Config param use_ragged_attention: False
I0424 09:21:46.202747 133324340254528 pyconfig.py:471] Config param use_random_routing: False
I0424 09:21:46.202764 133324340254528 pyconfig.py:471] Config param use_replicator_service: False
I0424 09:21:46.202780 133324340254528 pyconfig.py:471] Config param use_ring_of_experts: False
I0424 09:21:46.202795 133324340254528 pyconfig.py:471] Config param use_sft: False
I0424 09:21:46.202810 133324340254528 pyconfig.py:471] Config param use_splash_scheduler: False
I0424 09:21:46.202826 133324340254528 pyconfig.py:471] Config param use_tokamax_gmm: False
I0424 09:21:46.202841 133324340254528 pyconfig.py:471] Config param use_tokamax_splash: False
I0424 09:21:46.202857 133324340254528 pyconfig.py:471] Config param use_truncation: True
I0424 09:21:46.202873 133324340254528 pyconfig.py:471] Config param use_tunix_gradient_accumulation: False
I0424 09:21:46.202889 133324340254528 pyconfig.py:471] Config param use_untrainable_positional_embedding: False
I0424 09:21:46.202905 133324340254528 pyconfig.py:471] Config param use_vertex_tensorboard: False
I0424 09:21:46.202921 133324340254528 pyconfig.py:471] Config param using_pipeline_parallelism: False
I0424 09:21:46.202940 133324340254528 pyconfig.py:471] Config param v_head_dim: 128
I0424 09:21:46.202956 133324340254528 pyconfig.py:471] Config param v_norm_with_scale: True
I0424 09:21:46.202972 133324340254528 pyconfig.py:471] Config param value_proj: RematLocation.REMAT
I0424 09:21:46.202988 133324340254528 pyconfig.py:471] Config param vertex_tensorboard_project: 
I0424 09:21:46.203004 133324340254528 pyconfig.py:471] Config param vertex_tensorboard_region: 
I0424 09:21:46.203020 133324340254528 pyconfig.py:471] Config param video_path: 
I0424 09:21:46.203036 133324340254528 pyconfig.py:471] Config param video_placeholder: <|video|>
I0424 09:21:46.203052 133324340254528 pyconfig.py:471] Config param vision_output_dim_for_vit: 4096
I0424 09:21:46.203068 133324340254528 pyconfig.py:471] Config param vision_output_length: -1
I0424 09:21:46.203083 133324340254528 pyconfig.py:471] Config param vllm_additional_config: {}
I0424 09:21:46.203100 133324340254528 pyconfig.py:471] Config param vllm_hf_config_path: 
I0424 09:21:46.203115 133324340254528 pyconfig.py:471] Config param vllm_hf_overrides: {}
I0424 09:21:46.203132 133324340254528 pyconfig.py:471] Config param vocab_size: 32000
I0424 09:21:46.203147 133324340254528 pyconfig.py:471] Config param warmup_steps_fraction: 0.1
I0424 09:21:46.203163 133324340254528 pyconfig.py:471] Config param weight_dtype: float32
I0424 09:21:46.203188 133324340254528 pyconfig.py:471] Config param weight_quantization_calibration_method: absmax
I0424 09:21:46.203204 133324340254528 pyconfig.py:471] Config param wi_tile_dlhs_batch_seq: 512
I0424 09:21:46.203220 133324340254528 pyconfig.py:471] Config param wi_tile_dlhs_embed_dim: 1024
I0424 09:21:46.203236 133324340254528 pyconfig.py:471] Config param wi_tile_dlhs_mlp_dim: 1024
I0424 09:21:46.203251 133324340254528 pyconfig.py:471] Config param wi_tile_drhs_batch_seq: 512
I0424 09:21:46.203267 133324340254528 pyconfig.py:471] Config param wi_tile_drhs_embed_dim: 1024
I0424 09:21:46.203283 133324340254528 pyconfig.py:471] Config param wi_tile_drhs_mlp_dim: 1024
I0424 09:21:46.203298 133324340254528 pyconfig.py:471] Config param wi_tile_fwd_batch_seq: 512
I0424 09:21:46.203314 133324340254528 pyconfig.py:471] Config param wi_tile_fwd_embed_dim: 1024
I0424 09:21:46.203330 133324340254528 pyconfig.py:471] Config param wi_tile_fwd_mlp_dim: 1024
I0424 09:21:46.203344 133324340254528 pyconfig.py:471] Config param wo_tile_dlhs_batch_seq: 512
I0424 09:21:46.203358 133324340254528 pyconfig.py:471] Config param wo_tile_dlhs_embed_dim: 1024
I0424 09:21:46.203373 133324340254528 pyconfig.py:471] Config param wo_tile_dlhs_mlp_dim: 1024
I0424 09:21:46.203389 133324340254528 pyconfig.py:471] Config param wo_tile_drhs_batch_seq: 512
I0424 09:21:46.203403 133324340254528 pyconfig.py:471] Config param wo_tile_drhs_embed_dim: 1024
I0424 09:21:46.203419 133324340254528 pyconfig.py:471] Config param wo_tile_drhs_mlp_dim: 1024
I0424 09:21:46.203434 133324340254528 pyconfig.py:471] Config param wo_tile_fwd_batch_seq: 512
I0424 09:21:46.203450 133324340254528 pyconfig.py:471] Config param wo_tile_fwd_embed_dim: 1024
I0424 09:21:46.203464 133324340254528 pyconfig.py:471] Config param wo_tile_fwd_mlp_dim: 1024
I0424 09:21:46.203481 133324340254528 pyconfig.py:471] Config param wsd_decay_steps_fraction: 0.1
I0424 09:21:46.203497 133324340254528 pyconfig.py:471] Config param wsd_decay_style: WsdDecayStyle.LINEAR
I0424 09:21:46.203513 133324340254528 pyconfig.py:471] Config param xprof_e2e_enable_fw_power_level_event: False
I0424 09:21:46.203528 133324340254528 pyconfig.py:471] Config param xprof_e2e_enable_fw_thermal_event: False
I0424 09:21:46.203544 133324340254528 pyconfig.py:471] Config param xprof_e2e_enable_fw_throttle_event: False
I0424 09:21:46.203559 133324340254528 pyconfig.py:471] Config param xprof_tpu_power_trace_level: 0
I0424 09:21:46.203577 133324340254528 pyconfig.py:471] Config param z_loss_multiplier: 0.0
I0424 09:21:46.203894 133324340254528 tokenizer.py:245] Tokenizer path: meta-llama/Llama-2-7b-chat-hf
I0424 09:21:46.203932 133324340254528 tokenizer.py:224] Loading HF tokenizer: meta-llama/Llama-2-7b-chat-hf
I0424 09:21:49.837718 133324340254528 _schedule.py:129] A polynomial schedule was set with a non-positive `transition_steps` value; this results in a constant schedule with value `init_value`.
I0424 09:21:49.840821 133324340254528 maxtext_utils.py:1732] Num_devices: 32, shape (1, 4, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0424 09:21:49.840946 133324340254528 train_distill.py:596] Applying logical axis rules for model initialization and training...
I0424 09:21:49.841023 133324340254528 train_distill.py:600] Loading Student from ...
I0424 09:21:49.841052 133324340254528 train_distill.py:169] --- Student Configuration ---
I0424 09:21:49.841074 133324340254528 train_distill.py:170]   Model Name:      gpt3-52k
I0424 09:21:49.841095 133324340254528 train_distill.py:171]   Dimensions:      1 Layers, 16 Emb Dim, 8 Head Dim
I0424 09:21:49.841114 133324340254528 train_distill.py:174]   Attention Heads: 2 Query, 2 KV
I0424 09:21:49.841132 133324340254528 train_distill.py:175]   Vocab Size:      32000
I0424 09:21:49.841150 133324340254528 train_distill.py:176]   Checkpoint:      
I0424 09:21:49.841167 133324340254528 train_distill.py:465] Initializing model: gpt3-52k...
I0424 09:21:51.221610 133324340254528 train_distill.py:614] Loading Teacher from gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items...
I0424 09:21:51.221729 133324340254528 train_distill.py:169] --- Teacher Configuration ---
I0424 09:21:51.221759 133324340254528 train_distill.py:170]   Model Name:      gpt3-52k
I0424 09:21:51.221784 133324340254528 train_distill.py:171]   Dimensions:      1 Layers, 16 Emb Dim, 8 Head Dim
I0424 09:21:51.221807 133324340254528 train_distill.py:174]   Attention Heads: 2 Query, 2 KV
I0424 09:21:51.221826 133324340254528 train_distill.py:175]   Vocab Size:      32000
I0424 09:21:51.221845 133324340254528 train_distill.py:176]   Checkpoint:      gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items
I0424 09:21:51.221863 133324340254528 train_distill.py:465] Initializing model: gpt3-52k...
I0424 09:21:52.269027 133324340254528 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0424 09:21:52.269475 133324340254528 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7941466f2210>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0424 09:21:52.269535 133324340254528 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28
W0424 09:21:52.847217 133324340254528 checkpoint.py:202] Metadata file does not exist: gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items/_CHECKPOINT_METADATA
I0424 09:21:53.422332    2146 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0424 09:21:54.588738 133324340254528 checkpointer.py:304] Restoring checkpoint from gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items.
W0424 09:21:56.724369 133324340254528 transform_utils.py:230] The transformations API will eventually be replaced by an upgraded design. The current API will not be removed until this point, but it will no longer be actively worked on.
I0424 09:21:56.724758 133324340254528 transform_utils.py:288] The following keys are not loaded from the original tree after applying specified transforms: params/params/decoder/to_nnx__rngs/aqt/count, params/params/decoder/to_nnx__rngs/aqt/key, params/params/decoder/to_nnx__rngs/dropout/count, params/params/decoder/to_nnx__rngs/dropout/key, params/params/decoder/to_nnx__rngs/params/count, params/params/decoder/to_nnx__rngs/params/key
I0424 09:21:57.057922 133324340254528 checkpointer.py:318] Finished restoring checkpoint in 2.87 seconds from gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items.
I0424 09:21:57.733512 133324340254528 train_distill.py:640] Initializing Data Iterators via MaxText pipeline...
I0424 09:21:57.798068 133324340254528 config.py:112] TensorFlow version 2.20.0 available.
I0424 09:21:57.798557 133324340254528 config.py:125] JAX version 0.8.3 available.
E0424 09:21:59.842100 133324340254528 packing.py:209] PackAndBatchOperation is deprecated. Please use lazy_dataset.FirstFitPackIterDataset instead.
I0424 09:21:59.842322 133324340254528 data_loader.py:408] Adding CopyNumPyArrayToSharedMemory MapTransform.
I0424 09:21:59.845324 133324340254528 train_distill.py:410] Input Pipeline Checkpointing: DISABLED
I0424 09:21:59.845389 133324340254528 train_distill.py:414] Reason: Iterator 'MultiHostDataLoadIterator' is not recognized as Grain (dataset_type='DatasetType.HF', has_save=False)
I0424 09:21:59.845453 133324340254528 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0424 09:21:59.845530 133324340254528 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=False, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7941466f2210>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0424 09:21:59.845571 133324340254528 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0424 09:21:59.845602 133324340254528 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=False, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7941466f2210>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0424 09:21:59.845644 133324340254528 checkpoint_manager.py:702] [process=5][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=None, item_handlers={'model_params': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x792144063380>, 'optimizer_state': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7921440632f0>, 'custom_metadata': <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792144063260>}, handler_registry=None
I0424 09:21:59.845853 133324340254528 composite_checkpoint_handler.py:237] Deferred registration for item: "model_params". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x792144063380>` for item "model_params" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0424 09:21:59.845893 133324340254528 composite_checkpoint_handler.py:237] Deferred registration for item: "optimizer_state". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7921440632f0>` for item "optimizer_state" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0424 09:21:59.845920 133324340254528 composite_checkpoint_handler.py:237] Deferred registration for item: "custom_metadata". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792144063260>` for item "custom_metadata" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0424 09:21:59.845945 133324340254528 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792a74f65250>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0424 09:21:59.845971 133324340254528 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('model_params', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x792144063380>, ('model_params', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x792144063380>, ('optimizer_state', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7921440632f0>, ('optimizer_state', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7921440632f0>, ('custom_metadata', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792144063260>, ('custom_metadata', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792144063260>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792a74f65250>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792a74f65250>}).
I0424 09:21:59.846386 133324340254528 async_checkpointer.py:177] [process=5][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x79212c081800> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0424 09:22:02.340959 133324340254528 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://lance-maxtext/pt_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091321/pt_distill_linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091321_07_distill_smoke/checkpoints
I0424 09:22:02.381993 133324340254528 checkpoint_manager.py:921] [process=5][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=2000, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=None, preservation_policy=None, prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://lance-maxtext/pt_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091321/pt_distill_linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091321_07_distill_smoke/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x792144063230>
I0424 09:22:02.382138 133324340254528 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0424 09:22:02.382214 133324340254528 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=False, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7941466f2210>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0424 09:22:02.382251 133324340254528 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0424 09:22:02.382283 133324340254528 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=False, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7941466f2210>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0424 09:22:02.382319 133324340254528 checkpoint_manager.py:1983] [process=5][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0424 09:22:02.382371 133324340254528 checkpoint.py:459] Closing _NonBlockingMetadataStore(enable_write=True, _write_lock=<locked _thread.RLock object owner=133324340254528 count=1 at 0x792a74a5a0c0>, _store_impl=<orbax.checkpoint._src.metadata.checkpoint._MetadataStoreImpl object at 0x792144063020>, _single_thread_executor=<concurrent.futures.thread.ThreadPoolExecutor object at 0x792144062ff0>, _write_futures=[])
I0424 09:22:02.382763 133324340254528 checkpoint.py:459] Closing _NonBlockingMetadataStore(enable_write=True, _write_lock=<locked _thread.RLock object owner=133324340254528 count=1 at 0x792a74a5a0c0>, _store_impl=<orbax.checkpoint._src.metadata.checkpoint._MetadataStoreImpl object at 0x792144063020>, _single_thread_executor=<concurrent.futures.thread.ThreadPoolExecutor object at 0x792144062ff0>, _write_futures=[])
I0424 09:22:02.382791 133324340254528 checkpoint.py:459] Closing _NonBlockingMetadataStore(enable_write=True, _write_lock=<locked _thread.RLock object owner=133324340254528 count=1 at 0x792a74a5a0c0>, _store_impl=<orbax.checkpoint._src.metadata.checkpoint._MetadataStoreImpl object at 0x792144063020>, _single_thread_executor=<concurrent.futures.thread.ThreadPoolExecutor object at 0x792144062ff0>, _write_futures=[])
I0424 09:22:02.382821 133324340254528 checkpoint_manager.py:702] [process=5][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=None, item_handlers={'model_params': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x792144063200>, 'optimizer_state': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7921440626c0>, 'custom_metadata': <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792124015b80>, 'iter': <maxtext.common.checkpointing.GrainCheckpointHandler object at 0x792124016540>}, handler_registry=None
I0424 09:22:02.382926 133324340254528 composite_checkpoint_handler.py:237] Deferred registration for item: "model_params". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x792144063200>` for item "model_params" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0424 09:22:02.382959 133324340254528 composite_checkpoint_handler.py:237] Deferred registration for item: "optimizer_state". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7921440626c0>` for item "optimizer_state" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0424 09:22:02.382983 133324340254528 composite_checkpoint_handler.py:237] Deferred registration for item: "custom_metadata". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792124015b80>` for item "custom_metadata" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0424 09:22:02.383011 133324340254528 composite_checkpoint_handler.py:237] Deferred registration for item: "iter". Adding handler `<maxtext.common.checkpointing.GrainCheckpointHandler object at 0x792124016540>` for item "iter" and save args `<class 'maxtext.common.checkpointing.GrainCheckpointSave'>` and restore args `<class 'maxtext.common.checkpointing.GrainCheckpointRestore'>` to `_handler_registry`.
I0424 09:22:02.383035 133324340254528 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792144062b10>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0424 09:22:02.383061 133324340254528 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('model_params', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x792144063200>, ('model_params', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x792144063200>, ('optimizer_state', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7921440626c0>, ('optimizer_state', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7921440626c0>, ('custom_metadata', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792124015b80>, ('custom_metadata', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792124015b80>, ('iter', <class 'maxtext.common.checkpointing.GrainCheckpointSave'>): <maxtext.common.checkpointing.GrainCheckpointHandler object at 0x792124016540>, ('iter', <class 'maxtext.common.checkpointing.GrainCheckpointRestore'>): <maxtext.common.checkpointing.GrainCheckpointHandler object at 0x792124016540>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792144062b10>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x792144062b10>}).
I0424 09:22:02.383131 133324340254528 async_checkpointer.py:177] [process=5][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x79212c081940> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0424 09:22:02.788608 133324340254528 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://lance-maxtext/pt_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091321/pt_distill_linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091321_07_distill_smoke/checkpoints
I0424 09:22:02.808530 133324340254528 checkpoint_manager.py:921] [process=5][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=2000, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=None, preservation_policy=None, prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://lance-maxtext/pt_ckpt_xpk_feat_nnx_trainstate_and_training_loop_20260424_091321/pt_distill_linen_xpk_feat_nnx_trainstate_and_training_loop_20260424_091321_07_distill_smoke/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7926f406fd10>
I0424 09:22:02.808983 133324340254528 train_distill.py:691] Starting Distillation Training...
I0424 09:22:02.809084 133324340254528 peft_trainer.py:590] Training with mesh: Mesh('diloco': 1, 'data': 4, 'stage': 1, 'fsdp': 8, 'fsdp_transpose': 1, 'sequence': 1, 'context': 1, 'context_autoregressive': 1, 'tensor': 1, 'tensor_transpose': 1, 'tensor_sequence': 1, 'expert': 1, 'autoregressive': 1, axis_types=(Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto))
I0424 09:22:03.902349 133324340254528 peft_trainer.py:600] Compiled train_step cache size: 0

Training:   0%|          | 0/5 [00:00<?, ?step/s]
I0424 09:22:03.904270 133179573856000 grain_pool.py:367] Grain pool will use 1 processes.
I0424 09:22:03.930721 133179573856000 grain_pool.py:440] Grain pool will start child processes.
I0424 09:22:03.936040 133179573856000 grain_pool.py:448] Grain pool started all child processes.
2026-04-24 09:22:09.948317: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
Unrecognized keys in `rope_scaling` for 'rope_type'='yarn': {'rope_theta'}
`rope_scaling`'s factor field must be a float >= 1, got 40
`rope_scaling`'s beta_fast field must be a float, got 32
`rope_scaling`'s beta_slow field must be a float, got 1
Unrecognized keys in `rope_scaling` for 'rope_type'='yarn': {'rope_theta'}
Unrecognized keys in `rope_scaling` for 'rope_type'='yarn': {'rope_theta'}
Unrecognized keys in `rope_scaling` for 'rope_type'='yarn': {'rope_theta'}
I0424 09:22:12.667641 133324340254528 utils.py:86] Train loop finished in: 8.7646 seconds
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/deps/src/maxtext/trainers/post_train/distillation/train_distill.py", line 765, in <module>
    app.run(main)
  File "/usr/local/lib/python3.12/site-packages/absl/app.py", line 316, in run
    _run_main(main, args)
  File "/usr/local/lib/python3.12/site-packages/absl/app.py", line 261, in _run_main
    sys.exit(main(argv))
             ^^^^^^^^^^
  File "/deps/src/maxtext/trainers/post_train/distillation/train_distill.py", line 761, in main
    train_distill(student_config, teacher_config, is_offline, global_config.offline_data_dir)
  File "/deps/src/maxtext/trainers/post_train/distillation/train_distill.py", line 693, in train_distill
    trainer.train(train_iter, eval_iter)
  File "/usr/local/lib/python3.12/site-packages/tunix/sft/peft_trainer.py", line 659, in train
    train_example = sharding_utils.shard_input(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tunix/sft/sharding_utils.py", line 58, in shard_input
    return jax.tree.map(
           ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/tree.py", line 155, in map
    return tree_util.tree_map(f, tree, *rest, is_leaf=is_leaf)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/tree_util.py", line 362, in tree_map
    return treedef.unflatten(f(*xs) for xs in zip(*all_leaves))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/tree_util.py", line 362, in <genexpr>
    return treedef.unflatten(f(*xs) for xs in zip(*all_leaves))
                             ^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tunix/sft/sharding_utils.py", line 59, in <lambda>
    lambda x: jax.make_array_from_process_local_data(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/array.py", line 986, in make_array_from_process_local_data
    out = [_array_from_process_local_data(data, s, shape)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/array.py", line 1048, in _array_from_process_local_data
    return make_array_from_callback(global_shape, sharding, cb)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/array.py", line 845, in make_array_from_callback
    per_device_values = api.device_put(per_device_values, devices)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/api.py", line 2729, in device_put
    out_flat = dispatch._batched_device_put_impl(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/dispatch.py", line 558, in _batched_device_put_impl
    y = _device_put_impl(x, device=device, src=src, copy=cp, aval=aval)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/dispatch.py", line 545, in _device_put_impl
    return _device_put_sharding_impl(x, aval, device, copy)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/dispatch.py", line 487, in _device_put_sharding_impl
    raise ValueError(
ValueError: device_put's first argument must be a fully addressable array, but got value with devices {TpuDevice(id=14, process_index=3, coords=(2,3,0), core_on_chip=0), TpuDevice(id=26, process_index=7, coords=(2,6,0), core_on_chip=0), TpuDevice(id=24, process_index=6, coords=(0,6,0), core_on_chip=0), TpuDevice(id=19, process_index=5, coords=(3,4,0), core_on_chip=0), TpuDevice(id=15, process_index=3, coords=(3,3,0), core_on_chip=0), TpuDevice(id=1, process_index=0, coords=(1,0,0), core_on_chip=0), TpuDevice(id=11, process_index=3, coords=(3,2,0), core_on_chip=0), TpuDevice(id=18, process_index=5, coords=(2,4,0), core_on_chip=0), TpuDevice(id=13, process_index=2, coords=(1,3,0), core_on_chip=0), TpuDevice(id=23, process_index=5, coords=(3,5,0), core_on_chip=0), TpuDevice(id=27, process_index=7, coords=(3,6,0), core_on_chip=0), TpuDevice(id=16, process_index=4, coords=(0,4,0), core_on_chip=0), TpuDevice(id=30, process_index=7, coords=(2,7,0), core_on_chip=0), TpuDevice(id=10, process_index=3, coords=(2,2,0), core_on_chip=0), TpuDevice(id=25, process_index=6, coords=(1,6,0), core_on_chip=0), TpuDevice(id=7, process_index=1, coords=(3,1,0), core_on_chip=0), TpuDevice(id=17, process_index=4, coords=(1,4,0), core_on_chip=0), TpuDevice(id=4, process_index=0, coords=(0,1,0), core_on_chip=0), TpuDevice(id=31, process_index=7, coords=(3,7,0), core_on_chip=0), TpuDevice(id=28, process_index=6, coords=(0,7,0), core_on_chip=0), TpuDevice(id=6, process_index=1, coords=(2,1,0), core_on_chip=0), TpuDevice(id=8, process_index=2, coords=(0,2,0), core_on_chip=0), TpuDevice(id=3, process_index=1, coords=(3,0,0), core_on_chip=0), TpuDevice(id=0, process_index=0, coords=(0,0,0), core_on_chip=0), TpuDevice(id=29, process_index=6, coords=(1,7,0), core_on_chip=0), TpuDevice(id=5, process_index=0, coords=(1,1,0), core_on_chip=0), TpuDevice(id=21, process_index=4, coords=(1,5,0), core_on_chip=0), TpuDevice(id=9, process_index=2, coords=(1,2,0), core_on_chip=0), TpuDevice(id=2, process_index=1, coords=(2,0,0), core_on_chip=0), TpuDevice(id=20, process_index=4, coords=(0,5,0), core_on_chip=0), TpuDevice(id=22, process_index=5, coords=(2,5,0), core_on_chip=0), TpuDevice(id=12, process_index=2, coords=(0,3,0), core_on_chip=0)}
I0424 09:22:13.014961 133179573856000 grain_pool.py:542] Grain pool is exiting.
I0424 09:22:13.015069 133179573856000 grain_pool.py:547] Shutting down multiprocessing system.
I0424 09:22:14.451967 133179573856000 grain_pool.py:547] Shutting down multiprocessing system.

Training:   0%|          | 0/5 [00:12<?, ?step/s]
/usr/local/lib/python3.12/multiprocessing/resource_tracker.py:279: UserWarning: resource_tracker: There appear to be 15 leaked shared_memory objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '
XPK End: Fri Apr 24 09:22:22 UTC 2026
EXIT_CODE=1