Log Summary

XPK Start: Thu Apr 23 21:27:03 UTC 2026
2026-04-23 21:27:19.965847: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
Unrecognized keys in `rope_scaling` for 'rope_type'='yarn': {'rope_theta'}
`rope_scaling`'s factor field must be a float >= 1, got 40
`rope_scaling`'s beta_fast field must be a float, got 32
`rope_scaling`'s beta_slow field must be a float, got 1
I0423 21:27:23.946985 139795023222592 max_utils.py:273] Attempting to initialize the jax distributed system...
INFO:2026-04-23 21:27:32,986:jax._src.distributed:149: Starting JAX distributed service on [::]:8482
I0423 21:27:32.986361 139795023222592 distributed.py:149] Starting JAX distributed service on [::]:8482
INFO:2026-04-23 21:27:32,988:jax._src.distributed:166: Connecting to JAX distributed service on mt-07-distill-smoke-6ifyi-slice-job-0-0.mt-07-distill-smoke-6ifyi:8482
I0423 21:27:32.988588 139795023222592 distributed.py:166] Connecting to JAX distributed service on mt-07-distill-smoke-6ifyi-slice-job-0-0.mt-07-distill-smoke-6ifyi:8482
I0423 21:27:34.783904 139795023222592 max_utils.py:284] Jax distributed system initialized!
I0423 21:27:40.501069 139795023222592 max_utils.py:244] Jax distributed system is already initialized.
W0423 21:27:40.632510 139795023222592 pyconfig.py:111] base_output_directory is not provided; Using local directory called maxtext_output
I0423 21:27:40.693164 139795023222592 max_utils.py:244] Jax distributed system is already initialized.
I0423 21:27:40.694343 139795023222592 pyconfig.py:471] Config param abort_on_inf_loss: True
I0423 21:27:40.694389 139795023222592 pyconfig.py:471] Config param abort_on_nan_loss: True
I0423 21:27:40.694415 139795023222592 pyconfig.py:471] Config param act_quantization_calibration_method: absmax
I0423 21:27:40.694436 139795023222592 pyconfig.py:471] Config param activation_dropout_for_audio: 0.0
I0423 21:27:40.694456 139795023222592 pyconfig.py:471] Config param activation_function_for_audio: gelu
I0423 21:27:40.694474 139795023222592 pyconfig.py:471] Config param activations_in_float32: False
I0423 21:27:40.694492 139795023222592 pyconfig.py:471] Config param adam_b1: 0.9
I0423 21:27:40.694509 139795023222592 pyconfig.py:471] Config param adam_b2: 0.95
I0423 21:27:40.694527 139795023222592 pyconfig.py:471] Config param adam_eps: 1e-08
I0423 21:27:40.694549 139795023222592 pyconfig.py:471] Config param adam_eps_root: 0.0
I0423 21:27:40.694566 139795023222592 pyconfig.py:471] Config param adam_weight_decay: 0.1
I0423 21:27:40.694583 139795023222592 pyconfig.py:471] Config param adamw_mask: []
I0423 21:27:40.694600 139795023222592 pyconfig.py:471] Config param add_bos: True
I0423 21:27:40.694618 139795023222592 pyconfig.py:471] Config param add_eos: True
I0423 21:27:40.694633 139795023222592 pyconfig.py:471] Config param allow_split_physical_axes: False
I0423 21:27:40.694650 139795023222592 pyconfig.py:471] Config param ar_cache_axis_order: 1,2,0,3
I0423 21:27:40.694667 139795023222592 pyconfig.py:471] Config param async_checkpointing: True
I0423 21:27:40.694683 139795023222592 pyconfig.py:471] Config param async_scheduling: False
I0423 21:27:40.694699 139795023222592 pyconfig.py:471] Config param attention: dot_product
I0423 21:27:40.694714 139795023222592 pyconfig.py:471] Config param attention_bias: False
I0423 21:27:40.694731 139795023222592 pyconfig.py:471] Config param attention_dropout_for_audio: 0.0
I0423 21:27:40.694748 139795023222592 pyconfig.py:471] Config param attention_out: RematLocation.REMAT
I0423 21:27:40.694768 139795023222592 pyconfig.py:471] Config param attention_output_dim: -1
I0423 21:27:40.694785 139795023222592 pyconfig.py:471] Config param attention_sink: False
I0423 21:27:40.694800 139795023222592 pyconfig.py:471] Config param attention_type: global
I0423 21:27:40.694815 139795023222592 pyconfig.py:471] Config param attn_logits_soft_cap: None
I0423 21:27:40.694831 139795023222592 pyconfig.py:471] Config param audio_path: 
I0423 21:27:40.694847 139795023222592 pyconfig.py:471] Config param audio_placeholder: <|audio|>
I0423 21:27:40.694862 139795023222592 pyconfig.py:471] Config param autoregressive_decode_assert: 
I0423 21:27:40.694877 139795023222592 pyconfig.py:471] Config param base_config: base.yml
I0423 21:27:40.694894 139795023222592 pyconfig.py:471] Config param base_emb_dim: 16
I0423 21:27:40.694909 139795023222592 pyconfig.py:471] Config param base_mlp_dim: 64
I0423 21:27:40.694925 139795023222592 pyconfig.py:471] Config param base_moe_mlp_dim: -1
I0423 21:27:40.694941 139795023222592 pyconfig.py:471] Config param base_num_decoder_layers: 1
I0423 21:27:40.694955 139795023222592 pyconfig.py:471] Config param base_num_kv_heads: 2
I0423 21:27:40.694971 139795023222592 pyconfig.py:471] Config param base_num_query_heads: 2
I0423 21:27:40.694986 139795023222592 pyconfig.py:471] Config param base_output_directory: /deps/maxtext_output
I0423 21:27:40.695002 139795023222592 pyconfig.py:471] Config param batch_size: 1
I0423 21:27:40.695017 139795023222592 pyconfig.py:471] Config param batch_split_factor: 1
I0423 21:27:40.695033 139795023222592 pyconfig.py:471] Config param beta_fast: 32
I0423 21:27:40.695048 139795023222592 pyconfig.py:471] Config param beta_slow: 1
I0423 21:27:40.695064 139795023222592 pyconfig.py:471] Config param bwd_quantization_calibration_method: absmax
I0423 21:27:40.695081 139795023222592 pyconfig.py:471] Config param capacity_factor: -1.0
I0423 21:27:40.695106 139795023222592 pyconfig.py:471] Config param cast_logits_to_fp32: True
I0423 21:27:40.695122 139795023222592 pyconfig.py:471] Config param chat_template: 
I0423 21:27:40.695139 139795023222592 pyconfig.py:471] Config param chat_template_path: 
I0423 21:27:40.695157 139795023222592 pyconfig.py:471] Config param checkpoint_conversion_fn: None
I0423 21:27:40.695173 139795023222592 pyconfig.py:471] Config param checkpoint_dir: /deps/maxtext_output/gpt3-52k_2026-04-23-21-27/checkpoints/
I0423 21:27:40.695190 139795023222592 pyconfig.py:471] Config param checkpoint_is_quantized: False
I0423 21:27:40.695206 139795023222592 pyconfig.py:471] Config param checkpoint_period: 2000
I0423 21:27:40.695220 139795023222592 pyconfig.py:471] Config param checkpoint_storage_concurrent_gb: 96
I0423 21:27:40.695237 139795023222592 pyconfig.py:471] Config param checkpoint_storage_target_data_file_size_bytes: 2147483648
I0423 21:27:40.695253 139795023222592 pyconfig.py:471] Config param checkpoint_storage_use_ocdbt: True
I0423 21:27:40.695267 139795023222592 pyconfig.py:471] Config param checkpoint_storage_use_zarr3: True
I0423 21:27:40.695286 139795023222592 pyconfig.py:471] Config param checkpoint_todelete_full_path: None
I0423 21:27:40.695301 139795023222592 pyconfig.py:471] Config param checkpoint_todelete_subdir: None
I0423 21:27:40.695317 139795023222592 pyconfig.py:471] Config param chips_per_vm: 4
I0423 21:27:40.695332 139795023222592 pyconfig.py:471] Config param chunk_attn_window_size: 0
I0423 21:27:40.695349 139795023222592 pyconfig.py:471] Config param collect_stack_trace: False
I0423 21:27:40.695364 139795023222592 pyconfig.py:471] Config param colocated_python_checkpointing: False
I0423 21:27:40.695380 139795023222592 pyconfig.py:471] Config param colocated_python_data_input: False
I0423 21:27:40.695394 139795023222592 pyconfig.py:471] Config param compile_topology: 
I0423 21:27:40.695410 139795023222592 pyconfig.py:471] Config param compile_topology_num_slices: -1
I0423 21:27:40.695425 139795023222592 pyconfig.py:471] Config param compile_xla_flags: 
I0423 21:27:40.695441 139795023222592 pyconfig.py:471] Config param compiled_trainstep_file: 
I0423 21:27:40.695456 139795023222592 pyconfig.py:471] Config param compute_axis_order: 0,1,2,3
I0423 21:27:40.695470 139795023222592 pyconfig.py:471] Config param constant_bound_config: []
I0423 21:27:40.695486 139795023222592 pyconfig.py:471] Config param context: RematLocation.REMAT
I0423 21:27:40.695502 139795023222592 pyconfig.py:471] Config param context_parallel_load_balance: True
I0423 21:27:40.695517 139795023222592 pyconfig.py:471] Config param context_parallel_reorder_strategy: ReorderStrategy.AUTO
I0423 21:27:40.695534 139795023222592 pyconfig.py:471] Config param context_parallel_size: 1
I0423 21:27:40.695549 139795023222592 pyconfig.py:471] Config param context_parallel_strategy: all_gather
I0423 21:27:40.695564 139795023222592 pyconfig.py:471] Config param context_sharding: context
I0423 21:27:40.695580 139795023222592 pyconfig.py:471] Config param conv_chunksize_for_audio: 500
I0423 21:27:40.695595 139795023222592 pyconfig.py:471] Config param conv_stride_for_vit: 14
I0423 21:27:40.695610 139795023222592 pyconfig.py:471] Config param convert_checkpoint_if_possible: False
I0423 21:27:40.695625 139795023222592 pyconfig.py:471] Config param cost_estimate_flops_bwd: -1
I0423 21:27:40.695640 139795023222592 pyconfig.py:471] Config param cost_estimate_flops_fwd: -1
I0423 21:27:40.695654 139795023222592 pyconfig.py:471] Config param custom_mesh: 
I0423 21:27:40.695670 139795023222592 pyconfig.py:471] Config param custom_mesh_and_rule: 
I0423 21:27:40.695684 139795023222592 pyconfig.py:471] Config param d_model_for_audio: 256
I0423 21:27:40.695699 139795023222592 pyconfig.py:471] Config param data_sharding: (('data', 'stage', 'fsdp', 'fsdp_transpose', 'sequence', 'context', 'context_autoregressive', 'tensor', 'tensor_transpose', 'tensor_sequence', 'expert', 'autoregressive'),)
I0423 21:27:40.695719 139795023222592 pyconfig.py:471] Config param data_shuffle_seed: 0
I0423 21:27:40.695734 139795023222592 pyconfig.py:471] Config param dataset_name: c4/en:3.0.1
I0423 21:27:40.695750 139795023222592 pyconfig.py:471] Config param dataset_path: 
I0423 21:27:40.695765 139795023222592 pyconfig.py:471] Config param dataset_type: DatasetType.HF
I0423 21:27:40.695782 139795023222592 pyconfig.py:471] Config param dcn_autoregressive_parallelism: 1
I0423 21:27:40.695796 139795023222592 pyconfig.py:471] Config param dcn_context_autoregressive_parallelism: 1
I0423 21:27:40.695812 139795023222592 pyconfig.py:471] Config param dcn_context_parallelism: 1
I0423 21:27:40.695827 139795023222592 pyconfig.py:471] Config param dcn_data_parallelism: -1
I0423 21:27:40.695842 139795023222592 pyconfig.py:471] Config param dcn_diloco_parallelism: 1
I0423 21:27:40.695857 139795023222592 pyconfig.py:471] Config param dcn_expert_parallelism: 1
I0423 21:27:40.695891 139795023222592 pyconfig.py:471] Config param dcn_fsdp_parallelism: 1
I0423 21:27:40.695906 139795023222592 pyconfig.py:471] Config param dcn_fsdp_transpose_parallelism: 1
I0423 21:27:40.695922 139795023222592 pyconfig.py:471] Config param dcn_parallelism: [1, -1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
I0423 21:27:40.695938 139795023222592 pyconfig.py:471] Config param dcn_pipeline_parallelism: 1
I0423 21:27:40.695954 139795023222592 pyconfig.py:471] Config param dcn_sequence_parallelism: 1
I0423 21:27:40.695970 139795023222592 pyconfig.py:471] Config param dcn_tensor_parallelism: 1
I0423 21:27:40.695985 139795023222592 pyconfig.py:471] Config param dcn_tensor_sequence_parallelism: 1
I0423 21:27:40.696000 139795023222592 pyconfig.py:471] Config param dcn_tensor_transpose_parallelism: 1
I0423 21:27:40.696015 139795023222592 pyconfig.py:471] Config param debug: {'rl': False}
I0423 21:27:40.696031 139795023222592 pyconfig.py:471] Config param debug_sharding: False
I0423 21:27:40.696045 139795023222592 pyconfig.py:471] Config param decode_sampling_nucleus_p: -1
I0423 21:27:40.696061 139795023222592 pyconfig.py:471] Config param decode_sampling_strategy: SamplingStrategy.GREEDY
I0423 21:27:40.696079 139795023222592 pyconfig.py:471] Config param decode_sampling_temperature: 1.0
I0423 21:27:40.696102 139795023222592 pyconfig.py:471] Config param decode_sampling_top_k: 0
I0423 21:27:40.696117 139795023222592 pyconfig.py:471] Config param decoder_block: DecoderBlockType.GPT3
I0423 21:27:40.696135 139795023222592 pyconfig.py:471] Config param decoder_layer_input: RematLocation.DEVICE
I0423 21:27:40.696151 139795023222592 pyconfig.py:471] Config param deepstack_visual_indexes_for_vit: []
I0423 21:27:40.696167 139795023222592 pyconfig.py:471] Config param degenerate_group_masking: True
I0423 21:27:40.696181 139795023222592 pyconfig.py:471] Config param dense_init_scale: 1.0
I0423 21:27:40.696197 139795023222592 pyconfig.py:471] Config param diloco_outer_lr: 0.3
I0423 21:27:40.696212 139795023222592 pyconfig.py:471] Config param diloco_outer_momentum: 0.9
I0423 21:27:40.696228 139795023222592 pyconfig.py:471] Config param diloco_sync_period: 36
I0423 21:27:40.696244 139795023222592 pyconfig.py:471] Config param distill_alpha: 0.5
I0423 21:27:40.696260 139795023222592 pyconfig.py:471] Config param distill_alpha_end: None
I0423 21:27:40.696274 139795023222592 pyconfig.py:471] Config param distill_alpha_schedule: constant
I0423 21:27:40.696293 139795023222592 pyconfig.py:471] Config param distill_beta: 0.0
I0423 21:27:40.696309 139795023222592 pyconfig.py:471] Config param distill_beta_end: None
I0423 21:27:40.696324 139795023222592 pyconfig.py:471] Config param distill_beta_schedule: constant
I0423 21:27:40.696339 139795023222592 pyconfig.py:471] Config param distill_feature_loss_type: cosine
I0423 21:27:40.696354 139795023222592 pyconfig.py:471] Config param distill_layer_indices: None
I0423 21:27:40.696370 139795023222592 pyconfig.py:471] Config param distill_temperature: 1.0
I0423 21:27:40.696384 139795023222592 pyconfig.py:471] Config param distill_temperature_end: None
I0423 21:27:40.696400 139795023222592 pyconfig.py:471] Config param distill_temperature_schedule: constant
I0423 21:27:40.696415 139795023222592 pyconfig.py:471] Config param downsample_hidden_size_for_audio: 256
I0423 21:27:40.696429 139795023222592 pyconfig.py:471] Config param dpo_beta: 0.1
I0423 21:27:40.696445 139795023222592 pyconfig.py:471] Config param dpo_label_smoothing: 0.0
I0423 21:27:40.696461 139795023222592 pyconfig.py:471] Config param dq_reduction_steps: 0
I0423 21:27:40.696477 139795023222592 pyconfig.py:471] Config param dropout_rate: 0.0
I0423 21:27:40.696493 139795023222592 pyconfig.py:471] Config param dtype: bfloat16
I0423 21:27:40.696522 139795023222592 pyconfig.py:471] Config param dtype_mm: float32
I0423 21:27:40.696537 139795023222592 pyconfig.py:471] Config param dump_hlo: False
I0423 21:27:40.696553 139795023222592 pyconfig.py:471] Config param dump_hlo_delete_local_after: True
I0423 21:27:40.696567 139795023222592 pyconfig.py:471] Config param dump_hlo_gcs_dir: /deps/maxtext_output/gpt3-52k_2026-04-23-21-27/xla_dump
I0423 21:27:40.696583 139795023222592 pyconfig.py:471] Config param dump_hlo_local_dir: /tmp/xla_dump/
I0423 21:27:40.696597 139795023222592 pyconfig.py:471] Config param dump_hlo_local_module_name: jit_train_step
I0423 21:27:40.696613 139795023222592 pyconfig.py:471] Config param dump_hlo_module_name: jit_train_step
I0423 21:27:40.696629 139795023222592 pyconfig.py:471] Config param dump_hlo_upload_all: False
I0423 21:27:40.696645 139795023222592 pyconfig.py:471] Config param dump_hlo_xla_flags: 
I0423 21:27:40.696660 139795023222592 pyconfig.py:471] Config param dump_jaxpr: False
I0423 21:27:40.696675 139795023222592 pyconfig.py:471] Config param dump_jaxpr_delete_local_after: True
I0423 21:27:40.696690 139795023222592 pyconfig.py:471] Config param dump_jaxpr_gcs_dir: /deps/maxtext_output/gpt3-52k_2026-04-23-21-27/jaxpr_dump
I0423 21:27:40.696705 139795023222592 pyconfig.py:471] Config param dump_jaxpr_local_dir: /tmp/jaxpr_dump/
I0423 21:27:40.696720 139795023222592 pyconfig.py:471] Config param dump_step: -1
I0423 21:27:40.696735 139795023222592 pyconfig.py:471] Config param elastic_enabled: False
I0423 21:27:40.696751 139795023222592 pyconfig.py:471] Config param elastic_max_retries: 10
I0423 21:27:40.696765 139795023222592 pyconfig.py:471] Config param elastic_timeout_seconds: 300
I0423 21:27:40.696780 139795023222592 pyconfig.py:471] Config param emb_dim: 16
I0423 21:27:40.696796 139795023222592 pyconfig.py:471] Config param enable_autocheckpoint: False
I0423 21:27:40.696810 139795023222592 pyconfig.py:471] Config param enable_checkpoint_cloud_logger: False
I0423 21:27:40.696825 139795023222592 pyconfig.py:471] Config param enable_checkpointing: True
I0423 21:27:40.696840 139795023222592 pyconfig.py:471] Config param enable_continuous_checkpointing: False
I0423 21:27:40.696855 139795023222592 pyconfig.py:471] Config param enable_data_shuffling: True
I0423 21:27:40.696871 139795023222592 pyconfig.py:471] Config param enable_diloco: False
I0423 21:27:40.696885 139795023222592 pyconfig.py:471] Config param enable_dp_attention: False
I0423 21:27:40.696900 139795023222592 pyconfig.py:471] Config param enable_dropout: False
I0423 21:27:40.696914 139795023222592 pyconfig.py:471] Config param enable_emergency_checkpoint: False
I0423 21:27:40.696929 139795023222592 pyconfig.py:471] Config param enable_expert_parallel: False
I0423 21:27:40.696945 139795023222592 pyconfig.py:471] Config param enable_gcp_goodput_metrics: True
I0423 21:27:40.696960 139795023222592 pyconfig.py:471] Config param enable_gcp_step_deviation_metrics: True
I0423 21:27:40.696974 139795023222592 pyconfig.py:471] Config param enable_goodput_recording: False
I0423 21:27:40.696990 139795023222592 pyconfig.py:471] Config param enable_jax_profiler: False
I0423 21:27:40.697005 139795023222592 pyconfig.py:471] Config param enable_llm_inference_pool: False
I0423 21:27:40.697021 139795023222592 pyconfig.py:471] Config param enable_model_warmup: False
I0423 21:27:40.697038 139795023222592 pyconfig.py:471] Config param enable_multi_tier_checkpointing: False
I0423 21:27:40.697053 139795023222592 pyconfig.py:471] Config param enable_nnx: False
I0423 21:27:40.697070 139795023222592 pyconfig.py:471] Config param enable_orbax_v1: False
I0423 21:27:40.697086 139795023222592 pyconfig.py:471] Config param enable_padding_causal_mask: True
I0423 21:27:40.697110 139795023222592 pyconfig.py:471] Config param enable_pathways_goodput: False
I0423 21:27:40.697126 139795023222592 pyconfig.py:471] Config param enable_prefix_caching: False
I0423 21:27:40.697142 139795023222592 pyconfig.py:471] Config param enable_rampup_batch_size: False
I0423 21:27:40.697156 139795023222592 pyconfig.py:471] Config param enable_single_controller: False
I0423 21:27:40.697171 139795023222592 pyconfig.py:471] Config param enable_single_replica_ckpt_restoring: False
I0423 21:27:40.697187 139795023222592 pyconfig.py:471] Config param enable_tensorboard: True
I0423 21:27:40.697201 139795023222592 pyconfig.py:471] Config param enable_tunix_perf_metrics: False
I0423 21:27:40.697216 139795023222592 pyconfig.py:471] Config param encoder_attention_heads_for_audio: 4
I0423 21:27:40.697231 139795023222592 pyconfig.py:471] Config param encoder_ffn_dim_for_audio: 512
I0423 21:27:40.697247 139795023222592 pyconfig.py:471] Config param encoder_layers_for_audio: 2
I0423 21:27:40.697262 139795023222592 pyconfig.py:471] Config param engram: RematLocation.REMAT
I0423 21:27:40.697278 139795023222592 pyconfig.py:471] Config param engram_head_dim: 1280
I0423 21:27:40.697298 139795023222592 pyconfig.py:471] Config param engram_kernel_size: 4
I0423 21:27:40.697312 139795023222592 pyconfig.py:471] Config param engram_layers: []
I0423 21:27:40.697327 139795023222592 pyconfig.py:471] Config param engram_max_ngram_size: 3
I0423 21:27:40.697343 139795023222592 pyconfig.py:471] Config param engram_num_heads: 8
I0423 21:27:40.697357 139795023222592 pyconfig.py:471] Config param engram_seed: 0
I0423 21:27:40.697372 139795023222592 pyconfig.py:471] Config param engram_vocab_bases: []
I0423 21:27:40.697388 139795023222592 pyconfig.py:471] Config param epsilon_high: None
I0423 21:27:40.697403 139795023222592 pyconfig.py:471] Config param eval_corr_lst: False
I0423 21:27:40.697419 139795023222592 pyconfig.py:471] Config param eval_data_columns: ['text']
I0423 21:27:40.697433 139795023222592 pyconfig.py:471] Config param eval_dataset_name: c4/en:3.0.1
I0423 21:27:40.697449 139795023222592 pyconfig.py:471] Config param eval_image_column: image
I0423 21:27:40.697464 139795023222592 pyconfig.py:471] Config param eval_interval: -1
I0423 21:27:40.697479 139795023222592 pyconfig.py:471] Config param eval_make_lst: False
I0423 21:27:40.697494 139795023222592 pyconfig.py:471] Config param eval_per_device_batch_size: 2
I0423 21:27:40.697508 139795023222592 pyconfig.py:471] Config param eval_sampling_strategy: greedy
I0423 21:27:40.697523 139795023222592 pyconfig.py:471] Config param eval_split: validation
I0423 21:27:40.697538 139795023222592 pyconfig.py:471] Config param eval_steps: -1
I0423 21:27:40.697553 139795023222592 pyconfig.py:471] Config param expansion_factor_real_data: -1.0
I0423 21:27:40.697569 139795023222592 pyconfig.py:471] Config param final_logits_soft_cap: None
I0423 21:27:40.697583 139795023222592 pyconfig.py:471] Config param first_num_dense_layers: 0
I0423 21:27:40.697598 139795023222592 pyconfig.py:471] Config param float32_gate_logits: False
I0423 21:27:40.697613 139795023222592 pyconfig.py:471] Config param float32_logits: False
I0423 21:27:40.697628 139795023222592 pyconfig.py:471] Config param float32_qk_product: False
I0423 21:27:40.697643 139795023222592 pyconfig.py:471] Config param float32_weight_sum: True
I0423 21:27:40.697658 139795023222592 pyconfig.py:471] Config param force_q_layout: False
I0423 21:27:40.697672 139795023222592 pyconfig.py:471] Config param force_unroll: False
I0423 21:27:40.697688 139795023222592 pyconfig.py:471] Config param freeze_audio_encoder_params: True
I0423 21:27:40.697702 139795023222592 pyconfig.py:471] Config param freeze_vision_encoder_params: True
I0423 21:27:40.697717 139795023222592 pyconfig.py:471] Config param fused_mlp: False
I0423 21:27:40.697733 139795023222592 pyconfig.py:471] Config param fused_qkv: True
I0423 21:27:40.697747 139795023222592 pyconfig.py:471] Config param gcs_metrics: False
I0423 21:27:40.697762 139795023222592 pyconfig.py:471] Config param gdn_chunk_size: 64
I0423 21:27:40.697778 139795023222592 pyconfig.py:471] Config param gdn_conv_kernel_dim: 4
I0423 21:27:40.697792 139795023222592 pyconfig.py:471] Config param gdn_key_head_dim: 128
I0423 21:27:40.697808 139795023222592 pyconfig.py:471] Config param gdn_num_key_heads: 16
I0423 21:27:40.697824 139795023222592 pyconfig.py:471] Config param gdn_num_value_heads: 32
I0423 21:27:40.697840 139795023222592 pyconfig.py:471] Config param gdn_value_head_dim: 128
I0423 21:27:40.697855 139795023222592 pyconfig.py:471] Config param generate_padding_batch_eval: False
I0423 21:27:40.697870 139795023222592 pyconfig.py:471] Config param generate_padding_batch_train: False
I0423 21:27:40.697886 139795023222592 pyconfig.py:471] Config param generate_slice: v5e-16
I0423 21:27:40.697901 139795023222592 pyconfig.py:471] Config param generation_configs: {}
I0423 21:27:40.697916 139795023222592 pyconfig.py:471] Config param global_batch_size_to_eval_on: 64
I0423 21:27:40.697931 139795023222592 pyconfig.py:471] Config param global_batch_size_to_load: 512
I0423 21:27:40.697946 139795023222592 pyconfig.py:471] Config param global_batch_size_to_load_eval: 64
I0423 21:27:40.697960 139795023222592 pyconfig.py:471] Config param global_batch_size_to_load_increment: None
I0423 21:27:40.697976 139795023222592 pyconfig.py:471] Config param global_batch_size_to_load_start: None
I0423 21:27:40.697991 139795023222592 pyconfig.py:471] Config param global_batch_size_to_train_on: 512
I0423 21:27:40.698006 139795023222592 pyconfig.py:471] Config param global_head_dim: 0
I0423 21:27:40.698021 139795023222592 pyconfig.py:471] Config param global_num_kv_heads: 0
I0423 21:27:40.698036 139795023222592 pyconfig.py:471] Config param global_parameter_scale: 1
I0423 21:27:40.698051 139795023222592 pyconfig.py:471] Config param global_rampup_samples: 500
I0423 21:27:40.698066 139795023222592 pyconfig.py:471] Config param global_rope_max_timescale: -1
I0423 21:27:40.698081 139795023222592 pyconfig.py:471] Config param global_rope_proportion: 0.25
I0423 21:27:40.698105 139795023222592 pyconfig.py:471] Config param goodput_upload_interval_seconds: 30
I0423 21:27:40.698121 139795023222592 pyconfig.py:471] Config param grad_dtype: float32
I0423 21:27:40.698154 139795023222592 pyconfig.py:471] Config param gradient_accumulation_steps: 8
I0423 21:27:40.698170 139795023222592 pyconfig.py:471] Config param gradient_clipping_threshold: 1.0
I0423 21:27:40.698186 139795023222592 pyconfig.py:471] Config param grain_data_source_max_workers: 16
I0423 21:27:40.698200 139795023222592 pyconfig.py:471] Config param grain_eval_files: 
I0423 21:27:40.698216 139795023222592 pyconfig.py:471] Config param grain_file_type: arrayrecord
I0423 21:27:40.698231 139795023222592 pyconfig.py:471] Config param grain_num_threads: 16
I0423 21:27:40.698247 139795023222592 pyconfig.py:471] Config param grain_num_threads_eval: 16
I0423 21:27:40.698262 139795023222592 pyconfig.py:471] Config param grain_packing_type: first_fit
I0423 21:27:40.698278 139795023222592 pyconfig.py:471] Config param grain_per_worker_buffer_size: 1
I0423 21:27:40.698298 139795023222592 pyconfig.py:471] Config param grain_per_worker_buffer_size_eval: 1
I0423 21:27:40.698312 139795023222592 pyconfig.py:471] Config param grain_prefetch_buffer_size: 500
I0423 21:27:40.698328 139795023222592 pyconfig.py:471] Config param grain_prefetch_buffer_size_eval: 500
I0423 21:27:40.698343 139795023222592 pyconfig.py:471] Config param grain_ram_budget_mb: 1024
I0423 21:27:40.698357 139795023222592 pyconfig.py:471] Config param grain_shuffle_buffer_size: 100
I0423 21:27:40.698372 139795023222592 pyconfig.py:471] Config param grain_train_files: 
I0423 21:27:40.698387 139795023222592 pyconfig.py:471] Config param grain_train_mixture_config_path: 
I0423 21:27:40.698402 139795023222592 pyconfig.py:471] Config param grain_worker_count: 1
I0423 21:27:40.698418 139795023222592 pyconfig.py:471] Config param grain_worker_count_eval: 1
I0423 21:27:40.698432 139795023222592 pyconfig.py:471] Config param grpo_beta: 0.08
I0423 21:27:40.698449 139795023222592 pyconfig.py:471] Config param grpo_epsilon: 0.2
I0423 21:27:40.698463 139795023222592 pyconfig.py:471] Config param hardware: tpu
I0423 21:27:40.698479 139795023222592 pyconfig.py:471] Config param hbm_utilization_vllm: 0.72
I0423 21:27:40.698494 139795023222592 pyconfig.py:471] Config param head_dim: 8
I0423 21:27:40.698510 139795023222592 pyconfig.py:471] Config param heartbeat_reporting_interval_in_seconds: 5
I0423 21:27:40.698525 139795023222592 pyconfig.py:471] Config param hf_data_dir: None
I0423 21:27:40.698541 139795023222592 pyconfig.py:471] Config param hf_eval_files: None
I0423 21:27:40.698556 139795023222592 pyconfig.py:471] Config param hf_eval_split: None
I0423 21:27:40.698571 139795023222592 pyconfig.py:471] Config param hf_name: None
I0423 21:27:40.698586 139795023222592 pyconfig.py:471] Config param hf_path: OptimalScale/ClimbMix
I0423 21:27:40.698602 139795023222592 pyconfig.py:471] Config param hf_train_files: None
I0423 21:27:40.698616 139795023222592 pyconfig.py:471] Config param hidden_size_for_vit: 1408
I0423 21:27:40.698632 139795023222592 pyconfig.py:471] Config param hide_profiler_step_metric: False
I0423 21:27:40.698646 139795023222592 pyconfig.py:471] Config param ici_autoregressive_parallelism: 1
I0423 21:27:40.698661 139795023222592 pyconfig.py:471] Config param ici_context_autoregressive_parallelism: 1
I0423 21:27:40.698676 139795023222592 pyconfig.py:471] Config param ici_context_parallelism: 1
I0423 21:27:40.698692 139795023222592 pyconfig.py:471] Config param ici_data_parallelism: 1
I0423 21:27:40.698708 139795023222592 pyconfig.py:471] Config param ici_diloco_parallelism: 1
I0423 21:27:40.698722 139795023222592 pyconfig.py:471] Config param ici_expert_parallelism: 1
I0423 21:27:40.698738 139795023222592 pyconfig.py:471] Config param ici_fsdp_parallelism: -1
I0423 21:27:40.698754 139795023222592 pyconfig.py:471] Config param ici_fsdp_transpose_parallelism: 1
I0423 21:27:40.698769 139795023222592 pyconfig.py:471] Config param ici_parallelism: [1, 1, 1, -1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
I0423 21:27:40.698785 139795023222592 pyconfig.py:471] Config param ici_pipeline_parallelism: 1
I0423 21:27:40.698799 139795023222592 pyconfig.py:471] Config param ici_sequence_parallelism: 1
I0423 21:27:40.698814 139795023222592 pyconfig.py:471] Config param ici_tensor_parallelism: 1
I0423 21:27:40.698829 139795023222592 pyconfig.py:471] Config param ici_tensor_sequence_parallelism: 1
I0423 21:27:40.698844 139795023222592 pyconfig.py:471] Config param ici_tensor_transpose_parallelism: 1
I0423 21:27:40.698858 139795023222592 pyconfig.py:471] Config param image_path: 
I0423 21:27:40.698873 139795023222592 pyconfig.py:471] Config param image_placeholder: <|image|>
I0423 21:27:40.698888 139795023222592 pyconfig.py:471] Config param image_size_for_vit: 896
I0423 21:27:40.698903 139795023222592 pyconfig.py:471] Config param indexer_head_dim: 128
I0423 21:27:40.698918 139795023222592 pyconfig.py:471] Config param indexer_loss_scaling_factor: 0.0
I0423 21:27:40.698934 139795023222592 pyconfig.py:471] Config param indexer_n_heads: 64
I0423 21:27:40.698948 139795023222592 pyconfig.py:471] Config param indexer_sparse_training: False
I0423 21:27:40.698963 139795023222592 pyconfig.py:471] Config param indexer_topk: 2048
I0423 21:27:40.698977 139795023222592 pyconfig.py:471] Config param inference_benchmark_test: False
I0423 21:27:40.698992 139795023222592 pyconfig.py:471] Config param inference_metadata_file: 
I0423 21:27:40.699008 139795023222592 pyconfig.py:471] Config param inference_microbenchmark_log_file_path: 
I0423 21:27:40.699023 139795023222592 pyconfig.py:471] Config param inference_microbenchmark_loop_iters: 10
I0423 21:27:40.699039 139795023222592 pyconfig.py:471] Config param inference_microbenchmark_num_samples: [1, 2, 3, 4, 5]
I0423 21:27:40.699053 139795023222592 pyconfig.py:471] Config param inference_microbenchmark_prefill_lengths: 64,128,256,512,1024
I0423 21:27:40.699069 139795023222592 pyconfig.py:471] Config param inference_microbenchmark_stages: prefill,generate
I0423 21:27:40.699083 139795023222592 pyconfig.py:471] Config param inference_server: MaxtextInterleavedServer
I0423 21:27:40.699111 139795023222592 pyconfig.py:471] Config param inhomogeneous_layer_cycle_interval: 1
I0423 21:27:40.699128 139795023222592 pyconfig.py:471] Config param init_weights_seed: 0
I0423 21:27:40.699142 139795023222592 pyconfig.py:471] Config param input_data_sharding_logical_axes: ['activation_embed_and_logits_batch', 'activation_norm_length']
I0423 21:27:40.699160 139795023222592 pyconfig.py:471] Config param interleave_moe_layer_step: 1
I0423 21:27:40.699174 139795023222592 pyconfig.py:471] Config param intermediate_size_for_vit: 5632
I0423 21:27:40.699189 139795023222592 pyconfig.py:471] Config param internal_compile: False
I0423 21:27:40.699205 139795023222592 pyconfig.py:471] Config param internal_compile_num_devices: -1
I0423 21:27:40.699221 139795023222592 pyconfig.py:471] Config param jax_cache_dir: ~/jax_cache
I0423 21:27:40.699236 139795023222592 pyconfig.py:471] Config param jax_debug_log_modules: 
I0423 21:27:40.699252 139795023222592 pyconfig.py:471] Config param jax_distributed_initialization_timeout: 300
I0423 21:27:40.699268 139795023222592 pyconfig.py:471] Config param jax_profiler_port: 9999
I0423 21:27:40.699287 139795023222592 pyconfig.py:471] Config param key_proj: RematLocation.REMAT
I0423 21:27:40.699305 139795023222592 pyconfig.py:471] Config param kv_cache_buffer: 256
I0423 21:27:40.699319 139795023222592 pyconfig.py:471] Config param kv_lora_rank: 512
I0423 21:27:40.699335 139795023222592 pyconfig.py:471] Config param kv_quant_axis: KvQuantAxis.HEADS_AND_DKV
I0423 21:27:40.699351 139795023222592 pyconfig.py:471] Config param kv_quant_dtype: int8
I0423 21:27:40.699366 139795023222592 pyconfig.py:471] Config param kv_wa_proj: RematLocation.REMAT
I0423 21:27:40.699382 139795023222592 pyconfig.py:471] Config param learning_rate: 0.0002
I0423 21:27:40.699397 139795023222592 pyconfig.py:471] Config param learning_rate_final_fraction: 0.1
I0423 21:27:40.699413 139795023222592 pyconfig.py:471] Config param learning_rate_schedule_steps: 200000
I0423 21:27:40.699428 139795023222592 pyconfig.py:471] Config param load_balance_loss_weight: 0.0
I0423 21:27:40.699444 139795023222592 pyconfig.py:471] Config param load_checkpoint_only_once: False
I0423 21:27:40.699458 139795023222592 pyconfig.py:471] Config param load_from_prefill_dir: False
I0423 21:27:40.699473 139795023222592 pyconfig.py:471] Config param load_full_state_path: 
I0423 21:27:40.699489 139795023222592 pyconfig.py:471] Config param load_parameters_path: gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items
I0423 21:27:40.699504 139795023222592 pyconfig.py:471] Config param local_checkpoint_directory: 
I0423 21:27:40.699519 139795023222592 pyconfig.py:471] Config param local_checkpoint_period: 0
I0423 21:27:40.699534 139795023222592 pyconfig.py:471] Config param local_rope_max_timescale: -1
I0423 21:27:40.699548 139795023222592 pyconfig.py:471] Config param local_rope_proportion: 1.0
I0423 21:27:40.699564 139795023222592 pyconfig.py:471] Config param log_config: True
I0423 21:27:40.699579 139795023222592 pyconfig.py:471] Config param log_period: 10
I0423 21:27:40.699593 139795023222592 pyconfig.py:471] Config param logical_axis_rules: (('activation_embed_and_logits_batch', ('data', 'stage', 'fsdp', 'fsdp_transpose', 'expert')), ('activation_embed_and_logits_batch_sequence', ('data', 'stage', 'fsdp', 'fsdp_transpose', 'sequence', 'context', 'expert')), ('activation_vocab', ('tensor', 'tensor_transpose', 'tensor_sequence')), ('activation_vocab', ('tensor', 'tensor_transpose')), ('activation_vocab', 'tensor_sequence'), ('activation_vocab', ('sequence', 'context')), ('vocab', ('tensor', 'tensor_transpose', 'tensor_sequence', 'autoregressive')), ('embed_vocab', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'expert')), ('activation_heads', ('tensor', 'tensor_transpose', 'sequence', 'tensor_sequence', 'autoregressive')), ('activation_kv_heads', ('tensor', 'tensor_transpose', 'sequence', 'tensor_sequence')), ('activation_attn_length', ('sequence', 'context')), ('activation_attn_length', ('context',)), ('activation_q_length', ('context',)), ('activation_kv_length', ()), ('activation_attn_embed', ('tensor', 'tensor_transpose')), ('activation_kv', ('tensor', 'tensor_transpose', 'tensor_sequence')), ('activation_kv_batch', ('data', 'fsdp', 'fsdp_transpose', 'expert')), ('activation_kv_head_dim', ('tensor', 'tensor_transpose', 'tensor_sequence')), ('heads', ('tensor', 'tensor_transpose', 'tensor_sequence', 'autoregressive')), ('q_heads', ('tensor', 'tensor_transpose', 'tensor_sequence', 'autoregressive')), ('kv_heads', ('tensor', 'tensor_transpose', 'tensor_sequence', 'autoregressive')), ('qkv', ()), ('kv', ()), ('kv_head_dim', ()), ('q_lora', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'tensor_transpose', 'expert')), ('q_lora', ('fsdp', 'sequence', 'context', 'tensor_transpose', 'expert')), ('q_lora', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'expert')), ('q_lora', ('fsdp', 'sequence', 'context', 'expert')), ('q_lora_up_proj', ()), ('kv_lora', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'tensor_transpose', 'expert')), ('kv_lora', ('fsdp', 'sequence', 'context', 'tensor_transpose', 'expert')), ('kv_lora', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'expert')), ('kv_lora', ('fsdp', 'sequence', 'context', 'expert')), ('kv_lora_up_proj', ()), ('activation_batch_moe', ('data', 'fsdp', 'fsdp_transpose')), ('activation_length_moe', ('sequence', 'context')), ('activation_length_moe', ('context',)), ('activation_norm_length_moe', ('tensor_sequence', 'context', 'sequence')), ('activation_embed_moe', ('tensor', 'tensor_transpose')), ('activation_mlp_moe', ('tensor', 'tensor_transpose', 'tensor_sequence')), ('activation_exp', ('expert',)), ('exp', 'expert'), ('mlp_moe', ('fsdp_transpose', 'tensor', 'tensor_sequence', 'autoregressive')), ('embed_moe', ('fsdp', 'fsdp_transpose', 'sequence', 'tensor_transpose', 'context')), ('embed_moe', ('fsdp', 'sequence', 'tensor_transpose', 'context')), ('embed_moe', ('fsdp', 'fsdp_transpose', 'sequence', 'context')), ('embed_moe', ('fsdp', 'sequence', 'context')), ('activation_mlp', ('tensor', 'tensor_transpose', 'tensor_sequence')), ('activation_batch', ('data', 'fsdp', 'fsdp_transpose', 'expert')), ('activation_length', ('sequence', 'context')), ('activation_length', ('context',)), ('activation_norm_length', ('tensor_sequence', 'context', 'sequence')), ('activation_embed', ('tensor', 'tensor_transpose')), ('activation_stage', 'stage'), ('mlp', ('fsdp_transpose', 'tensor', 'tensor_sequence', 'autoregressive')), ('embed', ('fsdp', 'fsdp_transpose', 'sequence', 'tensor_transpose', 'context', 'expert')), ('embed', ('fsdp', 'sequence', 'tensor_transpose', 'context', 'expert')), ('embed', ('fsdp', 'fsdp_transpose', 'sequence', 'context', 'expert')), ('embed', ('fsdp', 'sequence', 'context', 'expert')), ('norm', ('tensor', 'tensor_transpose')), ('layers', 'stage'), ('diloco', 'diloco'), ('engram_dim', ('tensor',)), ('dense_layers', ()), ('moe_layers', ()), ('mhc', ()), ('prefill_activation_length', ('sequence', 'context')), ('prefill_activation_norm_length', ('tensor_sequence', 'context', 'sequence')), ('activation_prefill_kv_batch', ('data', 'fsdp', 'fsdp_transpose', 'expert')), ('decode_batch', ('data', 'fsdp', 'fsdp_transpose', 'expert')), ('decode_length', ('sequence',)), ('cache_heads', ('autoregressive', 'tensor', 'tensor_transpose', 'tensor_sequence')), ('cache_heads', ('autoregressive', 'tensor', 'tensor_sequence')), ('paged_kv_heads', ('tensor',)), ('cache_batch_prefill', ()), ('cache_batch', ()), ('cache_heads_none', ()), ('cache_kv', ()), ('cache_sequence', ()), ('num_pages', ()), ('tokens_per_page', ()), ('paged_kv_head_dim_size', ()), ('mlp_no_fsdp', ('tensor', 'tensor_sequence', 'autoregressive')), ('embed_tensor_transpose', ('tensor_transpose',)), ('exp_with_fsdp', 'fsdp'))
I0423 21:27:40.699668 139795023222592 pyconfig.py:471] Config param logits_dot_in_fp32: False
I0423 21:27:40.699686 139795023222592 pyconfig.py:471] Config param logits_via_embedding: True
I0423 21:27:40.699702 139795023222592 pyconfig.py:471] Config param lora_input_adapters_path: 
I0423 21:27:40.699717 139795023222592 pyconfig.py:471] Config param loss_algo: grpo
I0423 21:27:40.699733 139795023222592 pyconfig.py:471] Config param lr_schedule_type: LearningRateScheduleType.COSINE
I0423 21:27:40.699750 139795023222592 pyconfig.py:471] Config param managed_mldiagnostics: False
I0423 21:27:40.699765 139795023222592 pyconfig.py:471] Config param managed_mldiagnostics_dir: /deps/maxtext_output/gpt3-52k_2026-04-23-21-27/managed-mldiagnostics
I0423 21:27:40.699779 139795023222592 pyconfig.py:471] Config param managed_mldiagnostics_run_group: 
I0423 21:27:40.699795 139795023222592 pyconfig.py:471] Config param matmul_precision: MatmulPrecision.DEFAULT
I0423 21:27:40.699813 139795023222592 pyconfig.py:471] Config param max_checkify: False
I0423 21:27:40.699830 139795023222592 pyconfig.py:471] Config param max_concurrency: 256
I0423 21:27:40.699844 139795023222592 pyconfig.py:471] Config param max_corpus_chars: 10000000
I0423 21:27:40.699860 139795023222592 pyconfig.py:471] Config param max_num_batched_tokens: None
I0423 21:27:40.699874 139795023222592 pyconfig.py:471] Config param max_num_checkpoints_to_keep: None
I0423 21:27:40.699889 139795023222592 pyconfig.py:471] Config param max_num_images_per_example: -1
I0423 21:27:40.699905 139795023222592 pyconfig.py:471] Config param max_num_seqs: None
I0423 21:27:40.699920 139795023222592 pyconfig.py:471] Config param max_position_embeddings: 163840
I0423 21:27:40.699935 139795023222592 pyconfig.py:471] Config param max_prefill_predict_length: 64
I0423 21:27:40.699950 139795023222592 pyconfig.py:471] Config param max_sample_len_for_audio: 10000
I0423 21:27:40.699965 139795023222592 pyconfig.py:471] Config param max_segments_per_seq: -1
I0423 21:27:40.699980 139795023222592 pyconfig.py:471] Config param max_source_positions_for_audio: 1500
I0423 21:27:40.699995 139795023222592 pyconfig.py:471] Config param max_target_length: 2048
I0423 21:27:40.700010 139795023222592 pyconfig.py:471] Config param max_timescale_for_audio: 10000.0
I0423 21:27:40.700026 139795023222592 pyconfig.py:471] Config param megablox: True
I0423 21:27:40.700041 139795023222592 pyconfig.py:471] Config param merge_gating_gmm: False
I0423 21:27:40.700056 139795023222592 pyconfig.py:471] Config param mesh_axes: ['diloco', 'data', 'stage', 'fsdp', 'fsdp_transpose', 'sequence', 'context', 'context_autoregressive', 'tensor', 'tensor_transpose', 'tensor_sequence', 'expert', 'autoregressive']
I0423 21:27:40.700074 139795023222592 pyconfig.py:471] Config param metrics_dir: /deps/maxtext_output/gpt3-52k_2026-04-23-21-27/metrics/
I0423 21:27:40.700090 139795023222592 pyconfig.py:471] Config param metrics_file: 
I0423 21:27:40.700117 139795023222592 pyconfig.py:471] Config param mhc_expansion_rate: 1
I0423 21:27:40.700132 139795023222592 pyconfig.py:471] Config param micro_batch_size_to_eval_on: 64
I0423 21:27:40.700147 139795023222592 pyconfig.py:471] Config param micro_batch_size_to_train_on: 64
I0423 21:27:40.700162 139795023222592 pyconfig.py:471] Config param mla_kv: RematLocation.REMAT
I0423 21:27:40.700178 139795023222592 pyconfig.py:471] Config param mla_naive_kvcache: True
I0423 21:27:40.700192 139795023222592 pyconfig.py:471] Config param mla_q: RematLocation.REMAT
I0423 21:27:40.700210 139795023222592 pyconfig.py:471] Config param mlp_activations: ['gelu']
I0423 21:27:40.700225 139795023222592 pyconfig.py:471] Config param mlp_activations_limit: -1.0
I0423 21:27:40.700242 139795023222592 pyconfig.py:471] Config param mlp_bias: False
I0423 21:27:40.700257 139795023222592 pyconfig.py:471] Config param mlp_dim: 64
I0423 21:27:40.700272 139795023222592 pyconfig.py:471] Config param mlpwi: RematLocation.REMAT
I0423 21:27:40.700291 139795023222592 pyconfig.py:471] Config param mlpwi_0: RematLocation.REMAT
I0423 21:27:40.700308 139795023222592 pyconfig.py:471] Config param mlpwi_1: RematLocation.REMAT
I0423 21:27:40.700323 139795023222592 pyconfig.py:471] Config param mlpwo: RematLocation.REMAT
I0423 21:27:40.700340 139795023222592 pyconfig.py:471] Config param moba: False
I0423 21:27:40.700355 139795023222592 pyconfig.py:471] Config param moba_chunk_size: 1024
I0423 21:27:40.700371 139795023222592 pyconfig.py:471] Config param moba_topk: 8
I0423 21:27:40.700385 139795023222592 pyconfig.py:471] Config param model_call_mode: 
I0423 21:27:40.700401 139795023222592 pyconfig.py:471] Config param model_name: gpt3-52k
I0423 21:27:40.700417 139795023222592 pyconfig.py:471] Config param moe_expert_input_dim: -1
I0423 21:27:40.700432 139795023222592 pyconfig.py:471] Config param moe_fsdp_use_two_stage_all_gather: False
I0423 21:27:40.700448 139795023222592 pyconfig.py:471] Config param moe_mlp_dim: -1
I0423 21:27:40.700463 139795023222592 pyconfig.py:471] Config param moe_mlpwi_0: RematLocation.REMAT
I0423 21:27:40.700479 139795023222592 pyconfig.py:471] Config param moe_mlpwi_1: RematLocation.REMAT
I0423 21:27:40.700493 139795023222592 pyconfig.py:471] Config param moe_mlpwo: RematLocation.REMAT
I0423 21:27:40.700509 139795023222592 pyconfig.py:471] Config param monitor_goodput: False
I0423 21:27:40.700525 139795023222592 pyconfig.py:471] Config param monitor_step_time_deviation: True
I0423 21:27:40.700538 139795023222592 pyconfig.py:471] Config param mrope_section: [24, 20, 20]
I0423 21:27:40.700555 139795023222592 pyconfig.py:471] Config param mscale: 1.0
I0423 21:27:40.700571 139795023222592 pyconfig.py:471] Config param mtc_data_parallelism: 0
I0423 21:27:40.700586 139795023222592 pyconfig.py:471] Config param mtp_eval_target_module: 0
I0423 21:27:40.700601 139795023222592 pyconfig.py:471] Config param mtp_loss_scaling_factor: 0.1
I0423 21:27:40.700616 139795023222592 pyconfig.py:471] Config param mtp_num_layers: 0
I0423 21:27:40.700631 139795023222592 pyconfig.py:471] Config param mu_dtype: float32
I0423 21:27:40.700655 139795023222592 pyconfig.py:471] Config param multi_sampling: False
I0423 21:27:40.700670 139795023222592 pyconfig.py:471] Config param multi_tier_checkpointing_backup_interval_minutes: 0
I0423 21:27:40.700685 139795023222592 pyconfig.py:471] Config param muon_beta: 0.95
I0423 21:27:40.700702 139795023222592 pyconfig.py:471] Config param muon_consistent_rms: None
I0423 21:27:40.700716 139795023222592 pyconfig.py:471] Config param muon_weight_decay: 0.0
I0423 21:27:40.700732 139795023222592 pyconfig.py:471] Config param n_routing_groups: -1
I0423 21:27:40.700748 139795023222592 pyconfig.py:471] Config param n_window_for_audio: 50
I0423 21:27:40.700762 139795023222592 pyconfig.py:471] Config param n_window_infer_for_audio: 800
I0423 21:27:40.700778 139795023222592 pyconfig.py:471] Config param nope_layer_interval: -1
I0423 21:27:40.700793 139795023222592 pyconfig.py:471] Config param norm_topk_prob: False
I0423 21:27:40.700808 139795023222592 pyconfig.py:471] Config param normalization_layer_epsilon: 1e-05
I0423 21:27:40.700825 139795023222592 pyconfig.py:471] Config param normalize_embedding_logits: False
I0423 21:27:40.700841 139795023222592 pyconfig.py:471] Config param num_attention_heads_for_vit: 16
I0423 21:27:40.700855 139795023222592 pyconfig.py:471] Config param num_batches: 4
I0423 21:27:40.700870 139795023222592 pyconfig.py:471] Config param num_channels_for_vit: 3
I0423 21:27:40.700885 139795023222592 pyconfig.py:471] Config param num_conv_layers_for_audio: 3
I0423 21:27:40.700900 139795023222592 pyconfig.py:471] Config param num_decoder_layers: 1
I0423 21:27:40.700915 139795023222592 pyconfig.py:471] Config param num_diloco_replicas: 1
I0423 21:27:40.700930 139795023222592 pyconfig.py:471] Config param num_epoch: 1
I0423 21:27:40.700944 139795023222592 pyconfig.py:471] Config param num_eval_passes: 1
I0423 21:27:40.700960 139795023222592 pyconfig.py:471] Config param num_experts: 1
I0423 21:27:40.700974 139795023222592 pyconfig.py:471] Config param num_experts_per_tok: 1
I0423 21:27:40.700989 139795023222592 pyconfig.py:471] Config param num_generations: 2
I0423 21:27:40.701004 139795023222592 pyconfig.py:471] Config param num_hidden_layers_for_vit: 34
I0423 21:27:40.701020 139795023222592 pyconfig.py:471] Config param num_iterations: 1
I0423 21:27:40.701034 139795023222592 pyconfig.py:471] Config param num_kv_heads: 2
I0423 21:27:40.701049 139795023222592 pyconfig.py:471] Config param num_layers_per_pipeline_stage: 1
I0423 21:27:40.701064 139795023222592 pyconfig.py:471] Config param num_mel_bins_for_audio: 128
I0423 21:27:40.701080 139795023222592 pyconfig.py:471] Config param num_pipeline_microbatches: -1
I0423 21:27:40.701107 139795023222592 pyconfig.py:471] Config param num_pipeline_repeats: -1
I0423 21:27:40.701122 139795023222592 pyconfig.py:471] Config param num_position_embeddings_for_vit: 1024
I0423 21:27:40.701137 139795023222592 pyconfig.py:471] Config param num_query_heads: 2
I0423 21:27:40.701152 139795023222592 pyconfig.py:471] Config param num_samplers_slices: -1
I0423 21:27:40.701167 139795023222592 pyconfig.py:471] Config param num_slices: 1
I0423 21:27:40.701182 139795023222592 pyconfig.py:471] Config param num_target_devices: 32
I0423 21:27:40.701197 139795023222592 pyconfig.py:471] Config param num_test_batches: 5
I0423 21:27:40.701212 139795023222592 pyconfig.py:471] Config param num_trainer_slices: -1
I0423 21:27:40.701226 139795023222592 pyconfig.py:471] Config param num_vocab_tiling: 1
I0423 21:27:40.701242 139795023222592 pyconfig.py:471] Config param off_policy_steps: 0
I0423 21:27:40.701257 139795023222592 pyconfig.py:471] Config param offline_data_dir: None
I0423 21:27:40.701271 139795023222592 pyconfig.py:471] Config param opt_type: OptimizerType.ADAM_PAX
I0423 21:27:40.701291 139795023222592 pyconfig.py:471] Config param optimize_mesh_for_tpu_v6e: False
I0423 21:27:40.701308 139795023222592 pyconfig.py:471] Config param optimizer_memory_host_offload: False
I0423 21:27:40.701322 139795023222592 pyconfig.py:471] Config param original_max_position_embeddings: 4096
I0423 21:27:40.701338 139795023222592 pyconfig.py:471] Config param out_hidden_size_for_vit: 512
I0423 21:27:40.701353 139795023222592 pyconfig.py:471] Config param out_proj: RematLocation.REMAT
I0423 21:27:40.701368 139795023222592 pyconfig.py:471] Config param output_dim_for_audio: 512
I0423 21:27:40.701384 139795023222592 pyconfig.py:471] Config param override_logical_axis_rules: False
I0423 21:27:40.701399 139795023222592 pyconfig.py:471] Config param override_model_config: True
I0423 21:27:40.701414 139795023222592 pyconfig.py:471] Config param packing: True
I0423 21:27:40.701430 139795023222592 pyconfig.py:471] Config param pagedattn_head_dim_alignment: 128
I0423 21:27:40.701445 139795023222592 pyconfig.py:471] Config param pagedattn_max_pages_per_group: -1
I0423 21:27:40.701461 139795023222592 pyconfig.py:471] Config param pagedattn_num_pages: 64
I0423 21:27:40.701475 139795023222592 pyconfig.py:471] Config param pagedattn_pages_per_compute_block: 4
I0423 21:27:40.701490 139795023222592 pyconfig.py:471] Config param pagedattn_tokens_per_page: 32
I0423 21:27:40.701506 139795023222592 pyconfig.py:471] Config param param_scan_axis: 1
I0423 21:27:40.701520 139795023222592 pyconfig.py:471] Config param parameter_memory_host_offload: False
I0423 21:27:40.701535 139795023222592 pyconfig.py:471] Config param partial_rotary_factor: 1.0
I0423 21:27:40.701550 139795023222592 pyconfig.py:471] Config param patch_size_for_vit: 14
I0423 21:27:40.701565 139795023222592 pyconfig.py:471] Config param penalty_incorrect_answer: -1.0
I0423 21:27:40.701579 139795023222592 pyconfig.py:471] Config param penalty_incorrect_format: -0.5
I0423 21:27:40.701596 139795023222592 pyconfig.py:471] Config param per_device_batch_size: 2
I0423 21:27:40.701610 139795023222592 pyconfig.py:471] Config param per_device_batch_size_increment: 2.0
I0423 21:27:40.701626 139795023222592 pyconfig.py:471] Config param per_device_batch_size_start: 4.0
I0423 21:27:40.701640 139795023222592 pyconfig.py:471] Config param pipeline_delay_activation_forwarding: False
I0423 21:27:40.701656 139795023222592 pyconfig.py:471] Config param pipeline_fsdp_ag_once: False
I0423 21:27:40.701670 139795023222592 pyconfig.py:471] Config param pipeline_fsdp_ag_per_repeat: False
I0423 21:27:40.701685 139795023222592 pyconfig.py:471] Config param pipeline_parallel_layers: 1
I0423 21:27:40.701701 139795023222592 pyconfig.py:471] Config param pixel_shuffle_ratio_for_vit: 0.5
I0423 21:27:40.701715 139795023222592 pyconfig.py:471] Config param posemb_type_for_vit: learn
I0423 21:27:40.701731 139795023222592 pyconfig.py:471] Config param position_id_per_seconds: 25
I0423 21:27:40.701746 139795023222592 pyconfig.py:471] Config param prefill_cache_axis_order: 1,2,0,3
I0423 21:27:40.701761 139795023222592 pyconfig.py:471] Config param prefill_cache_dir: 
I0423 21:27:40.701775 139795023222592 pyconfig.py:471] Config param prefill_chunk_size: 256
I0423 21:27:40.701791 139795023222592 pyconfig.py:471] Config param prefill_slice: v5e-16
I0423 21:27:40.701807 139795023222592 pyconfig.py:471] Config param prefix_caching_dram_byte: 100000000000
I0423 21:27:40.701822 139795023222592 pyconfig.py:471] Config param prefix_caching_hbm_byte: 10000000000
I0423 21:27:40.701837 139795023222592 pyconfig.py:471] Config param profile_cleanly: True
I0423 21:27:40.701853 139795023222592 pyconfig.py:471] Config param profile_periodically_period: -1
I0423 21:27:40.701869 139795023222592 pyconfig.py:471] Config param profile_power_events: False
I0423 21:27:40.701884 139795023222592 pyconfig.py:471] Config param profiler: ProfilerType.NONE
I0423 21:27:40.701902 139795023222592 pyconfig.py:471] Config param profiler_steps: 5
I0423 21:27:40.701918 139795023222592 pyconfig.py:471] Config param projector_dropout_for_vit: 0.0
I0423 21:27:40.701932 139795023222592 pyconfig.py:471] Config param projector_input_dim_for_vit: 4096
I0423 21:27:40.701948 139795023222592 pyconfig.py:471] Config param projector_output_dim_for_vit: 4096
I0423 21:27:40.701962 139795023222592 pyconfig.py:471] Config param prometheus_port: 0
I0423 21:27:40.701978 139795023222592 pyconfig.py:471] Config param prompt: I love to
I0423 21:27:40.701992 139795023222592 pyconfig.py:471] Config param pure_nnx: False
I0423 21:27:40.702008 139795023222592 pyconfig.py:471] Config param pure_nnx_decoder: False
I0423 21:27:40.702023 139795023222592 pyconfig.py:471] Config param q_lora_rank: 0
I0423 21:27:40.702038 139795023222592 pyconfig.py:471] Config param qk_clip_threshold: 100.0
I0423 21:27:40.702053 139795023222592 pyconfig.py:471] Config param qk_nope_head_dim: 128
I0423 21:27:40.702069 139795023222592 pyconfig.py:471] Config param qk_norm_with_scale: True
I0423 21:27:40.702083 139795023222592 pyconfig.py:471] Config param qk_rope_head_dim: 64
I0423 21:27:40.702106 139795023222592 pyconfig.py:471] Config param qkv_proj: RematLocation.REMAT
I0423 21:27:40.702123 139795023222592 pyconfig.py:471] Config param quant_cfg_path: 
I0423 21:27:40.702137 139795023222592 pyconfig.py:471] Config param quantization: QuantizationType.NONE
I0423 21:27:40.702154 139795023222592 pyconfig.py:471] Config param quantization_local_shard_count: 4
I0423 21:27:40.702170 139795023222592 pyconfig.py:471] Config param quantize_kvcache: False
I0423 21:27:40.702186 139795023222592 pyconfig.py:471] Config param query_proj: RematLocation.REMAT
I0423 21:27:40.702200 139795023222592 pyconfig.py:471] Config param query_wa_proj: RematLocation.REMAT
I0423 21:27:40.702216 139795023222592 pyconfig.py:471] Config param ragged_block_size: 256
I0423 21:27:40.702230 139795023222592 pyconfig.py:471] Config param ragged_buffer_factor: -1.0
I0423 21:27:40.702246 139795023222592 pyconfig.py:471] Config param rampup_end_step: 0
I0423 21:27:40.702261 139795023222592 pyconfig.py:471] Config param rampup_samples_per_increment_to_load: None
I0423 21:27:40.702277 139795023222592 pyconfig.py:471] Config param reasoning_end_token: </reasoning>
I0423 21:27:40.702295 139795023222592 pyconfig.py:471] Config param reasoning_start_token: <reasoning>
I0423 21:27:40.702311 139795023222592 pyconfig.py:471] Config param record_internal_nn_metrics: 0
I0423 21:27:40.702325 139795023222592 pyconfig.py:471] Config param remat_policy: full
I0423 21:27:40.702340 139795023222592 pyconfig.py:471] Config param remat_policy_for_vit: minimal
I0423 21:27:40.702355 139795023222592 pyconfig.py:471] Config param remove_size_one_mesh_axis_from_type: True
I0423 21:27:40.702370 139795023222592 pyconfig.py:471] Config param replicate_quant_scale: False
I0423 21:27:40.702384 139795023222592 pyconfig.py:471] Config param replicator_backup_interval_minutes: 0
I0423 21:27:40.702400 139795023222592 pyconfig.py:471] Config param report_heartbeat_metric_for_gcp_monitoring: False
I0423 21:27:40.702416 139795023222592 pyconfig.py:471] Config param report_performance_metric_for_gcp_monitoring: False
I0423 21:27:40.702431 139795023222592 pyconfig.py:471] Config param reshape_q: False
I0423 21:27:40.702446 139795023222592 pyconfig.py:471] Config param return_log_prob: False
I0423 21:27:40.702461 139795023222592 pyconfig.py:471] Config param reuse_example_batch: 0
I0423 21:27:40.702475 139795023222592 pyconfig.py:471] Config param reward_exact_answer: 5.0
I0423 21:27:40.702491 139795023222592 pyconfig.py:471] Config param reward_exact_format_match: 3.0
I0423 21:27:40.702507 139795023222592 pyconfig.py:471] Config param reward_partial_format_match: 0.5
I0423 21:27:40.702522 139795023222592 pyconfig.py:471] Config param reward_ratio_guess_to_answer_high: 0.5
I0423 21:27:40.702538 139795023222592 pyconfig.py:471] Config param reward_ratio_guess_to_answer_low: 0.25
I0423 21:27:40.702552 139795023222592 pyconfig.py:471] Config param reward_white_space_format_match: 1.5
I0423 21:27:40.702568 139795023222592 pyconfig.py:471] Config param rl: {'num_generations': 2, 'num_iterations': 1, 'grpo_beta': 0.08, 'grpo_epsilon': 0.2, 'loss_algo': 'grpo', 'use_agentic_rollout': False, 'max_concurrency': 256, 'off_policy_steps': 0, 'system_prompt': '', 'degenerate_group_masking': True, 'epsilon_high': None}
I0423 21:27:40.702587 139795023222592 pyconfig.py:471] Config param rollout_data_parallelism: -1
I0423 21:27:40.702603 139795023222592 pyconfig.py:471] Config param rollout_expert_parallelism: 1
I0423 21:27:40.702619 139795023222592 pyconfig.py:471] Config param rollout_micro_batch_size: -1
I0423 21:27:40.702634 139795023222592 pyconfig.py:471] Config param rollout_tensor_parallelism: -1
I0423 21:27:40.702649 139795023222592 pyconfig.py:471] Config param rope_attention_scaling: False
I0423 21:27:40.702664 139795023222592 pyconfig.py:471] Config param rope_factor: 40
I0423 21:27:40.702678 139795023222592 pyconfig.py:471] Config param rope_interleave: True
I0423 21:27:40.702694 139795023222592 pyconfig.py:471] Config param rope_linear_scaling_factor: 1.0
I0423 21:27:40.702708 139795023222592 pyconfig.py:471] Config param rope_max_timescale: 10000
I0423 21:27:40.702723 139795023222592 pyconfig.py:471] Config param rope_min_timescale: 1
I0423 21:27:40.702739 139795023222592 pyconfig.py:471] Config param rope_theta_for_vit: 10000
I0423 21:27:40.702753 139795023222592 pyconfig.py:471] Config param rope_truncate: True
I0423 21:27:40.702769 139795023222592 pyconfig.py:471] Config param rope_type: RopeType.DEFAULT
I0423 21:27:40.702786 139795023222592 pyconfig.py:471] Config param rope_use_scale: True
I0423 21:27:40.702800 139795023222592 pyconfig.py:471] Config param routed_bias: False
I0423 21:27:40.702816 139795023222592 pyconfig.py:471] Config param routed_bias_update_rate: 0.0
I0423 21:27:40.702830 139795023222592 pyconfig.py:471] Config param routed_scaling_factor: 1.0
I0423 21:27:40.702846 139795023222592 pyconfig.py:471] Config param routed_score_func: 
I0423 21:27:40.702860 139795023222592 pyconfig.py:471] Config param run_name: gpt3-52k_2026-04-23-21-27
I0423 21:27:40.702876 139795023222592 pyconfig.py:471] Config param sa_block_kv: 512
I0423 21:27:40.702890 139795023222592 pyconfig.py:471] Config param sa_block_kv_compute: 512
I0423 21:27:40.702906 139795023222592 pyconfig.py:471] Config param sa_block_kv_dkv: 512
I0423 21:27:40.702924 139795023222592 pyconfig.py:471] Config param sa_block_kv_dkv_compute: 512
I0423 21:27:40.702941 139795023222592 pyconfig.py:471] Config param sa_block_kv_dq: 512
I0423 21:27:40.702955 139795023222592 pyconfig.py:471] Config param sa_block_q: 512
I0423 21:27:40.702970 139795023222592 pyconfig.py:471] Config param sa_block_q_dkv: 512
I0423 21:27:40.702986 139795023222592 pyconfig.py:471] Config param sa_block_q_dq: 512
I0423 21:27:40.703000 139795023222592 pyconfig.py:471] Config param sa_k_layout: HEAD_DIM_MINOR
I0423 21:27:40.703014 139795023222592 pyconfig.py:471] Config param sa_q_layout: HEAD_DIM_MINOR
I0423 21:27:40.703030 139795023222592 pyconfig.py:471] Config param sa_use_fused_bwd_kernel: False
I0423 21:27:40.703044 139795023222592 pyconfig.py:471] Config param sa_v_layout: HEAD_DIM_MINOR
I0423 21:27:40.703059 139795023222592 pyconfig.py:471] Config param sampler_devices_fraction: 0.5
I0423 21:27:40.703074 139795023222592 pyconfig.py:471] Config param save_checkpoint_on_completion: True
I0423 21:27:40.703089 139795023222592 pyconfig.py:471] Config param save_config_to_gcs: False
I0423 21:27:40.703111 139795023222592 pyconfig.py:471] Config param save_quantized_params_path: 
I0423 21:27:40.703127 139795023222592 pyconfig.py:471] Config param scale_embedding_for_audio: True
I0423 21:27:40.703141 139795023222592 pyconfig.py:471] Config param scan_layers: True
I0423 21:27:40.703157 139795023222592 pyconfig.py:471] Config param scan_layers_per_stage: False
I0423 21:27:40.703171 139795023222592 pyconfig.py:471] Config param scan_pipeline_iterations: True
I0423 21:27:40.703187 139795023222592 pyconfig.py:471] Config param scan_pipeline_repeats: False
I0423 21:27:40.703201 139795023222592 pyconfig.py:471] Config param set_remat_policy_on_layers_per_stage: False
I0423 21:27:40.703216 139795023222592 pyconfig.py:471] Config param set_remat_policy_on_pipeline_iterations: True
I0423 21:27:40.703231 139795023222592 pyconfig.py:471] Config param sft_train_on_completion_only: False
I0423 21:27:40.703247 139795023222592 pyconfig.py:471] Config param shard_exp_on_fsdp: False
I0423 21:27:40.703261 139795023222592 pyconfig.py:471] Config param shard_mode: ShardMode.AUTO
I0423 21:27:40.703278 139795023222592 pyconfig.py:471] Config param shard_optimizer_over_data: False
I0423 21:27:40.703295 139795023222592 pyconfig.py:471] Config param sharding_strategy: None
I0423 21:27:40.703311 139795023222592 pyconfig.py:471] Config param sharding_tolerance: 0.02
I0423 21:27:40.703325 139795023222592 pyconfig.py:471] Config param shardy: True
I0423 21:27:40.703340 139795023222592 pyconfig.py:471] Config param share_kv_projections: False
I0423 21:27:40.703354 139795023222592 pyconfig.py:471] Config param shared_experts: 0
I0423 21:27:40.703371 139795023222592 pyconfig.py:471] Config param sinkhorn_iterations: 20
I0423 21:27:40.703385 139795023222592 pyconfig.py:471] Config param skip_first_n_steps_for_profiler: 1
I0423 21:27:40.703400 139795023222592 pyconfig.py:471] Config param skip_jax_distributed_system: False
I0423 21:27:40.703414 139795023222592 pyconfig.py:471] Config param skip_step_interval: 128
I0423 21:27:40.703430 139795023222592 pyconfig.py:471] Config param skip_step_on_spikes: False
I0423 21:27:40.703445 139795023222592 pyconfig.py:471] Config param skip_step_scaling_factor: 6.0
I0423 21:27:40.703461 139795023222592 pyconfig.py:471] Config param sliding_window_size: 0
I0423 21:27:40.703475 139795023222592 pyconfig.py:471] Config param solution_end_token: </answer>
I0423 21:27:40.703490 139795023222592 pyconfig.py:471] Config param solution_start_token: <answer>
I0423 21:27:40.703505 139795023222592 pyconfig.py:471] Config param source_checkpoint_layout: orbax
I0423 21:27:40.703520 139795023222592 pyconfig.py:471] Config param sparse_matmul: True
I0423 21:27:40.703535 139795023222592 pyconfig.py:471] Config param spatial_merge_size_for_vit: 2
I0423 21:27:40.703549 139795023222592 pyconfig.py:471] Config param stack_prefill_result_cache: False
I0423 21:27:40.703565 139795023222592 pyconfig.py:471] Config param stack_trace_interval_seconds: 600
I0423 21:27:40.703580 139795023222592 pyconfig.py:471] Config param stack_trace_to_cloud: False
I0423 21:27:40.703595 139795023222592 pyconfig.py:471] Config param step_deviation_interval_seconds: 30
I0423 21:27:40.703610 139795023222592 pyconfig.py:471] Config param steps: 200000
I0423 21:27:40.703626 139795023222592 pyconfig.py:471] Config param stop_strings: None
I0423 21:27:40.703640 139795023222592 pyconfig.py:471] Config param student_overrides: {'model_name': 'llama3.1-8b'}
I0423 21:27:40.703656 139795023222592 pyconfig.py:471] Config param student_params_to_update: None
I0423 21:27:40.703672 139795023222592 pyconfig.py:471] Config param subslice_shape: 
I0423 21:27:40.703687 139795023222592 pyconfig.py:471] Config param swap_space_vllm_gb: 2
I0423 21:27:40.703702 139795023222592 pyconfig.py:471] Config param system_prompt: 
I0423 21:27:40.703718 139795023222592 pyconfig.py:471] Config param target_eval_loss: 0.0
I0423 21:27:40.703733 139795023222592 pyconfig.py:471] Config param teacher_overrides: {'model_name': 'llama3.1-8b'}
I0423 21:27:40.703749 139795023222592 pyconfig.py:471] Config param temperature_tuning: False
I0423 21:27:40.703763 139795023222592 pyconfig.py:471] Config param temporal_patch_size_for_vit: 2
I0423 21:27:40.703778 139795023222592 pyconfig.py:471] Config param tensorboard_dir: /deps/maxtext_output/gpt3-52k_2026-04-23-21-27/tensorboard/
I0423 21:27:40.703794 139795023222592 pyconfig.py:471] Config param tensors_on_device: None
I0423 21:27:40.703810 139795023222592 pyconfig.py:471] Config param tensors_to_offload: None
I0423 21:27:40.703824 139795023222592 pyconfig.py:471] Config param test_batch_start_index: 0
I0423 21:27:40.703840 139795023222592 pyconfig.py:471] Config param tile_size_for_vit: 336
I0423 21:27:40.703854 139795023222592 pyconfig.py:471] Config param tokenize_eval_data: True
I0423 21:27:40.703870 139795023222592 pyconfig.py:471] Config param tokenize_train_data: True
I0423 21:27:40.703884 139795023222592 pyconfig.py:471] Config param tokenizer_path: meta-llama/Llama-3.1-8B
I0423 21:27:40.703899 139795023222592 pyconfig.py:471] Config param tokenizer_type: TokenizerType.HUGGINGFACE
I0423 21:27:40.703916 139795023222592 pyconfig.py:471] Config param topk_routing_group: -1
I0423 21:27:40.703931 139795023222592 pyconfig.py:471] Config param train_data_columns: ['text']
I0423 21:27:40.703948 139795023222592 pyconfig.py:471] Config param train_fraction: 1.0
I0423 21:27:40.703963 139795023222592 pyconfig.py:471] Config param train_image_column: image
I0423 21:27:40.703979 139795023222592 pyconfig.py:471] Config param train_micro_batch_size: -1
I0423 21:27:40.703995 139795023222592 pyconfig.py:471] Config param train_split: train
I0423 21:27:40.704010 139795023222592 pyconfig.py:471] Config param trainable_parameters_mask: []
I0423 21:27:40.704025 139795023222592 pyconfig.py:471] Config param trainable_position_size: 2048
I0423 21:27:40.704039 139795023222592 pyconfig.py:471] Config param trainer_devices_fraction: 0.5
I0423 21:27:40.704054 139795023222592 pyconfig.py:471] Config param upload_all_profiler_results: False
I0423 21:27:40.704069 139795023222592 pyconfig.py:471] Config param use_2d_fsdp_sharding: False
I0423 21:27:40.704086 139795023222592 pyconfig.py:471] Config param use_agentic_rollout: False
I0423 21:27:40.704115 139795023222592 pyconfig.py:471] Config param use_audio: False
I0423 21:27:40.704129 139795023222592 pyconfig.py:471] Config param use_audio_in_video: False
I0423 21:27:40.704144 139795023222592 pyconfig.py:471] Config param use_batch_split_schedule: False
I0423 21:27:40.704160 139795023222592 pyconfig.py:471] Config param use_chat_template: False
I0423 21:27:40.704176 139795023222592 pyconfig.py:471] Config param use_chunked_prefill: False
I0423 21:27:40.704190 139795023222592 pyconfig.py:471] Config param use_custom_sort_vjp: True
I0423 21:27:40.704205 139795023222592 pyconfig.py:471] Config param use_dpo: False
I0423 21:27:40.704220 139795023222592 pyconfig.py:471] Config param use_gather_mosaic_kernel: False
I0423 21:27:40.704234 139795023222592 pyconfig.py:471] Config param use_grpo: True
I0423 21:27:40.704249 139795023222592 pyconfig.py:471] Config param use_indexer: False
I0423 21:27:40.704265 139795023222592 pyconfig.py:471] Config param use_iota_embed: True
I0423 21:27:40.704281 139795023222592 pyconfig.py:471] Config param use_jax_splash: False
I0423 21:27:40.704299 139795023222592 pyconfig.py:471] Config param use_max_logit_estimate: -1
I0423 21:27:40.704314 139795023222592 pyconfig.py:471] Config param use_mrope: False
I0423 21:27:40.704329 139795023222592 pyconfig.py:471] Config param use_multimodal: False
I0423 21:27:40.704344 139795023222592 pyconfig.py:471] Config param use_nnx_pipeline: False
I0423 21:27:40.704359 139795023222592 pyconfig.py:471] Config param use_pathways: True
I0423 21:27:40.704375 139795023222592 pyconfig.py:471] Config param use_post_attn_norm: False
I0423 21:27:40.704390 139795023222592 pyconfig.py:471] Config param use_post_ffw_norm: False
I0423 21:27:40.704405 139795023222592 pyconfig.py:471] Config param use_qk_clip: False
I0423 21:27:40.704420 139795023222592 pyconfig.py:471] Config param use_qk_norm: False
I0423 21:27:40.704436 139795023222592 pyconfig.py:471] Config param use_qk_norm_in_gdn: True
I0423 21:27:40.704450 139795023222592 pyconfig.py:471] Config param use_qwix_quantization: False
I0423 21:27:40.704465 139795023222592 pyconfig.py:471] Config param use_ragged_attention: False
I0423 21:27:40.704480 139795023222592 pyconfig.py:471] Config param use_random_routing: False
I0423 21:27:40.704494 139795023222592 pyconfig.py:471] Config param use_replicator_service: False
I0423 21:27:40.704510 139795023222592 pyconfig.py:471] Config param use_ring_of_experts: False
I0423 21:27:40.704524 139795023222592 pyconfig.py:471] Config param use_sft: False
I0423 21:27:40.704541 139795023222592 pyconfig.py:471] Config param use_splash_scheduler: False
I0423 21:27:40.704555 139795023222592 pyconfig.py:471] Config param use_tokamax_gmm: False
I0423 21:27:40.704570 139795023222592 pyconfig.py:471] Config param use_tokamax_splash: False
I0423 21:27:40.704585 139795023222592 pyconfig.py:471] Config param use_truncation: True
I0423 21:27:40.704600 139795023222592 pyconfig.py:471] Config param use_tunix_gradient_accumulation: False
I0423 21:27:40.704615 139795023222592 pyconfig.py:471] Config param use_untrainable_positional_embedding: False
I0423 21:27:40.704631 139795023222592 pyconfig.py:471] Config param use_vertex_tensorboard: False
I0423 21:27:40.704645 139795023222592 pyconfig.py:471] Config param using_pipeline_parallelism: False
I0423 21:27:40.704661 139795023222592 pyconfig.py:471] Config param v_head_dim: 128
I0423 21:27:40.704676 139795023222592 pyconfig.py:471] Config param v_norm_with_scale: True
I0423 21:27:40.704693 139795023222592 pyconfig.py:471] Config param value_proj: RematLocation.REMAT
I0423 21:27:40.704708 139795023222592 pyconfig.py:471] Config param vertex_tensorboard_project: 
I0423 21:27:40.704724 139795023222592 pyconfig.py:471] Config param vertex_tensorboard_region: 
I0423 21:27:40.704738 139795023222592 pyconfig.py:471] Config param video_path: 
I0423 21:27:40.704754 139795023222592 pyconfig.py:471] Config param video_placeholder: <|video|>
I0423 21:27:40.704769 139795023222592 pyconfig.py:471] Config param vision_output_dim_for_vit: 4096
I0423 21:27:40.704784 139795023222592 pyconfig.py:471] Config param vision_output_length: -1
I0423 21:27:40.704799 139795023222592 pyconfig.py:471] Config param vllm_additional_config: {}
I0423 21:27:40.704815 139795023222592 pyconfig.py:471] Config param vllm_hf_config_path: 
I0423 21:27:40.704830 139795023222592 pyconfig.py:471] Config param vllm_hf_overrides: {}
I0423 21:27:40.704846 139795023222592 pyconfig.py:471] Config param vocab_size: 32000
I0423 21:27:40.704860 139795023222592 pyconfig.py:471] Config param warmup_steps_fraction: 0.1
I0423 21:27:40.704876 139795023222592 pyconfig.py:471] Config param weight_dtype: float32
I0423 21:27:40.704901 139795023222592 pyconfig.py:471] Config param weight_quantization_calibration_method: absmax
I0423 21:27:40.704916 139795023222592 pyconfig.py:471] Config param wi_tile_dlhs_batch_seq: 512
I0423 21:27:40.704932 139795023222592 pyconfig.py:471] Config param wi_tile_dlhs_embed_dim: 1024
I0423 21:27:40.704947 139795023222592 pyconfig.py:471] Config param wi_tile_dlhs_mlp_dim: 1024
I0423 21:27:40.704962 139795023222592 pyconfig.py:471] Config param wi_tile_drhs_batch_seq: 512
I0423 21:27:40.704978 139795023222592 pyconfig.py:471] Config param wi_tile_drhs_embed_dim: 1024
I0423 21:27:40.704995 139795023222592 pyconfig.py:471] Config param wi_tile_drhs_mlp_dim: 1024
I0423 21:27:40.705010 139795023222592 pyconfig.py:471] Config param wi_tile_fwd_batch_seq: 512
I0423 21:27:40.705024 139795023222592 pyconfig.py:471] Config param wi_tile_fwd_embed_dim: 1024
I0423 21:27:40.705039 139795023222592 pyconfig.py:471] Config param wi_tile_fwd_mlp_dim: 1024
I0423 21:27:40.705054 139795023222592 pyconfig.py:471] Config param wo_tile_dlhs_batch_seq: 512
I0423 21:27:40.705068 139795023222592 pyconfig.py:471] Config param wo_tile_dlhs_embed_dim: 1024
I0423 21:27:40.705084 139795023222592 pyconfig.py:471] Config param wo_tile_dlhs_mlp_dim: 1024
I0423 21:27:40.705113 139795023222592 pyconfig.py:471] Config param wo_tile_drhs_batch_seq: 512
I0423 21:27:40.705127 139795023222592 pyconfig.py:471] Config param wo_tile_drhs_embed_dim: 1024
I0423 21:27:40.705142 139795023222592 pyconfig.py:471] Config param wo_tile_drhs_mlp_dim: 1024
I0423 21:27:40.705157 139795023222592 pyconfig.py:471] Config param wo_tile_fwd_batch_seq: 512
I0423 21:27:40.705172 139795023222592 pyconfig.py:471] Config param wo_tile_fwd_embed_dim: 1024
I0423 21:27:40.705187 139795023222592 pyconfig.py:471] Config param wo_tile_fwd_mlp_dim: 1024
I0423 21:27:40.705202 139795023222592 pyconfig.py:471] Config param wsd_decay_steps_fraction: 0.1
I0423 21:27:40.705218 139795023222592 pyconfig.py:471] Config param wsd_decay_style: WsdDecayStyle.LINEAR
I0423 21:27:40.705236 139795023222592 pyconfig.py:471] Config param xprof_e2e_enable_fw_power_level_event: False
I0423 21:27:40.705252 139795023222592 pyconfig.py:471] Config param xprof_e2e_enable_fw_thermal_event: False
I0423 21:27:40.705267 139795023222592 pyconfig.py:471] Config param xprof_e2e_enable_fw_throttle_event: False
I0423 21:27:40.705282 139795023222592 pyconfig.py:471] Config param xprof_tpu_power_trace_level: 0
I0423 21:27:40.705302 139795023222592 pyconfig.py:471] Config param z_loss_multiplier: 0.0
I0423 21:27:40.705614 139795023222592 tokenizer.py:245] Tokenizer path: meta-llama/Llama-2-7b-chat-hf
I0423 21:27:40.705650 139795023222592 tokenizer.py:224] Loading HF tokenizer: meta-llama/Llama-2-7b-chat-hf
I0423 21:27:44.337869 139795023222592 _schedule.py:129] A polynomial schedule was set with a non-positive `transition_steps` value; this results in a constant schedule with value `init_value`.
I0423 21:27:44.340978 139795023222592 maxtext_utils.py:1565] Num_devices: 32, shape (1, 4, 1, 8, 1, 1, 1, 1, 1, 1, 1, 1, 1)
I0423 21:27:44.341133 139795023222592 train_distill.py:596] Applying logical axis rules for model initialization and training...
I0423 21:27:44.341216 139795023222592 train_distill.py:600] Loading Student from ...
I0423 21:27:44.341248 139795023222592 train_distill.py:169] --- Student Configuration ---
I0423 21:27:44.341273 139795023222592 train_distill.py:170]   Model Name:      gpt3-52k
I0423 21:27:44.341297 139795023222592 train_distill.py:171]   Dimensions:      1 Layers, 16 Emb Dim, 8 Head Dim
I0423 21:27:44.341321 139795023222592 train_distill.py:174]   Attention Heads: 2 Query, 2 KV
I0423 21:27:44.341344 139795023222592 train_distill.py:175]   Vocab Size:      32000
I0423 21:27:44.341365 139795023222592 train_distill.py:176]   Checkpoint:      
I0423 21:27:44.341386 139795023222592 train_distill.py:465] Initializing model: gpt3-52k...
I0423 21:27:45.747084 139795023222592 train_distill.py:614] Loading Teacher from gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items...
I0423 21:27:45.747214 139795023222592 train_distill.py:169] --- Teacher Configuration ---
I0423 21:27:45.747244 139795023222592 train_distill.py:170]   Model Name:      gpt3-52k
I0423 21:27:45.747269 139795023222592 train_distill.py:171]   Dimensions:      1 Layers, 16 Emb Dim, 8 Head Dim
I0423 21:27:45.747291 139795023222592 train_distill.py:174]   Attention Heads: 2 Query, 2 KV
I0423 21:27:45.747308 139795023222592 train_distill.py:175]   Vocab Size:      32000
I0423 21:27:45.747328 139795023222592 train_distill.py:176]   Checkpoint:      gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items
I0423 21:27:45.747346 139795023222592 train_distill.py:465] Initializing model: gpt3-52k...
I0423 21:27:46.829073 139795023222592 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0423 21:27:46.829541 139795023222592 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=True, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7f23d9161850>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0423 21:27:46.829599 139795023222592 abstract_checkpointer.py:35] orbax-checkpoint version: 0.11.28
W0423 21:27:47.765940 139795023222592 checkpoint.py:202] Metadata file does not exist: gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items/_CHECKPOINT_METADATA
I0423 21:27:48.281657    2089 google_auth_provider.cc:181] Running on GCE, using service account 562977990677-compute@developer.gserviceaccount.com
I0423 21:27:49.434650 139795023222592 checkpointer.py:304] Restoring checkpoint from gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items.
W0423 21:27:51.698165 139795023222592 transform_utils.py:230] The transformations API will eventually be replaced by an upgraded design. The current API will not be removed until this point, but it will no longer be actively worked on.
I0423 21:27:51.698543 139795023222592 transform_utils.py:288] The following keys are not loaded from the original tree after applying specified transforms: params/params/decoder/to_nnx__rngs/aqt/count, params/params/decoder/to_nnx__rngs/aqt/key, params/params/decoder/to_nnx__rngs/dropout/count, params/params/decoder/to_nnx__rngs/dropout/key, params/params/decoder/to_nnx__rngs/params/count, params/params/decoder/to_nnx__rngs/params/key
I0423 21:27:52.328823 139795023222592 checkpointer.py:318] Finished restoring checkpoint in 3.27 seconds from gs://lance-maxtext/pt_seed_ckpts/pt_seed_ckpts/pt_seed_ckpt_gpt352k_v32k_linen/checkpoints/4/items.
I0423 21:27:53.014544 139795023222592 train_distill.py:640] Initializing Data Iterators via MaxText pipeline...
I0423 21:27:53.077692 139795023222592 config.py:112] TensorFlow version 2.20.0 available.
I0423 21:27:53.078273 139795023222592 config.py:125] JAX version 0.8.3 available.
E0423 21:27:55.162345 139795023222592 packing.py:209] PackAndBatchOperation is deprecated. Please use lazy_dataset.FirstFitPackIterDataset instead.
I0423 21:27:55.162578 139795023222592 data_loader.py:408] Adding CopyNumPyArrayToSharedMemory MapTransform.
I0423 21:27:55.165670 139795023222592 train_distill.py:410] Input Pipeline Checkpointing: DISABLED
I0423 21:27:55.165733 139795023222592 train_distill.py:414] Reason: Iterator 'MultiHostDataLoadIterator' is not recognized as Grain (dataset_type='DatasetType.HF', has_save=False)
I0423 21:27:55.165794 139795023222592 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0423 21:27:55.165871 139795023222592 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=False, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7f23d9161850>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0423 21:27:55.165911 139795023222592 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0423 21:27:55.165942 139795023222592 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=False, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7f23d9161850>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0423 21:27:55.165986 139795023222592 checkpoint_manager.py:702] [process=6][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=None, item_handlers={'model_params': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f19cd698f50>, 'optimizer_state': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f1c2c0ba930>, 'custom_metadata': <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d51a3be30>}, handler_registry=None
I0423 21:27:55.166191 139795023222592 composite_checkpoint_handler.py:237] Deferred registration for item: "model_params". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f19cd698f50>` for item "model_params" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0423 21:27:55.166235 139795023222592 composite_checkpoint_handler.py:237] Deferred registration for item: "optimizer_state". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f1c2c0ba930>` for item "optimizer_state" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0423 21:27:55.166261 139795023222592 composite_checkpoint_handler.py:237] Deferred registration for item: "custom_metadata". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d51a3be30>` for item "custom_metadata" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0423 21:27:55.166285 139795023222592 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d51a3bb30>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0423 21:27:55.166312 139795023222592 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('model_params', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f19cd698f50>, ('model_params', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f19cd698f50>, ('optimizer_state', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f1c2c0ba930>, ('optimizer_state', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f1c2c0ba930>, ('custom_metadata', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d51a3be30>, ('custom_metadata', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d51a3be30>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d51a3bb30>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d51a3bb30>}).
I0423 21:27:55.166723 139795023222592 async_checkpointer.py:177] [process=6][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7f0d5189b1a0> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0423 21:27:58.478428 139795023222592 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://lance-maxtext/pt_ckpt_xpk_test_pipeline_scan_nnx_20260423_211626/pt_distill_linen_xpk_test_pipeline_scan_nnx_20260423_211626_07_distill_smoke/checkpoints
I0423 21:27:58.484536 139795023222592 checkpoint_manager.py:921] [process=6][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=2000, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=None, preservation_policy=None, prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://lance-maxtext/pt_ckpt_xpk_test_pipeline_scan_nnx_20260423_211626/pt_distill_linen_xpk_test_pipeline_scan_nnx_20260423_211626_07_distill_smoke/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7f0d51a3be00>
I0423 21:27:58.484651 139795023222592 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0423 21:27:58.484715 139795023222592 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=False, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7f23d9161850>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0423 21:27:58.484750 139795023222592 pytree_checkpoint_handler.py:577] save_device_host_concurrent_bytes=None
I0423 21:27:58.484781 139795023222592 base_pytree_checkpoint_handler.py:411] Created BasePyTreeCheckpointHandler: use_ocdbt=True, use_zarr3=False, pytree_metadata_options=PyTreeMetadataOptions(support_rich_types=False), array_metadata_store=<orbax.checkpoint._src.metadata.array_metadata_store.Store object at 0x7f23d9161850>, enable_pinned_host_transfer=False, save_concurrent_bytes: 96000000000 (89.4 GiB), restore_concurrent_bytes: 96000000000 (89.4 GiB)
I0423 21:27:58.484815 139795023222592 checkpoint_manager.py:1983] [process=6][thread=MainThread][wait_until_finished] No Save Finalize thread to wait for. Returning.
I0423 21:27:58.484865 139795023222592 checkpoint.py:459] Closing _NonBlockingMetadataStore(enable_write=True, _write_lock=<locked _thread.RLock object owner=139795023222592 count=1 at 0x7f1c882a2740>, _store_impl=<orbax.checkpoint._src.metadata.checkpoint._MetadataStoreImpl object at 0x7f0d51a3bc50>, _single_thread_executor=<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f0d51a3bc20>, _write_futures=[])
I0423 21:27:58.485232 139795023222592 checkpoint.py:459] Closing _NonBlockingMetadataStore(enable_write=True, _write_lock=<locked _thread.RLock object owner=139795023222592 count=1 at 0x7f1c882a2740>, _store_impl=<orbax.checkpoint._src.metadata.checkpoint._MetadataStoreImpl object at 0x7f0d51a3bc50>, _single_thread_executor=<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f0d51a3bc20>, _write_futures=[])
I0423 21:27:58.485260 139795023222592 checkpoint.py:459] Closing _NonBlockingMetadataStore(enable_write=True, _write_lock=<locked _thread.RLock object owner=139795023222592 count=1 at 0x7f1c882a2740>, _store_impl=<orbax.checkpoint._src.metadata.checkpoint._MetadataStoreImpl object at 0x7f0d51a3bc50>, _single_thread_executor=<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f0d51a3bc20>, _write_futures=[])
I0423 21:27:58.485291 139795023222592 checkpoint_manager.py:702] [process=6][thread=MainThread] CheckpointManager init: checkpointers=None, item_names=None, item_handlers={'model_params': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f0d51a3bdd0>, 'optimizer_state': <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f0d51916b70>, 'custom_metadata': <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d51916ae0>, 'iter': <maxtext.common.checkpointing.GrainCheckpointHandler object at 0x7f0d519164e0>}, handler_registry=None
I0423 21:27:58.485389 139795023222592 composite_checkpoint_handler.py:237] Deferred registration for item: "model_params". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f0d51a3bdd0>` for item "model_params" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0423 21:27:58.485423 139795023222592 composite_checkpoint_handler.py:237] Deferred registration for item: "optimizer_state". Adding handler `<orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f0d51916b70>` for item "optimizer_state" and save args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>` to `_handler_registry`.
I0423 21:27:58.485445 139795023222592 composite_checkpoint_handler.py:237] Deferred registration for item: "custom_metadata". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d51916ae0>` for item "custom_metadata" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0423 21:27:58.485472 139795023222592 composite_checkpoint_handler.py:237] Deferred registration for item: "iter". Adding handler `<maxtext.common.checkpointing.GrainCheckpointHandler object at 0x7f0d519164e0>` for item "iter" and save args `<class 'maxtext.common.checkpointing.GrainCheckpointSave'>` and restore args `<class 'maxtext.common.checkpointing.GrainCheckpointRestore'>` to `_handler_registry`.
I0423 21:27:58.485495 139795023222592 composite_checkpoint_handler.py:237] Deferred registration for item: "metrics". Adding handler `<orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d519167e0>` for item "metrics" and save args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>` and restore args `<class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>` to `_handler_registry`.
I0423 21:27:58.485523 139795023222592 composite_checkpoint_handler.py:505] Initialized registry DefaultCheckpointHandlerRegistry({('model_params', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f0d51a3bdd0>, ('model_params', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f0d51a3bdd0>, ('optimizer_state', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeSaveArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f0d51916b70>, ('optimizer_state', <class 'orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeRestoreArgs'>): <orbax.checkpoint._src.handlers.pytree_checkpoint_handler.PyTreeCheckpointHandler object at 0x7f0d51916b70>, ('custom_metadata', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d51916ae0>, ('custom_metadata', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d51916ae0>, ('iter', <class 'maxtext.common.checkpointing.GrainCheckpointSave'>): <maxtext.common.checkpointing.GrainCheckpointHandler object at 0x7f0d519164e0>, ('iter', <class 'maxtext.common.checkpointing.GrainCheckpointRestore'>): <maxtext.common.checkpointing.GrainCheckpointHandler object at 0x7f0d519164e0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonSaveArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d519167e0>, ('metrics', <class 'orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonRestoreArgs'>): <orbax.checkpoint._src.handlers.json_checkpoint_handler.JsonCheckpointHandler object at 0x7f0d519167e0>}).
I0423 21:27:58.485593 139795023222592 async_checkpointer.py:177] [process=6][thread=MainThread] Using barrier_sync_fn: <function get_barrier_sync_fn.<locals>._fn at 0x7f0d5189b2e0> timeout: 600 secs and primary_host=0 for async checkpoint writes
I0423 21:27:58.864593 139795023222592 checkpoint_manager.py:1788] Found 0 checkpoint steps in gs://lance-maxtext/pt_ckpt_xpk_test_pipeline_scan_nnx_20260423_211626/pt_distill_linen_xpk_test_pipeline_scan_nnx_20260423_211626_07_distill_smoke/checkpoints
I0423 21:27:59.303068 139795023222592 checkpoint_manager.py:921] [process=6][thread=MainThread] CheckpointManager created,  primary_host=0, CheckpointManagerOptions=CheckpointManagerOptions(save_interval_steps=2000, max_to_keep=None, keep_time_interval=None, keep_period=None, should_keep_fn=None, best_fn=None, best_mode='max', keep_checkpoints_without_metrics=True, step_prefix=None, step_format_fixed_length=None, step_name_format=None, create=True, cleanup_tmp_directories=False, save_on_steps=frozenset(), single_host_load_and_broadcast=False, todelete_subdir=None, todelete_full_path=None, enable_hns=False, enable_background_delete=False, read_only=False, enable_async_checkpointing=True, async_options=None, multiprocessing_options=MultiprocessingOptions(primary_host=0, active_processes=None, barrier_sync_key_prefix=None), should_save_fn=None, file_options=FileOptions(path_permission_mode=None), save_root_metadata=True, temporary_path_class=None, save_decision_policy=None, preservation_policy=None, prevent_write_metrics=False, enable_should_save_is_saving_in_progress_check=True, enable_per_process_directory_creation=False), root_directory=gs://lance-maxtext/pt_ckpt_xpk_test_pipeline_scan_nnx_20260423_211626/pt_distill_linen_xpk_test_pipeline_scan_nnx_20260423_211626_07_distill_smoke/checkpoints: <orbax.checkpoint.checkpoint_manager.CheckpointManager object at 0x7f1c88279e80>
I0423 21:27:59.303677 139795023222592 train_distill.py:691] Starting Distillation Training...
I0423 21:27:59.303786 139795023222592 peft_trainer.py:590] Training with mesh: Mesh('diloco': 1, 'data': 4, 'stage': 1, 'fsdp': 8, 'fsdp_transpose': 1, 'sequence': 1, 'context': 1, 'context_autoregressive': 1, 'tensor': 1, 'tensor_transpose': 1, 'tensor_sequence': 1, 'expert': 1, 'autoregressive': 1, axis_types=(Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto, Auto))
I0423 21:27:59.656548 139795023222592 peft_trainer.py:600] Compiled train_step cache size: 0

Training:   0%|          | 0/5 [00:00<?, ?step/s]I0423 21:27:59.658350 139654208677632 grain_pool.py:367] Grain pool will use 1 processes.
I0423 21:28:00.061686 139654208677632 grain_pool.py:440] Grain pool will start child processes.
I0423 21:28:00.067001 139654208677632 grain_pool.py:448] Grain pool started all child processes.
2026-04-23 21:28:06.109800: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
Unrecognized keys in `rope_scaling` for 'rope_type'='yarn': {'rope_theta'}
`rope_scaling`'s factor field must be a float >= 1, got 40
`rope_scaling`'s beta_fast field must be a float, got 32
`rope_scaling`'s beta_slow field must be a float, got 1
Unrecognized keys in `rope_scaling` for 'rope_type'='yarn': {'rope_theta'}
Unrecognized keys in `rope_scaling` for 'rope_type'='yarn': {'rope_theta'}
Unrecognized keys in `rope_scaling` for 'rope_type'='yarn': {'rope_theta'}
I0423 21:28:09.471296 139795023222592 utils.py:86] Train loop finished in: 9.8142 seconds
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/deps/src/maxtext/trainers/post_train/distillation/train_distill.py", line 765, in <module>
    app.run(main)
  File "/usr/local/lib/python3.12/site-packages/absl/app.py", line 316, in run
    _run_main(main, args)
  File "/usr/local/lib/python3.12/site-packages/absl/app.py", line 261, in _run_main
    sys.exit(main(argv))
             ^^^^^^^^^^
  File "/deps/src/maxtext/trainers/post_train/distillation/train_distill.py", line 761, in main
    train_distill(student_config, teacher_config, is_offline, global_config.offline_data_dir)
  File "/deps/src/maxtext/trainers/post_train/distillation/train_distill.py", line 693, in train_distill
    trainer.train(train_iter, eval_iter)
  File "/usr/local/lib/python3.12/site-packages/tunix/sft/peft_trainer.py", line 659, in train
    train_example = sharding_utils.shard_input(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tunix/sft/sharding_utils.py", line 58, in shard_input
    return jax.tree.map(
           ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/tree.py", line 155, in map
    return tree_util.tree_map(f, tree, *rest, is_leaf=is_leaf)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/tree_util.py", line 362, in tree_map
    return treedef.unflatten(f(*xs) for xs in zip(*all_leaves))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/tree_util.py", line 362, in <genexpr>
    return treedef.unflatten(f(*xs) for xs in zip(*all_leaves))
                             ^^^^^^
  File "/usr/local/lib/python3.12/site-packages/tunix/sft/sharding_utils.py", line 59, in <lambda>
    lambda x: jax.make_array_from_process_local_data(
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/array.py", line 986, in make_array_from_process_local_data
    out = [_array_from_process_local_data(data, s, shape)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/array.py", line 1048, in _array_from_process_local_data
    return make_array_from_callback(global_shape, sharding, cb)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/array.py", line 845, in make_array_from_callback
    per_device_values = api.device_put(per_device_values, devices)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/api.py", line 2729, in device_put
    out_flat = dispatch._batched_device_put_impl(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/dispatch.py", line 558, in _batched_device_put_impl
    y = _device_put_impl(x, device=device, src=src, copy=cp, aval=aval)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/dispatch.py", line 545, in _device_put_impl
    return _device_put_sharding_impl(x, aval, device, copy)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/jax/_src/dispatch.py", line 487, in _device_put_sharding_impl
    raise ValueError(
ValueError: device_put's first argument must be a fully addressable array, but got value with devices {TpuDevice(id=13, process_index=2, coords=(1,3,0), core_on_chip=0), TpuDevice(id=3, process_index=1, coords=(3,0,0), core_on_chip=0), TpuDevice(id=2, process_index=1, coords=(2,0,0), core_on_chip=0), TpuDevice(id=22, process_index=5, coords=(2,5,0), core_on_chip=0), TpuDevice(id=6, process_index=1, coords=(2,1,0), core_on_chip=0), TpuDevice(id=0, process_index=0, coords=(0,0,0), core_on_chip=0), TpuDevice(id=9, process_index=2, coords=(1,2,0), core_on_chip=0), TpuDevice(id=1, process_index=0, coords=(1,0,0), core_on_chip=0), TpuDevice(id=18, process_index=5, coords=(2,4,0), core_on_chip=0), TpuDevice(id=24, process_index=6, coords=(0,6,0), core_on_chip=0), TpuDevice(id=4, process_index=0, coords=(0,1,0), core_on_chip=0), TpuDevice(id=15, process_index=3, coords=(3,3,0), core_on_chip=0), TpuDevice(id=27, process_index=7, coords=(3,6,0), core_on_chip=0), TpuDevice(id=11, process_index=3, coords=(3,2,0), core_on_chip=0), TpuDevice(id=25, process_index=6, coords=(1,6,0), core_on_chip=0), TpuDevice(id=21, process_index=4, coords=(1,5,0), core_on_chip=0), TpuDevice(id=7, process_index=1, coords=(3,1,0), core_on_chip=0), TpuDevice(id=30, process_index=7, coords=(2,7,0), core_on_chip=0), TpuDevice(id=14, process_index=3, coords=(2,3,0), core_on_chip=0), TpuDevice(id=16, process_index=4, coords=(0,4,0), core_on_chip=0), TpuDevice(id=31, process_index=7, coords=(3,7,0), core_on_chip=0), TpuDevice(id=28, process_index=6, coords=(0,7,0), core_on_chip=0), TpuDevice(id=8, process_index=2, coords=(0,2,0), core_on_chip=0), TpuDevice(id=12, process_index=2, coords=(0,3,0), core_on_chip=0), TpuDevice(id=5, process_index=0, coords=(1,1,0), core_on_chip=0), TpuDevice(id=26, process_index=7, coords=(2,6,0), core_on_chip=0), TpuDevice(id=19, process_index=5, coords=(3,4,0), core_on_chip=0), TpuDevice(id=20, process_index=4, coords=(0,5,0), core_on_chip=0), TpuDevice(id=29, process_index=6, coords=(1,7,0), core_on_chip=0), TpuDevice(id=10, process_index=3, coords=(2,2,0), core_on_chip=0), TpuDevice(id=23, process_index=5, coords=(3,5,0), core_on_chip=0), TpuDevice(id=17, process_index=4, coords=(1,4,0), core_on_chip=0)}
I0423 21:28:09.814153 139654208677632 grain_pool.py:542] Grain pool is exiting.
I0423 21:28:09.814255 139654208677632 grain_pool.py:547] Shutting down multiprocessing system.
I0423 21:28:11.270135 139654208677632 grain_pool.py:547] Shutting down multiprocessing system.

Training:   0%|          | 0/5 [00:13<?, ?step/s]
/usr/local/lib/python3.12/multiprocessing/resource_tracker.py:279: UserWarning: resource_tracker: There appear to be 15 leaked shared_memory objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '
XPK End: Thu Apr 23 21:28:18 UTC 2026
EXIT_CODE=1