

Log Summary

XPK Start: Sun Apr 19 10:49:24 UTC 2026
2026-04-19 10:49:28.780494: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1776595768.792925       9 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1776595768.796590       9 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1776595768.807596       9 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776595768.807615       9 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776595768.807617       9 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1776595768.807619       9 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2026-04-19 10:49:47.920750: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
I0419 10:49:48.432175 134498386511680 max_utils.py:273] Attempting to initialize the jax distributed system...
INFO:2026-04-19 10:49:57,473:jax._src.distributed:157: Connecting to JAX distributed service on mt-10-shardy-true-3ybw4-slice-job-0-0.mt-10-shardy-true-3ybw4:8482
I0419 10:49:57.473978 134498386511680 distributed.py:157] Connecting to JAX distributed service on mt-10-shardy-true-3ybw4-slice-job-0-0.mt-10-shardy-true-3ybw4:8482
I0419 10:50:00.163868 134498386511680 max_utils.py:284] Jax distributed system initialized!
I0419 10:50:07.513018 134498386511680 max_utils.py:800] System Information: Jax Version: 0.8.1
I0419 10:50:07.513144 134498386511680 max_utils.py:801] System Information: Jaxlib Version: 0.8.1
I0419 10:50:07.513185 134498386511680 max_utils.py:802] System Information: Jax Backend: PJRT C API
TFRT TPU v6 lite
Built on Nov 12 2025 14:16:36 (1762985796) cl/831091709
I0419 10:50:07.513222 134498386511680 train_utils.py:347] WARNING: Sequence packing is essentially ignored for synthetic data. Please use a real dataset to use sequence packing.
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 727, in <module>
    app.run(main)
  File "/usr/local/lib/python3.12/site-packages/absl/app.py", line 316, in run
    _run_main(main, args)
  File "/usr/local/lib/python3.12/site-packages/absl/app.py", line 261, in _run_main
    sys.exit(main(argv))
             ^^^^^^^^^^
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 723, in main
    train_func()
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 713, in train_func
    run(config, recorder, diagnostic_config)
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 692, in run
    train_loop(config, recorder)
  File "/deps/src/maxtext/trainers/pre_train/train.py", line 517, in train_loop
    ) = train_utils.setup_train_loop(config, recorder)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/deps/src/maxtext/utils/train_utils.py", line 217, in setup_train_loop
    raise NotImplementedError("Pure NNX support has not been implemented yet.")
NotImplementedError: Pure NNX support has not been implemented yet.
XPK End: Sun Apr 19 10:50:14 UTC 2026
EXIT_CODE=1