Protenij — JAX/Equinox weights for Protenix

This repository hosts JAX/Equinox-converted model weights (and a mirror of the original PyTorch protenix-v2 checkpoint) for use with protenij, a JAX/Equinox translation of Protenix, ByteDance's implementation of the AlphaFold 3 architecture.

The JAX/Equinox weights are format conversions of the original PyTorch checkpoints released by ByteDance — the underlying model parameters are numerically identical, only the serialization format has changed (PyTorch .pt → Equinox .eqx + pickled skeleton).
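
For readers unfamiliar with the Equinox format: the .eqx file stores only the numerical array leaves, and the accompanying .skeleton.pkl carries the module (pytree) structure those leaves are deserialised into. A minimal manual-loading sketch, assuming the skeleton pickle is used that way here (the supported entry point is load_model, shown under Usage below):

import pickle
import equinox as eqx

# Assumption: unpickling the skeleton requires the protenij model code to be importable.
with open("protenix-v2.skeleton.pkl", "rb") as f:
    skeleton = pickle.load(f)  # module structure with placeholder leaves
model = eqx.tree_deserialise_leaves("protenix-v2.eqx", skeleton)  # fill in the converted weights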

Files

| File | Format | Size | Source |
| --- | --- | --- | --- |
| protenix-v2.eqx / protenix-v2.skeleton.pkl | Equinox | 1.86 GB | Converted from protenix-v2.pt |
| protenix-v2.pt | PyTorch | 1.86 GB | Mirror of upstream ByteDance release |
| protenix_base_default_v1.0.0.eqx / .skeleton.pkl | Equinox | | Converted from upstream |
| protenix_base_20250630_v1.0.0.eqx / .skeleton.pkl | Equinox | | Converted from upstream |
| protenix_mini_default_v0.5.0.eqx / .skeleton.pkl | Equinox | | Converted from upstream |
| protenix_tiny_default_v0.5.0.eqx / .skeleton.pkl | Equinox | | Converted from upstream |
| components.v20240608.cif | Data | | CCD chemical components (upstream) |
| components.v20240608.cif.rdkit_mol.pkl | Data | | CCD RDKit Mol cache (upstream) |
| clusters-by-entity-40.txt | Data | | PDB entity-40 clusters (upstream) |

Usage

from protenix.backend import load_model
model = load_model("protenix-v2")  # auto-downloads from this repo
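
If you would rather fetch individual files yourself (for example, to pin a local cache), the standard huggingface_hub client works. The repo_id below is a hypothetical placeholder for this repository's Hub id, not a confirmed value:

from huggingface_hub import hf_hub_download

# "user/protenij" is a placeholder; substitute this repository's actual Hub id.
eqx_path = hf_hub_download(repo_id="user/protenij", filename="protenix-v2.eqx")
skel_path = hf_hub_download(repo_id="user/protenij", filename="protenix-v2.skeleton.pkl")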

License and attribution

Released under the Apache License 2.0, matching the upstream bytedance/Protenix project.

The upstream Protenix README explicitly states:

"The Protenix project including both code and model parameters is released under the Apache 2.0 License. It is free for both academic research and commercial use."

Modification notice (Apache 2.0 §4(b))

The .eqx and .skeleton.pkl files in this repository are format conversions of the original PyTorch checkpoints released by ByteDance. The PyTorch state dicts were loaded and the tensors re-serialized in Equinox format using protenix/backend.py and translate_models.py. No weights were retrained, fine-tuned, or otherwise numerically modified.
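
For illustration only, the core of such a re-serialisation step might look like the sketch below. This is not the actual protenix/backend.py or translate_models.py, and it assumes a flat state dict of numpy-compatible tensors; the real scripts assemble the full protenij Equinox module and also pickle its structure as the .skeleton.pkl file.

import torch
import jax.numpy as jnp
import equinox as eqx

# Load the original checkpoint and convert each tensor to a JAX array (values unchanged).
state_dict = torch.load("protenix-v2.pt", map_location="cpu")
arrays = {k: jnp.asarray(v.detach().cpu().numpy()) for k, v in state_dict.items()}

# eqx.tree_serialise_leaves accepts any pytree of arrays and writes only the leaves.
eqx.tree_serialise_leaves("protenix-v2.eqx", arrays)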

The protenix-v2.pt file in this repository is a bit-for-bit mirror of the original PyTorch checkpoint hosted at https://protenix.tos-cn-beijing.volces.com/checkpoint/protenix-v2.pt (mirrored here after the upstream URL became unreachable).

Copyright notice (Apache 2.0 §4(c))

Copyright 2024 ByteDance and/or its affiliates. The original Protenix code and model parameters were released under Apache License 2.0. See the LICENSE file in this repository for the full license text.

Citations

If you use these weights, please cite the original Protenix work; see the upstream ByteDance Protenix repository for the canonical citation entry.

Disclaimer

These files are provided as-is. The weights are format conversions only — for the authoritative source and for training code, model cards, and technical reports, refer to the upstream ByteDance Protenix repository.
