Attributions & Licenses
ProBERT
Copyright © 2026 Alex Kwon (Collapse Index Labs)
Licensed under the Collapse Index Open Model License v1.0 (see LICENSE.md)
DistilBERT
ProBERT is derived from DistilBERT, a transformer model developed by Hugging Face.
- Project: DistilBERT
- Author: Hugging Face Team
- License: Apache 2.0
- Citation:
Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108.
DistilBERT Apache 2.0 License (Summary)
DistilBERT is distributed under the Apache License, Version 2.0, which permits:
- ✅ Commercial use
- ✅ Modification
- ✅ Distribution
- ✅ Private use
- ⚠️ Conditions: attribution, inclusion of the license and copyright notice, and documentation of changes made
Full Apache 2.0 text: https://www.apache.org/licenses/LICENSE-2.0
Dependencies
ProBERT uses the following libraries, all of whose licenses are compatible with Apache 2.0:
- transformers (Hugging Face) — Apache 2.0
- torch (PyTorch) — BSD-3-Clause
- numpy (NumPy Developers) — BSD-3-Clause