# Attributions & Licenses

## ProBERT

Copyright © 2026 Alex Kwon (Collapse Index Labs)

Licensed under the Collapse Index Open Model License v1.0 (see LICENSE.md)

## DistilBERT

**ProBERT is derived from DistilBERT**, a transformer model developed by Hugging Face.

- **Project**: [DistilBERT](https://github.com/huggingface/transformers)
- **Author**: Hugging Face Team
- **License**: Apache 2.0
- **Citation**:
```
Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019).
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter.
arXiv preprint arXiv:1910.01108.
```
### DistilBERT Apache 2.0 License (Summary)

DistilBERT is distributed under the Apache 2.0 license, which permits:

- ✅ Commercial use
- ✅ Modification
- ✅ Distribution
- ✅ Private use
- ⚠️ Requires: attribution, license notice, documentation of changes

Full Apache 2.0 text: https://www.apache.org/licenses/LICENSE-2.0
---

## Dependencies

ProBERT uses the following libraries, all under licenses compatible with Apache 2.0:

- **transformers** (Hugging Face) – Apache 2.0
- **torch** (PyTorch) – BSD
- **numpy** (NumPy Foundation) – BSD
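For readers setting up ProBERT, the dependency list above maps directly onto a minimal requirements file. This is only a sketch: the version bounds below are illustrative assumptions, not taken from the project.

```
# Hypothetical requirements.txt sketch for ProBERT.
# Package names match the dependency list; version pins are assumptions.
transformers>=4.30
torch>=2.0
numpy>=1.24
```

Pinning minimum versions rather than exact ones keeps the file permissive for downstream users, in the same spirit as the Apache-2.0-compatible licensing noted above.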