# Attributions & Licenses
## ProBERT
Copyright © 2026 Alex Kwon (Collapse Index Labs)
Licensed under Collapse Index Open Model License v1.0 (see LICENSE.md)
## DistilBERT
**ProBERT is derived from DistilBERT**, a transformer model developed by Hugging Face.
- **Project**: [DistilBERT](https://github.com/huggingface/transformers)
- **Author**: Hugging Face Team
- **License**: Apache 2.0
- **Citation**:
```
Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019).
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter.
arXiv preprint arXiv:1910.01108.
```
### DistilBERT Apache 2.0 License (Summary)
DistilBERT is distributed under the Apache 2.0 license, which permits:
- ✅ Commercial use
- ✅ Modification
- ✅ Distribution
- ✅ Private use
- ⚠️ Requires: Attribution, license notice, documentation of changes
Full Apache 2.0 text: https://www.apache.org/licenses/LICENSE-2.0
---
## Dependencies
ProBERT uses the following libraries, whose licenses are compatible with Apache 2.0:
- **transformers** (Hugging Face) — Apache 2.0
- **torch** (PyTorch) — BSD-3-Clause
- **numpy** (NumPy Developers) — BSD-3-Clause