# OTel-LLM-12B-Safety
OTel-LLM-12B-Safety is a telecom-specialized language model fine-tuned on telecommunications domain data. It is part of the OTel Family of Models, an open-source initiative to build industry-standard AI models for the global telecommunications sector.
## Model Details
| Attribute | Value |
|---|---|
| Base Model | google/gemma-3-12b-it |
| Parameters | 12B |
| Training Method | Full parameter fine-tuning |
| Language | English |
| License | Apache 2.0 |
## Training Data
The model was trained on high-quality telecom-focused data curated by 100+ domain experts from organizations including AT&T, Microsoft, AMD, GSMA, RelationalAI, Essential AI, Purdue University, Khalifa University, University of Leeds, Yale University, The University of Texas at Dallas, NetoAI, and MantisNLP.
**Data Sources:**
- GSMA Permanent Reference Documents
- 3GPP Specifications
- O-RAN Documentation
- RFC Series
- Specifications covering eSIM, terminals, security, networks, roaming, and APIs
- Industry whitepapers and telecom academic papers
## Intended Use
The OTel model family is designed to power end-to-end Retrieval-Augmented Generation (RAG) pipelines for telecommunications. The three model types serve complementary roles:
- Embedding — Retrieve relevant chunks from telecom specifications, standards, and documentation.
- Reranker — Re-score and prioritize the retrieved chunks for relevance.
- LLM — Generate accurate responses grounded in the retrieved context.
Users can deploy the full pipeline or use individual models independently based on their needs.
Note: The LLMs include abstention training — if the model does not receive sufficient context, it will decline to answer rather than hallucinate. This means the models are optimized for context-grounded generation, not open-ended question answering.
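The retrieve → rerank → generate-with-abstention flow described above can be sketched end to end. The snippet below is a minimal illustration, not the OTel API: it stands in a bag-of-words cosine similarity for the embedding and reranker models and a canned response for the LLM, but it preserves the pipeline shape, including the abstention check on weak context.

```python
import math
from collections import Counter

# Placeholder embedding: bag-of-words term counts. In a real deployment the
# retrieve/rerank/generate stages would call the OTel embedding, reranker,
# and LLM models respectively; everything here is illustrative.
def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 3) -> list[str]:
    # Stage 1: pull the k most similar chunks from the corpus.
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

def rerank(query: str, chunks: list[str], k: int = 1) -> list[str]:
    # Stage 2: re-score the candidates; here we simply reuse cosine overlap,
    # where a real reranker would apply a cross-encoder.
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

ABSTAIN = "I don't have enough context to answer that."

def generate(query: str, context: list[str], threshold: float = 0.2) -> str:
    # Stage 3: abstention check. If the best context is only weakly related
    # to the query, decline rather than guess (mirroring the models'
    # context-grounded training).
    if not context or cosine(embed(query), embed(context[0])) < threshold:
        return ABSTAIN
    return f"Answer grounded in: {context[0]}"

corpus = [
    "3GPP TS 23.501 defines the 5G system architecture.",
    "GSMA eSIM specifications cover remote SIM provisioning.",
    "O-RAN splits the RAN into CU, DU, and RU components.",
]
query = "What does 3GPP TS 23.501 define?"
answer = generate(query, rerank(query, retrieve(query, corpus)))
print(answer)
```

Swapping each placeholder for the corresponding OTel embedding, reranker, and LLM model yields the full pipeline; the individual stages can also be used independently, as noted above.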
## Related Models

- Language Models
- Embedding Models
- Reranker Models

## Related Datasets
## Training Infrastructure
- Framework: ScalarLM (GPU-agnostic)
- Compute: AMD and NVIDIA GPUs
## Citation

```bibtex
@misc{otel2026,
  title={OTel: Open Telco AI Models},
  author={Tavakkoli, Farbod and Diamos, Gregory and Paulk, Roderic and Terrazas, Jorden},
  year={2026},
  url={https://huggingface.co/farbodtavakkoli}
}
```
## Contact

For technical questions, contact farbod.tavakkoli@att.com or farbodtavakoli@gmail.com.