Tags: Text Classification, Transformers, Safetensors, English, bert, fill-mask, BERT, NeuroBERT, transformer, pre-training, nlp, tiny-bert, edge-ai, low-resource, micro-nlp, quantized, iot, wearable-ai, offline-assistant, intent-detection, real-time, smart-home, embedded-systems, command-classification, toy-robotics, voice-ai, eco-ai, english, lightweight, mobile-nlp, ner
Update README.md
README.md
CHANGED
@@ -46,7 +46,8 @@ library_name: transformers
 
 
 
-# 🧠 NeuroBERT-Mini —
+# 🧠 NeuroBERT-Mini — Fast BERT for Edge AI, IoT & On-Device NLP 🚀
+⚡ Built for low-latency, lightweight NLP tasks — perfect for smart assistants, microcontrollers, and embedded apps!
 
 [](https://opensource.org/licenses/MIT)
 [](#)
@@ -75,7 +76,7 @@ library_name: transformers
 
 ## Overview
 
-`NeuroBERT-Mini` is a **lightweight** NLP model derived from **google/bert-base-uncased**, optimized for **real-time inference** on **edge and IoT devices**. With a quantized size of **~35MB** and **~
+`NeuroBERT-Mini` is a **lightweight** NLP model derived from **google/bert-base-uncased**, optimized for **real-time inference** on **edge and IoT devices**. With a quantized size of **~35MB** and **~10M parameters**, it delivers efficient contextual language understanding for resource-constrained environments like mobile apps, wearables, microcontrollers, and smart home devices. Designed for **low-latency** and **offline operation**, it’s ideal for privacy-first applications with limited connectivity.
 
 - **Model Name**: NeuroBERT-Mini
 - **Size**: ~35MB (quantized)
@@ -394,8 +395,8 @@ To adapt NeuroBERT-Mini for custom IoT tasks (e.g., specific smart home commands
 
 | Model | Parameters | Size | Edge/IoT Focus | Tasks Supported |
 |-----------------|------------|--------|----------------|-------------------------|
-| NeuroBERT-Mini | ~
-| NeuroBERT-Tiny | ~
+| NeuroBERT-Mini | ~10M | ~35MB | High | MLM, NER, Classification |
+| NeuroBERT-Tiny | ~5M | ~15MB | High | MLM, NER, Classification |
 | DistilBERT | ~66M | ~200MB | Moderate | MLM, NER, Classification |
 | TinyBERT | ~14M | ~50MB | Moderate | MLM, Classification |
 
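The diff above quotes a "~35MB (quantized)" size for a ~10M-parameter model. As a rough illustration of where that kind of shrinkage comes from, here is a minimal sketch of post-training dynamic int8 quantization in PyTorch; the layers below are arbitrary toy stand-ins, not the actual NeuroBERT-Mini weights or its real quantization recipe.

```python
import io

import torch
from torch import nn

# Toy stand-in for a transformer's dense layers (NOT NeuroBERT-Mini itself).
model = nn.Sequential(nn.Linear(256, 256), nn.ReLU(), nn.Linear(256, 64))

# Post-training dynamic quantization: nn.Linear weights become int8,
# activations stay float and are quantized on the fly at inference time.
qmodel = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def serialized_bytes(m: nn.Module) -> int:
    """Size of the saved state_dict, a rough proxy for on-disk model size."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes

fp32_size = serialized_bytes(model)
int8_size = serialized_bytes(qmodel)
print(f"fp32: {fp32_size} bytes, int8: {int8_size} bytes")
```

Since each fp32 weight (4 bytes) is replaced by an int8 weight (1 byte) plus per-tensor scale/zero-point metadata, the quantized checkpoint lands near a quarter of the original size, which is the same mechanism behind the ~35MB figure in the README.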