Update README.md
---
license: mit
datasets:
- wikimedia/wikipedia
- bookcorpus/bookcorpus
language:
- en
new_version: v1.1
base_model:
- google-bert/bert-base-uncased
pipeline_tag: text-classification
tags:
- BERT
- MNLI
- NLI
- transformer
- pre-training
- nlp
- tiny-bert
- edge-ai
- transformers
- low-resource
- micro-nlp
- quantized
- iot
- wearable-ai
- offline-assistant
- intent-detection
- real-time
- smart-home
- embedded-systems
- command-classification
- toy-robotics
- voice-ai
- eco-ai
- english
- lightweight
- mobile-nlp
metrics:
- accuracy
- f1
- inference
- recall
library_name: transformers
---

# bert-lite: A Lightweight BERT for Efficient NLP

## Overview
Meet **bert-lite**, a streamlined BERT variant built for efficient NLP. Its compact architecture is tailored for tasks like **MNLI** and **NLI**, and it is designed to hold up in low-resource environments. With its lightweight footprint, `bert-lite` is a good fit for edge devices, IoT applications, and real-time NLP.

---

## Why bert-lite? The Lightweight Edge
- **Compact Power**: Optimized for speed and size
- **Fast Inference**: Quick on constrained hardware (see the timing sketch after this list)
- **Small Footprint**: Minimal storage demands
- **Eco-Friendly**: Low energy consumption
- **Versatile**: IoT, wearables, smart homes, and more
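
The fast-inference claim is easy to sanity-check on your own hardware. Below is a minimal timing sketch using the `transformers` pipeline on CPU; `bert-base-uncased` is a stand-in checkpoint (the card does not state the bert-lite repo id), and the 20-iteration average is an arbitrary choice.

```python
import time
from transformers import pipeline

# Stand-in checkpoint id; substitute the actual bert-lite repo id.
mlm = pipeline("fill-mask", model="bert-base-uncased", device=-1)  # device=-1 forces CPU

sentence = "Please [MASK] the door before leaving."
mlm(sentence)  # warm-up call so model/tokenizer setup is excluded from timing

runs = 20
start = time.perf_counter()
for _ in range(runs):
    mlm(sentence)
mean_ms = (time.perf_counter() - start) / runs * 1000
print(f"mean CPU latency: {mean_ms:.1f} ms per sentence")
```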

---

## Model Details

| Property        | Value                          |
|-----------------|--------------------------------|
| Layers          | Custom lightweight design      |
| Hidden Size     | Optimized for efficiency       |
| Attention Heads | Minimal yet effective          |
| Parameters      | Ultra-low parameter count      |
| Size            | Quantized for minimal storage  |
| Base Model      | google-bert/bert-base-uncased  |
| Version         | v1.1 (April 04, 2025)          |

---
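
The Size row says the weights are quantized, though the card does not say how. As an illustration, here is a minimal sketch of PyTorch dynamic quantization, a common way to shrink a BERT-style checkpoint for CPU inference; the checkpoint id is again a stand-in, and this is a generic recipe rather than the procedure used to produce bert-lite.

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Stand-in checkpoint; substitute the actual bert-lite repo id.
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Dynamic quantization: nn.Linear weights are stored as int8 and
# activations are quantized on the fly at inference time (CPU only).
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

inputs = tokenizer("Please [MASK] the door.", return_tensors="pt")
with torch.no_grad():
    logits = quantized(**inputs).logits
print(logits.shape)  # (1, sequence_length, vocab_size)
```

Dynamic quantization only converts the linear-layer weights to int8, so it cuts storage roughly 4x for those layers while leaving embeddings in float32.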

## Usage Example: Masked Language Modeling (MLM)

```python
from transformers import pipeline

print("\nMasked Language Model (MLM) Demo")

# Load a fill-mask pipeline; substitute the bert-lite checkpoint id
# to exercise this model rather than the base checkpoint shown here.
mlm_pipeline = pipeline("fill-mask", model="bert-base-uncased")

# Sentences with a [MASK] token to complete
masked_sentences = [
    "The robot can [MASK] the room in minutes.",
    "He decided to [MASK] the project early.",
    "This device is [MASK] for small tasks.",
    "The weather will [MASK] by tomorrow.",
    "She loves to [MASK] in the garden.",
    "Please [MASK] the door before leaving.",
]

# Predict the missing word and show the top three candidates
for sentence in masked_sentences:
    print(f"\nInput: {sentence}")
    predictions = mlm_pipeline(sentence)
    for pred in predictions[:3]:
        print(f"  -> {pred['sequence']} (score: {pred['score']:.4f})")
```
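
The frontmatter sets `pipeline_tag: text-classification` with MNLI/NLI tags, even though the demo above is fill-mask. Here is a minimal NLI sketch under the assumption of an MNLI-fine-tuned checkpoint; the id below is a stand-in, and with a plain `bert-base-uncased` checkpoint the classification head is freshly initialized, so the printed scores only become meaningful once an MNLI-fine-tuned bert-lite checkpoint is substituted.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Stand-in id; substitute an MNLI-fine-tuned bert-lite checkpoint.
name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=3)

# MNLI input is a premise/hypothesis sentence pair.
premise = "A man is playing a guitar on stage."
hypothesis = "A person is making music."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")

with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)[0]

# Label order is checkpoint-specific; read it from the config.
for i, p in enumerate(probs):
    print(model.config.id2label[i], f"{p:.3f}")
```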