---
license: mit
datasets:
- wikimedia/wikipedia
- bookcorpus/bookcorpus
- SetFit/mnli
- sentence-transformers/all-nli
language:
- en
new_version: v1.1
base_model:
- google-bert/bert-base-uncased
pipeline_tag: text-classification
tags:
- BERT
- MNLI
- NLI
- transformer
- pre-training
- nlp
- tiny-bert
- edge-ai
- transformers
- low-resource
- micro-nlp
- quantized
- iot
- wearable-ai
- offline-assistant
- intent-detection
- real-time
- smart-home
- embedded-systems
- command-classification
- toy-robotics
- voice-ai
- eco-ai
- english
- lightweight
- mobile-nlp
metrics:
- accuracy
- f1
- inference
- recall
library_name: transformers
---
![Banner](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWsG0Nmwt7QDnCpZuNrWGRaDGURIV9QWifhhaDbBDaCb0wPEeGQidUl-jgE-GC21QDa-3WXgpM6y9OTWjvhnpho9nDmDNf3MiHqhs-sfhwn-Rphj3FtASbbQMxyPx9agHSib-GPj18nAxkYonB6hOqCDAj0zGis2qICirmYI8waqxTo7xNtZ6Ju3yLQM8/s1920/bert-%20lite.png)
# 🌟 bert-lite: A Lightweight BERT for Efficient NLP 🌟
## 🚀 Overview
Meet **bert-lite**, a streamlined BERT built for efficient NLP! 🎉 It features a compact architecture tailored for tasks like **MNLI** and **NLI**, and it excels in low-resource environments. With its lightweight footprint, `bert-lite` is a good fit for edge devices, IoT applications, and real-time NLP. 🌍
---
## 🌟 Why bert-lite? The Lightweight Edge
- 🔍 **Compact Power**: Optimized for speed and size
- ⚡ **Fast Inference**: Blazing quick on constrained hardware
- 💾 **Small Footprint**: Minimal storage demands
- 🌱 **Eco-Friendly**: Low energy consumption
- 🎯 **Versatile**: IoT, wearables, smart homes, and more!
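Since the card tags mention quantization and edge deployment, here is a minimal sketch of how a model like this could be shrunk further with PyTorch dynamic quantization. The two-layer network below is a stand-in; for the real thing you would load `AutoModel.from_pretrained("boltuix/bert-lite")` instead (the exact savings depend on the model's architecture).

```python
import torch
import torch.nn as nn

# Stand-in network; swap in the actual model for a real deployment:
#   from transformers import AutoModel
#   net = AutoModel.from_pretrained("boltuix/bert-lite")
net = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 8))

# Dynamic quantization stores Linear weights as int8,
# cutting their storage roughly 4x versus float32
qnet = torch.quantization.quantize_dynamic(net, {nn.Linear}, dtype=torch.qint8)

# The quantized model runs the same forward pass
out = qnet(torch.randn(1, 128))
print(out.shape)  # torch.Size([1, 8])
```

Dynamic quantization needs no calibration data, which makes it a convenient first step before heavier options like static quantization or distillation.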
---
## 🧠 Model Details
| Property | Value |
|-------------------|------------------------------------|
| 🧱 Layers | Custom lightweight design |
| 🧠 Hidden Size | Optimized for efficiency |
| 👁️ Attention Heads | Minimal yet effective |
| ⚙️ Parameters | Ultra-low parameter count |
| 💽 Size | Quantized for minimal storage |
| 🌐 Base Model | google-bert/bert-base-uncased |
| 🆙 Version | v1.1 (April 04, 2025) |
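To check the concrete parameter count for yourself, a small helper like the one below works on any PyTorch module. The sanity check uses a tiny stand-in layer; the commented lines show how it could be applied to this model (which requires downloading the weights).

```python
import torch.nn as nn

def count_parameters(model: nn.Module) -> int:
    """Total number of trainable parameters in a module."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# With the card's model (requires network access to download weights):
#   from transformers import AutoModel
#   model = AutoModel.from_pretrained("boltuix/bert-lite")
#   print(f"{count_parameters(model):,} parameters")

# Sanity check on a tiny stand-in: 4*2 weights + 2 biases = 10
tiny = nn.Linear(4, 2)
print(count_parameters(tiny))  # 10
```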
---
## 📜 License
MIT License: free to use, modify, and share.
## 🔤 Usage Example – Masked Language Modeling (MLM)
```python
from transformers import pipeline

# 📢 Start demo
print("\n🔤 Masked Language Model (MLM) Demo")

# 🧠 Load the masked language model, e.g. boltuix/bert-lite
mlm_pipeline = pipeline("fill-mask", model="boltuix/bert-lite")

# ✍️ Masked sentences
masked_sentences = [
    "The robot can [MASK] the room in minutes.",
    "He decided to [MASK] the project early.",
    "This device is [MASK] for small tasks.",
    "The weather will [MASK] by tomorrow.",
    "She loves to [MASK] in the garden.",
    "Please [MASK] the door before leaving.",
]

# 🤖 Predict missing words and show the top 3 candidates
for sentence in masked_sentences:
    print(f"\nInput: {sentence}")
    predictions = mlm_pipeline(sentence)
    for pred in predictions[:3]:
        print(f"✨ → {pred['sequence']} (score: {pred['score']:.4f})")
```
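Beyond fill-mask, a compact encoder like this is often used to produce sentence embeddings for on-device intent detection (one of the card's tags). A common approach is mean pooling over the token embeddings, masking out padding. The helper below is a sketch of that pooling step; the commented lines show how it could be combined with `AutoModel` and `AutoTokenizer` (names assumed from the standard `transformers` API, and downloading the weights is required).

```python
import torch

def mean_pool(hidden: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings over the sequence, ignoring padding."""
    mask = mask.unsqueeze(-1).float()           # (batch, seq, 1)
    return (hidden * mask).sum(1) / mask.sum(1).clamp(min=1e-9)

# With the real model (requires network access):
#   from transformers import AutoModel, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("boltuix/bert-lite")
#   model = AutoModel.from_pretrained("boltuix/bert-lite")
#   enc = tok(["turn on the lights"], return_tensors="pt", padding=True)
#   emb = mean_pool(model(**enc).last_hidden_state, enc["attention_mask"])

# Sanity check: the padded (masked-out) position must not affect the result
hidden = torch.tensor([[[1.0, 1.0], [3.0, 3.0], [99.0, 99.0]]])
mask = torch.tensor([[1, 1, 0]])
print(mean_pool(hidden, mask))  # tensor([[2., 2.]])
```

The pooled vectors can then feed a small classifier (or a cosine-similarity lookup against intent prototypes), keeping the whole pipeline light enough for embedded targets.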
---
## 🔤 Sample Output – Masked Language Modeling (MLM)
```text
Input: The robot can [MASK] the room in minutes.
✨ → The robot can clean the room in minutes. (score: 0.3124)
✨ → The robot can scan the room in minutes. (score: 0.1547)
✨ → The robot can paint the room in minutes. (score: 0.0983)

Input: He decided to [MASK] the project early.
✨ → He decided to finish the project early. (score: 0.3876)
✨ → He decided to start the project early. (score: 0.2109)
✨ → He decided to abandon the project early. (score: 0.0765)

Input: This device is [MASK] for small tasks.
✨ → This device is perfect for small tasks. (score: 0.2458)
✨ → This device is great for small tasks. (score: 0.1894)
✨ → This device is useful for small tasks. (score: 0.1321)

Input: The weather will [MASK] by tomorrow.
✨ → The weather will improve by tomorrow. (score: 0.2987)
✨ → The weather will change by tomorrow. (score: 0.1765)
✨ → The weather will clear by tomorrow. (score: 0.1034)

Input: She loves to [MASK] in the garden.
✨ → She loves to work in the garden. (score: 0.3542)
✨ → She loves to play in the garden. (score: 0.1986)
✨ → She loves to relax in the garden. (score: 0.0879)

Input: Please [MASK] the door before leaving.
✨ → Please close the door before leaving. (score: 0.4673)
✨ → Please lock the door before leaving. (score: 0.3215)
✨ → Please open the door before leaving. (score: 0.0652)
```