Commit 9823d64 · Parent: 2ff05e2 · add README.md

Files changed:
- .assets/lightning-logo2.svg (+3, -0)
- .assets/lightning.svg (+0, -0)
- .assets/preview.gif (+3, -0, Git LFS)
- README.md (+60, -0)
---
license: apache-2.0
language:
- en
base_model:
- Qwen/Qwen3-4B-Thinking-2507
pipeline_tag: text-generation
library_name: transformers
---
# Lightning-4b: Your Local Data Analysis Agent

[quelmap](https://github.com/quelmap-inc/quelmap)
[License: Apache 2.0](https://opensource.org/licenses/Apache-2.0)



## Overview

Lightning-4b is a language model designed and trained specifically for data analysis tasks on local devices. With just a laptop (fully tested on an M4 MacBook Air with 16GB of RAM), you can process data without ever sending it to a major LLM provider.
### What it can do

- Data visualization
- Table joins
- t-tests
- Unlimited rows, 30+ tables analyzed simultaneously
### What it cannot do

- Business reasoning or management decision-making advice
- Multi-turn analysis

To use this model, install [quelmap](https://github.com/quelmap-inc/quelmap) on your device.
Quelmap is an open-source data analysis assistant with all the essential features, such as data upload and a built-in Python sandbox.
For installation instructions, see the [Quick Start](https://github.com/quelmap-inc/quelmap).


### Performance

This model was trained specifically for use with [quelmap](https://github.com/quelmap-inc/quelmap).
It was evaluated on a sample database with 122 analysis queries, achieving performance that surpasses models with **50x more parameters**.

For details about the model and its training process, see the [Lightning-4b Details](https://quelmap.com/lightinig-4b) page.

<!-- benchmark results image -->

### Running the model on your machine

You can easily install Lightning-4b and quelmap by following the [Quick Start](https://github.com/quelmap-inc/quelmap).

Lightning-4b is available in multiple quantized versions to match your hardware.
It runs smoothly on laptops, and on higher-spec machines it can handle more tables (30+) and longer chat histories.

Example specs and model versions:

- Laptop (e.g. MacBook Air, 16GB): 4-bit quantization + 10,240-token context window
```
ollama pull hf.co/quelmap/Lightning-4b-GGUF-short-ctx:Q4_K_M
```
- Gaming laptop: 4-bit quantization + 40,960-token context window
```
ollama pull hf.co/quelmap/Lightning-4b-GGUF:Q4_K_M
```
- Powerful PC with GPU: no quantization (F16) + 40,960-token context window
```
ollama pull hf.co/quelmap/Lightning-4b-GGUF:F16
```
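The mapping above can be sketched as a tiny helper. This is a hypothetical function (not part of quelmap or this repo); the tag strings are the ones listed above, but the RAM threshold and the `has_gpu` flag are illustrative assumptions, not official hardware requirements.

```python
def pick_lightning_tag(ram_gb: int, has_gpu: bool = False) -> str:
    """Suggest a Lightning-4b GGUF tag for a machine profile.

    Illustrative only: the 32GB cutoff is an assumption, not an
    official requirement from the model card.
    """
    if has_gpu:
        # Powerful PC with GPU: unquantized weights, 40,960-token context
        return "hf.co/quelmap/Lightning-4b-GGUF:F16"
    if ram_gb >= 32:
        # Gaming-laptop class: 4-bit quantization, 40,960-token context
        return "hf.co/quelmap/Lightning-4b-GGUF:Q4_K_M"
    # Laptop class (e.g. a 16GB MacBook Air): 4-bit, 10,240-token context
    return "hf.co/quelmap/Lightning-4b-GGUF-short-ctx:Q4_K_M"

print(pick_lightning_tag(16))  # the 16GB-laptop build (short-ctx Q4_K_M)
```

After pulling a tag, `ollama run <tag>` starts an interactive session with it, though in normal use quelmap drives the model for you.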

For more details, please refer to the [Lightning-4b Details](https://quelmap.com/lightinig-4b) page.