## 🚀 Model Highlights

- **Joint Extraction:** Unified NER + RE reducing pipeline complexity.
- **Ontology-Adaptive:** Zero-shot adaptation to diverse domains (Astronomy, Music, Healthcare, etc.) via dynamic schema variables.
- **Local & Private:** Optimized for **local CPU-only inference** (via GGUF/Ollama: FinaPolat/phi4_adaptableIE_v2-gguf), ensuring data sovereignty without external API dependencies.
- **Instruction Aligned:** Fine-tuned to follow strict negative constraints, ensuring zero conversational filler in outputs.

## 🛠 Methodology
### 4. 📦 Deployment & Hardware Requirements

| Deployment Mode | Quantization | Hardware Requirement              | Target Latency |
|-----------------|--------------|-----------------------------------|----------------|
| Server-side     | BF16         | 1× NVIDIA A100 / RTX 4090 (24GB+) | Ultra-Low      |
| Local Consumer  | 4-bit GGUF   | 16GB RAM (Apple Silicon / PC CPU) | Moderate       |

For CPU-only local execution, refer to the GGUF version: FinaPolat/phi4_adaptableIE_v2-gguf
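The highlights above describe schema-driven extraction with strictly formatted output. As a minimal sketch of how the "dynamic schema variables" idea can be wired up client-side — note the prompt template, schema field names, and JSON layout below are illustrative assumptions, not the model's documented training format:

```python
import json

# Hypothetical prompt builder: the instruction wording and schema slots are
# assumptions for illustration; consult the model card for the exact template.
def build_extraction_prompt(text, entity_types, relation_types):
    """Compose a schema-conditioned prompt for joint NER + RE."""
    return (
        "Extract entities and relations from the text below.\n"
        f"Allowed entity types: {', '.join(entity_types)}\n"
        f"Allowed relation types: {', '.join(relation_types)}\n"
        "Respond with JSON only, no commentary.\n\n"
        f"Text: {text}"
    )

def parse_model_output(raw):
    """Parse a strict JSON reply (no conversational filler) into Python objects."""
    return json.loads(raw)

# Swapping the type lists is all that is needed to retarget a new domain.
prompt = build_extraction_prompt(
    "Kepler-452b orbits a G-type star.",
    entity_types=["Planet", "Star"],
    relation_types=["orbits"],
)

# A well-behaved reply from an instruction-aligned model parses directly:
reply = '{"entities": [{"text": "Kepler-452b", "type": "Planet"}], "relations": []}'
result = parse_model_output(reply)
```

Because the model is tuned to emit zero conversational filler, `json.loads` on the raw completion is the intended happy path; a production wrapper would still guard the parse with error handling.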