# NOPE Edge GGUF (4B)

GGUF quantized versions of `nopenet/nope-edge` for local inference with Ollama and llama.cpp.

**License:** NOPE Edge Community License v1.0 - free for research, academic, nonprofit, and evaluation use. Commercial production use requires a separate license.
## Quick Start with Ollama

```bash
# Download the GGUF and Modelfile
huggingface-cli download nopenet/nope-edge-GGUF nope-edge-q8_0.gguf Modelfile --local-dir .

# Create the Ollama model
ollama create nope-edge -f Modelfile

# Run inference
ollama run nope-edge "I can't take this anymore"
```
## Available Files

| File | Quantization | Size | Use Case |
|---|---|---|---|
| `nope-edge-q8_0.gguf` | Q8_0 | 4.0 GB | **Recommended** - best quality/size balance |
| `nope-edge-q4_k_m.gguf` | Q4_K_M | 2.3 GB | Constrained environments |
| `nope-edge-f16.gguf` | F16 | 7.5 GB | Maximum precision |
## Output Format

The model outputs XML with chain-of-thought reasoning.

Crisis detected:

```xml
<reflection>User expresses direct suicidal intent with timeline...</reflection>
<risks>
  <risk subject="self" type="suicide" severity="high" imminence="urgent"/>
</risks>
```

No crisis:

```xml
<reflection>Gaming slang, no genuine crisis indicators...</reflection>
<risks/>
```
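Because the model emits a `<reflection>` tag followed by a `<risks>` tag (two sibling elements, not a single document), the output needs a wrapping root before standard XML parsing. A minimal sketch using Python's `xml.etree` (the function name is illustrative, not part of any NOPE SDK):

```python
import xml.etree.ElementTree as ET

def parse_nope_output(raw: str) -> tuple[str, list[dict]]:
    """Parse NOPE Edge XML output into (reflection, risks).

    The model emits two sibling top-level tags, so we wrap them in a
    synthetic <output> root to make the string well-formed XML.
    """
    root = ET.fromstring(f"<output>{raw}</output>")
    reflection = root.findtext("reflection", default="").strip()
    # Each <risk/> is self-closing; its attributes carry the classification.
    risks = [dict(r.attrib) for r in root.iter("risk")]
    return reflection, risks

sample = (
    "<reflection>User expresses direct suicidal intent with timeline...</reflection>"
    '<risks><risk subject="self" type="suicide" severity="high" imminence="urgent"/></risks>'
)
reflection, risks = parse_nope_output(sample)
print(risks[0]["severity"])  # -> high
```

An empty `<risks/>` element simply yields an empty list, which makes the no-crisis case easy to branch on.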
## Risk Types

| Type | Description |
|---|---|
| `suicide` | Suicidal ideation, plans, or intent |
| `self_harm` | Non-suicidal self-injury |
| `self_neglect` | Eating disorders, medical neglect |
| `violence` | Threats toward others |
| `abuse` | Domestic/intimate partner violence |
| `sexual_violence` | Sexual assault, coercion |
| `exploitation` | Trafficking, grooming, sextortion |
| `stalking` | Persistent unwanted contact |
| `neglect` | Child or elder neglect |
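For downstream routing, a risk's attributes can be checked against this documented vocabulary. A minimal sketch; the escalation thresholds below are illustrative assumptions for a triage queue, not part of the model specification:

```python
# The documented risk-type vocabulary from the table above.
RISK_TYPES = {
    "suicide", "self_harm", "self_neglect", "violence", "abuse",
    "sexual_violence", "exploitation", "stalking", "neglect",
}

def needs_urgent_review(risk: dict) -> bool:
    """Decide whether a parsed <risk .../> attribute dict should jump
    the human-review queue. Thresholds are illustrative: tune per
    deployment, and keep a human in the loop either way."""
    if risk.get("type") not in RISK_TYPES:
        return True  # unknown type: fail safe and escalate
    return risk.get("severity") == "high" or risk.get("imminence") == "urgent"

print(needs_urgent_review(
    {"subject": "self", "type": "suicide", "severity": "high", "imminence": "urgent"}
))  # -> True
```

Failing safe on unrecognized types matters here: a typo'd or future risk type should surface to a reviewer rather than be silently dropped.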
## Hardware Requirements
| Model | Quant | RAM/VRAM | CPU Latency | GPU Latency |
|---|---|---|---|---|
| nope-edge (4B) | Q8_0 | ~5GB | ~2s | ~200ms |
| nope-edge (4B) | Q4_K_M | ~3GB | ~1.5s | ~150ms |
| nope-edge-mini (1.7B) | Q8_0 | ~2.5GB | ~1s | ~100ms |
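The memory footprints above translate into a simple selection rule. A hypothetical helper (name and structure are illustrative; the thresholds are the RAM/VRAM figures from the table):

```python
def pick_model(mem_gb: float) -> str:
    """Choose a model/quant that fits the available RAM/VRAM budget,
    using the approximate footprints from the hardware table."""
    if mem_gb >= 5:
        return "nope-edge Q8_0"       # ~5 GB, recommended quality/size balance
    if mem_gb >= 3:
        return "nope-edge Q4_K_M"     # ~3 GB, constrained environments
    if mem_gb >= 2.5:
        return "nope-edge-mini Q8_0"  # ~2.5 GB, smallest listed option
    raise ValueError("No listed quant fits in the given memory budget")

print(pick_model(8))  # -> nope-edge Q8_0
```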
## Model Variants
| Model | Parameters | Use Case |
|---|---|---|
| nope-edge | 4B | Maximum accuracy |
| nope-edge-mini | 1.7B | High-volume, cost-sensitive |
GGUF versions:

- `nope-edge-GGUF` (this repo)
- `nope-edge-mini-GGUF`
## Source Model

- **Repository:** `nopenet/nope-edge`
- **Base:** `Qwen/Qwen3-4B`
- **Purpose:** Mental health crisis classification
## Important

- **Not a medical device.** Outputs are probabilistic signals for triage, not clinical assessments.
- **False positives and negatives will occur.** Use for flagging, not autonomous decisions.
- **Human review required.** Never use the model as the sole basis for intervention decisions.
## About NOPE
NOPE provides safety infrastructure for AI applications.
- Website: https://nope.net
- Documentation: https://docs.nope.net
- Commercial licensing: https://nope.net/edge