---
license: other
license_name: nope-edge-community-license-v1.0
license_link: LICENSE.md
language:
  - en
tags:
  - safety
  - crisis-detection
  - text-classification
  - mental-health
  - gguf
  - llama.cpp
  - ollama
base_model: Qwen/Qwen3-4B
pipeline_tag: text-generation
extra_gated_heading: Access NOPE Edge GGUF
extra_gated_description: >-
  Free for research, academic, nonprofit, and evaluation use. Commercial
  production requires a separate license.
extra_gated_button_content: Agree and download
extra_gated_fields:
  I am using this for research, academic, nonprofit, personal, or evaluation purposes:
    type: checkbox
  I agree to the NOPE Edge Community License v1.0:
    type: checkbox
---

# NOPE Edge GGUF (4B)

GGUF quantized versions of `nopenet/nope-edge` for local inference with Ollama and llama.cpp.

**License:** NOPE Edge Community License v1.0. Free for research, academic, nonprofit, and evaluation use; commercial production requires a separate license.


## Quick Start with Ollama

```bash
# Download the GGUF and Modelfile
huggingface-cli download nopenet/nope-edge-GGUF nope-edge-q8_0.gguf Modelfile --local-dir .

# Create the Ollama model
ollama create nope-edge -f Modelfile

# Run inference
ollama run nope-edge "I can't take this anymore"
```
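Once the model is created, it can also be queried through Ollama's local REST API (served at `http://localhost:11434` by default). A minimal Python sketch, assuming the default port and the `nope-edge` model name created above:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(message: str) -> dict:
    # Non-streaming request so the full XML output arrives in one response.
    return {"model": "nope-edge", "prompt": message, "stream": False}

def classify(message: str) -> str:
    # Requires a running Ollama server with the model created as shown above.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_request(message)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```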

## Available Files

| File | Quantization | Size | Use Case |
|------|--------------|------|----------|
| `nope-edge-q8_0.gguf` | Q8_0 | 4.0 GB | **Recommended** - best quality/size balance |
| `nope-edge-q4_k_m.gguf` | Q4_K_M | 2.3 GB | Constrained environments |
| `nope-edge-f16.gguf` | F16 | 7.5 GB | Maximum precision |

## Output Format

The model outputs XML with chain-of-thought reasoning:

**Crisis detected:**

```xml
<reflection>User expresses direct suicidal intent with timeline...</reflection>
<risks>
  <risk subject="self" type="suicide" severity="high" imminence="urgent"/>
</risks>
```

**No crisis:**

```xml
<reflection>Gaming slang, no genuine crisis indicators...</reflection>
<risks/>
```
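Note that the output is a sequence of sibling elements rather than a single XML document, so a parser needs to wrap it in a synthetic root first. A minimal sketch using Python's standard `xml.etree.ElementTree` (the `parse_nope_output` helper is illustrative, not part of the model):

```python
import xml.etree.ElementTree as ET

def parse_nope_output(raw: str) -> list[dict]:
    # The model emits sibling elements, so wrap them in a root node
    # to make the string well-formed XML before parsing.
    root = ET.fromstring(f"<output>{raw}</output>")
    risks = root.find("risks")
    if risks is None:
        return []
    # Each <risk/> carries its signal entirely in attributes.
    return [dict(r.attrib) for r in risks.findall("risk")]

sample = (
    "<reflection>User expresses direct suicidal intent...</reflection>"
    '<risks><risk subject="self" type="suicide" severity="high" imminence="urgent"/></risks>'
)
print(parse_nope_output(sample))
```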

## Risk Types

| Type | Description |
|------|-------------|
| `suicide` | Suicidal ideation, plans, or intent |
| `self_harm` | Non-suicidal self-injury |
| `self_neglect` | Eating disorders, medical neglect |
| `violence` | Threats toward others |
| `abuse` | Domestic/intimate partner violence |
| `sexual_violence` | Sexual assault, coercion |
| `exploitation` | Trafficking, grooming, sextortion |
| `stalking` | Persistent unwanted contact |
| `neglect` | Child or elder neglect |
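Since this taxonomy is a closed set, downstream code may want to validate the `type` attribute of each parsed risk against it. A hypothetical check:

```python
# Closed set of risk types, from the table above.
RISK_TYPES = {
    "suicide", "self_harm", "self_neglect", "violence", "abuse",
    "sexual_violence", "exploitation", "stalking", "neglect",
}

def is_known_risk(attrs: dict) -> bool:
    # Reject any risk whose type falls outside the documented taxonomy,
    # e.g. if the model emits an unexpected label.
    return attrs.get("type") in RISK_TYPES
```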

## Hardware Requirements

| Model | Quant | RAM/VRAM | CPU Latency | GPU Latency |
|-------|-------|----------|-------------|-------------|
| nope-edge (4B) | Q8_0 | ~5 GB | ~2 s | ~200 ms |
| nope-edge (4B) | Q4_K_M | ~3 GB | ~1.5 s | ~150 ms |
| nope-edge-mini (1.7B) | Q8_0 | ~2.5 GB | ~1 s | ~100 ms |

## Model Variants

| Model | Parameters | Use Case |
|-------|------------|----------|
| nope-edge | 4B | Maximum accuracy |
| nope-edge-mini | 1.7B | High-volume, cost-sensitive |

GGUF versions:


## Source Model

- Repository: `nopenet/nope-edge`
- Base: `Qwen/Qwen3-4B`
- Purpose: Mental health crisis classification

## Important

- **Not a medical device.** Outputs are probabilistic signals for triage, not clinical assessments.
- **False positives and negatives will occur.** Use for flagging, not autonomous decisions.
- **Human review required.** Never use the model as the sole basis for intervention decisions.
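The caveats above imply that downstream code should only ever route messages to humans, never act on its own. An illustrative triage sketch (function name and priority labels are assumptions, not part of the model):

```python
def triage(risks: list[dict]) -> str:
    # Map parsed risk attributes to a review action. The result only
    # flags a message for human review; it never triggers an
    # autonomous intervention.
    if not risks:
        return "no_action"
    if any(r.get("imminence") == "urgent" for r in risks):
        return "priority_human_review"
    return "human_review"
```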

## About NOPE

NOPE provides safety infrastructure for AI applications.