# NOPE Edge Phi4 v1 - GGUF
Quantized version of kleverboots/nope-edge-phi4-v1 for llama.cpp inference.
## Files

| File | Quantization | Size |
|---|---|---|
| model-Q4_K_M.gguf | Q4_K_M | 2.32 GB |
## Usage with llama.cpp

```bash
# Download
huggingface-cli download kleverboots/nope-edge-phi4-v1-gguf model-Q4_K_M.gguf --local-dir .

# Run server
llama-server -m model-Q4_K_M.gguf -ngl 99 -c 4096

# Query
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "test"}]}'
```
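The same query can be issued from Python against the server's OpenAI-compatible endpoint. A minimal sketch using only the standard library; the URL and port assume the default `llama-server` settings shown above, and `build_chat_request`/`ask` are illustrative helper names, not part of llama.cpp:

```python
import json
import urllib.request

# Assumes llama-server is running locally on its default port.
SERVER_URL = "http://localhost:8080/v1/chat/completions"

def build_chat_request(user_text: str) -> dict:
    # Same payload shape as the curl example above.
    return {"messages": [{"role": "user", "content": user_text}]}

def ask(user_text: str) -> str:
    payload = json.dumps(build_chat_request(user_text)).encode("utf-8")
    req = urllib.request.Request(
        SERVER_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible response: content of the first choice's message.
    return body["choices"][0]["message"]["content"]
```

With the server running, `print(ask("test"))` prints the model's reply.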
## Source Model
- Repository: kleverboots/nope-edge-phi4-v1
- Base: microsoft/Phi-4-mini-instruct
- Purpose: Mental health crisis classification
## License
MIT (same as source model)