# AI Emergency Kit - GGUF Model

AI Emergency Kit is an intelligent crisis-response assistant: a fine-tuned Mistral-7B model, distributed in GGUF format for efficient local deployment.
## Model Details
- Base Model: unsloth/Mistral-7B-Instruct-v0.2
- Format: GGUF (Float16)
- File: model.gguf
- File Size: 13.49 GB
## Usage

### Using llama.cpp
```bash
# Download the model
git lfs install
git clone https://huggingface.co/ianktoo/crisis-agent-gguf

# Run inference (the clone places the weights in crisis-agent-gguf/)
./llama-cli -m crisis-agent-gguf/model.gguf -p "Your prompt here"
```
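The same invocation can be scripted. A minimal Python sketch that assembles the `llama-cli` command from the shell example above (it assumes a llama.cpp build in the current directory and the clone path shown):

```python
import subprocess

MODEL_PATH = "crisis-agent-gguf/model.gguf"  # path produced by the git clone above

def build_command(prompt: str, n_predict: int = 256) -> list:
    """Assemble the llama-cli invocation from the shell example."""
    return [
        "./llama-cli",
        "-m", MODEL_PATH,
        "-p", prompt,
        "-n", str(n_predict),  # cap the number of generated tokens
    ]

cmd = build_command("List three steps to take during a flood warning.")
print(cmd)
# To actually run it (requires a llama.cpp build):
# subprocess.run(cmd, check=True)
```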
### Using LM Studio
- Download the GGUF file from this repository
- Import into LM Studio
- Load and chat!
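Beyond the chat UI, LM Studio can expose the loaded model through its local, OpenAI-compatible server. A sketch of the request body you would send (the endpoint address and model name are assumptions; check your LM Studio server settings):

```python
import json

# Assumed default address of LM Studio's local server, if you enable it.
BASE_URL = "http://localhost:1234/v1/chat/completions"

payload = {
    "model": "crisis-agent-gguf",  # name as shown in LM Studio; assumption
    "messages": [
        {"role": "user", "content": "What should I pack in a go-bag?"}
    ],
    "temperature": 0.7,
}

body = json.dumps(payload)
print(body)
# Send with e.g. urllib.request or an OpenAI client once the server is running.
```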
### Using Ollama
```bash
# Create a Modelfile
cat > Modelfile << EOF
FROM model.gguf
PARAMETER temperature 0.7
PARAMETER top_p 0.9
EOF

# Create the model
ollama create crisis-agent -f Modelfile

# Run it
ollama run crisis-agent
```
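Once `ollama run crisis-agent` works, the same model is also reachable over Ollama's local REST API (by default on port 11434). This sketch only builds the request body; sending it requires a running Ollama daemon:

```python
import json

payload = {
    "model": "crisis-agent",  # the name given to `ollama create` above
    "prompt": "A wildfire is approaching my town. What should I do first?",
    "stream": False,  # return one complete JSON response instead of chunks
}
print(json.dumps(payload))
# POST this body to http://localhost:11434/api/generate
```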
## About AI Emergency Kit
AI Emergency Kit is designed to be your reliable AI companion during crisis situations. It provides structured, JSON-formatted responses with actionable guidance, resource recommendations, and step-by-step instructions to help navigate emergency scenarios.
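To illustrate the structured output described above, here is a sketch that parses one such response. The field names (`situation`, `actions`, and so on) are an illustration, not the model's guaranteed schema:

```python
import json

# Hypothetical example of a JSON-formatted model response.
raw = """{
  "situation": "flood warning",
  "severity": "high",
  "actions": [
    "Move to higher ground",
    "Avoid walking or driving through flood water",
    "Monitor local emergency broadcasts"
  ],
  "resources": ["local emergency services", "NOAA weather radio"]
}"""

response = json.loads(raw)
for i, step in enumerate(response["actions"], 1):
    print(f"{i}. {step}")
```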
## Limitations
- The model is trained on synthetic crisis scenarios
- Responses should be validated by human experts
- Not intended for real-time emergency response without human oversight