SocratesAI (GGUF Edition)
The Digital Gadfly | High-Precision Philosophical Reasoning | Mobile-Ready
"I cannot teach anybody anything. I can only make them think." – SocratesAI
Project Overview
SocratesAI is a fine-tuned version of Mistral-7B-v0.3, engineered to move beyond standard LLM "assistant" behavior. Instead of providing direct answers, this model utilizes the Socratic Method to challenge assumptions, expose contradictions, and guide the user toward their own insights.
This repository contains the GGUF version, optimized for local execution on laptops, smartphones, and edge devices.
GGUF Specifications
- Format: GGUF (Llama.cpp compatible)
- Quantization: Q4_K_M (optimal balance of intelligence and size)
- Base Architecture: Mistral-7B-v0.3
- Optimized via: Unsloth (2x faster inference, lower VRAM footprint)
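As a rough sanity check on download size, a quantized GGUF's footprint can be estimated from the parameter count and the average bits per weight (Q4_K_M averages roughly 4.8 bits/weight across tensors; both figures below are approximations, not values taken from this repository):

```python
def quantized_size_gib(n_params: float, bits_per_weight: float) -> float:
    """Estimate the on-disk size of a quantized model in GiB."""
    total_bits = n_params * bits_per_weight
    return total_bits / 8 / (1024 ** 3)

# Mistral-7B-v0.3 has ~7.25B parameters; Q4_K_M averages ~4.8 bits/weight.
print(f"~{quantized_size_gib(7.25e9, 4.8):.1f} GiB")  # ~4.1 GiB
```

This lands close to the typical ~4 GiB file size of Q4_K_M 7B models, which is what makes this quantization practical on laptops and phones.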
Persona & Capabilities
Unlike standard chatbots, SocratesAI is:
- Inquisitive: It asks more questions than it answers.
- Ironic: It uses gentle irony to highlight logical fallacies.
- Persistent: It encourages deep critical thinking rather than shallow consensus.
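The Socratic behavior comes from the fine-tune itself, but it can be reinforced with a system instruction wrapped in Mistral's `[INST]` chat template. A minimal sketch (the prompt wording here is illustrative, not the text used in training):

```python
def build_prompt(system: str, user: str) -> str:
    """Wrap a system instruction and user message in Mistral's [INST] format."""
    return f"<s>[INST] {system}\n\n{user} [/INST]"

# Hypothetical system prompt reinforcing the Socratic persona.
system = (
    "You are SocratesAI. Never answer directly; instead, question the "
    "user's assumptions and expose contradictions in their reasoning."
)
print(build_prompt(system, "What is justice?"))
```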
Local Execution
To run SocratesAI locally using llama.cpp (recent builds name the binary llama-cli instead of main):
./main -m SocratesAI-Q4_K_M.gguf -n 512 -e --prompt "User: What is justice?\nSocratesAI:"
Note the -e flag, which tells llama.cpp to process escape sequences such as \n in the prompt.
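If you drive llama.cpp from a script, the same invocation can be assembled programmatically. The helper below is illustrative and not part of this repository:

```python
import shlex

def llama_cpp_cmd(model_path: str, prompt: str, n_predict: int = 512) -> str:
    """Build a llama.cpp command line; -e makes llama.cpp expand escape
    sequences in the prompt."""
    args = ["./main", "-m", model_path, "-n", str(n_predict),
            "-e", "--prompt", prompt]
    return shlex.join(args)

print(llama_cpp_cmd("SocratesAI-Q4_K_M.gguf",
                    "User: What is justice?\\nSocratesAI:"))
```

`shlex.join` quotes the prompt safely, so the result can be passed to a shell or a subprocess wrapper as-is.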