# About Me - Akicou

Hi, I'm **Akicou**, and I specialize in sharing **quantized Large Language Models (LLMs)** in **GGUF format**. These models are processed with my own service, **[GGUFORGE](https://gguforge.com)**, which converts raw models into GGUF format for more efficient use.

### ⚠️ Important Notes:

- I **only verify** model outputs **after explicitly testing** them. If I haven't stated that a model works, I **cannot guarantee** its functionality in raw `safetensors` or any other format.
- Models are **deleted within 5 days** of upload unless I consider them worth keeping.
- My current goal is to **reduce the size of large SOTA models** while preserving their quality and avoiding garbage output. This is still a learning process, and I'm constantly experimenting with new methods to improve results.

---

## 🧰 About GGUFORGE (My GGUF Conversion Service)

**[GGUFORGE](https://gguforge.com)** is a self-hosted platform where users can:

- Log in via **Hugging Face OAuth**
- Request **model-to-GGUF conversions**

### 📌 Important Information:

- **Speed**: I cannot guarantee fast processing due to self-hosting constraints, but I aim to handle requests as soon as possible.
- **Model quality**: If the GGUF output performs poorly, it is most likely due to the original model's training or fine-tuning, **not the GGUF conversion process**.
- **Request approval**: I may deny requests if similar GGUFs are already widely available.
- **Uptime & data**: I cannot guarantee 100% uptime. The database may reset as I experiment with different hosting providers; however, if your request is lost, I will restore it manually.
- **Self-hosting**: You're welcome to host GGUFORGE yourself using the source code: [https://github.com/Akicuo/automaticConversion](https://github.com/Akicuo/automaticConversion)

---

Thank you for visiting, and I hope you find the models and tools I share useful!