This is a GGUF model file of gemma-3-12b-it, developed by Google.

It is well suited for local deployment on PCs, laptops, and mobile devices.
gemma-3-12b-it-q4_0.gguf is the quantization-aware trained (QAT) checkpoint of Gemma 3: it needs roughly 3x less VRAM while retaining almost the same quality as the full-precision model. Recommended.
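The "3x less VRAM" figure follows from simple arithmetic on the weight storage. A minimal sketch, assuming 16 bits per parameter for the bf16 checkpoint and roughly 4.5 bits per parameter for q4_0 (4-bit weights plus per-block scales); these byte counts cover weights only, so KV cache and activations add more on top:

```python
def model_weight_gb(n_params_billion, bits_per_param):
    # Weight storage only; KV cache and activations need extra memory.
    return n_params_billion * 1e9 * bits_per_param / 8 / 1e9

bf16 = model_weight_gb(12, 16)   # full-precision checkpoint
q4_0 = model_weight_gb(12, 4.5)  # q4_0 stores ~4.5 bits/param incl. block scales
print(f"bf16: {bf16:.1f} GB, q4_0: {q4_0:.2f} GB, ratio: {bf16/q4_0:.1f}x")
# → bf16: 24.0 GB, q4_0: 6.75 GB, ratio: 3.6x
```

So the 12B model drops from roughly 24 GB of weights to under 7 GB, which is what makes it practical on consumer laptops.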

Useful local AI tools for intelligent document assistance:

MyDocs is a desktop app for Windows and macOS, aimed especially at professionals who demand absolute privacy: processing is 100% local, ensuring full document sovereignty with zero cloud dependency.

If you use Zotero to manage and read your PDFs, PapersGPT is a free plugin that lets you chat with PDFs effectively using your local gemma-3-12b-it. You can also download the ChatPDFLocal macOS app from here, load one PDF or a batch of them, and quickly try out the model through chat-based reading.
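If you drive the model directly (for example through llama.cpp or its bindings) rather than through these apps, prompts need to follow Gemma's turn-based chat format. A minimal sketch, assuming the standard Gemma template with `<start_of_turn>` / `<end_of_turn>` markers and the roles `user` and `model`:

```python
def gemma_prompt(messages):
    """Render a chat as a Gemma-style turn-based prompt string.

    messages: list of {"role": "user" | "model", "content": str}
    """
    parts = []
    for m in messages:
        parts.append(f"<start_of_turn>{m['role']}\n{m['content']}<end_of_turn>\n")
    # Open a final model turn to cue the model to generate its reply.
    parts.append("<start_of_turn>model\n")
    return "".join(parts)

print(gemma_prompt([{"role": "user", "content": "Summarize this PDF section."}]))
```

Most runtimes (llama.cpp, Ollama) apply this template automatically from the GGUF metadata, so hand-building it is only needed for raw completion APIs.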

Downloads last month: 1,070
Model format: GGUF
Model size: 12B params
Architecture: gemma3
Quantizations: 4-bit, 8-bit