---
license: apache-2.0
tags:
- ai
- blogging
- content-generation
- gemma
- stable-diffusion
- gguf
- blogen
---

# ๐Ÿ“ Blogen - Community Edition Models

This repository contains the quantized AI models used by **Blogen Community Edition**, a self-hosted, privacy-focused AI blogging assistant.

These models are optimized for local CPU inference in the [GGUF](https://github.com/ggerganov/ggml) format.

## 📦 Included Models

| Model Type | Model Name | Filename | Size | Description |
| :--- | :--- | :--- | :--- | :--- |
| **LLM** | **Google Gemma 3 12B IT** | `gemma-3-12b-it-q4_0.gguf` | ~4.7 GB | Instruction-tuned model for generating blog posts, titles, and SEO metadata. Quantized to 4-bit (Q4_0). |
| **Image Gen** | **Stable Diffusion v1.5** | `stable-diffusion-v1-5-pruned-emaonly-Q4_1.gguf` | ~2.0 GB | Text-to-Image model for generating blog cover images. Quantized to Q4_1 for `stable-diffusion.cpp`. |

## ✨ New Capabilities (v1.1)

These models power the latest version of Blogen, enabling:

*   **๐ŸŒ Multilingual Blogging**: Native support for generating content in Spanish, French, German, and 50+ languages via Gemma 3 instructions.
*   **๐ŸŽจ High-Fidelity Images**: Optimized Stable Diffusion pipeline with 30-step generation for clearer, artifact-free cover images.
*   **๐Ÿ›ก๏ธ Enterprise Grade**: Ready for secure, air-gapped deployments with Ed25519 license verification.

## 🚀 Usage

These models are designed to be automatically downloaded by the **Blogen** Docker container upon startup.

### Manual Download & Run
If you prefer to download them manually (e.g., to save bandwidth on re-deployments):

1. **Download the files** to a local folder (e.g., `./models`).
2. **Run Blogen Community Edition**:
   ```bash
   docker run -d \
     -p 3000:3000 \
     -v $(pwd)/models:/app/models \
     -v $(pwd)/data:/app/data \
     ghcr.io/org-runink/blogen/server:free
   ```
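
When downloading manually, a quick integrity check can catch truncated or partial files before the container tries to load them. Every valid GGUF file begins with the 4-byte magic `GGUF`, so a one-line header check suffices (the `check_gguf` helper below is an illustrative sketch, not part of Blogen's tooling):

```shell
# Sanity-check downloaded model files: a valid GGUF file starts with
# the 4-byte magic "GGUF"; anything else indicates a broken download.
check_gguf() {
  if [ "$(head -c 4 "$1" 2>/dev/null)" = "GGUF" ]; then
    echo "OK: $1"
  else
    echo "BAD: $1 (not a GGUF file, re-download it)"
  fi
}

check_gguf ./models/gemma-3-12b-it-q4_0.gguf
check_gguf ./models/stable-diffusion-v1-5-pruned-emaonly-Q4_1.gguf
```

This only validates the header, not the full file contents; for a stronger check, compare the published checksums from this repository's file listing.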

## โš–๏ธ License & Acknowledgments

*   **Blogen CE**: Apache 2.0
*   **Gemma 3**: [Gemma Terms of Use](https://ai.google.dev/gemma/terms) (Google)
*   **Stable Diffusion**: [CreativeML Open RAIL-M](https://huggingface.co/runwayml/stable-diffusion-v1-5) (RunwayML / Stability AI)

*These files are quantized redistributions of the original models cited above.*