---
license: apache-2.0
base_model:
  - mistralai/Mistral-Small-3.1-24B-Instruct-2503
  - RedHatAI/Mistral-Small-3.1-24B-Instruct-2503-FP8-dynamic
tags:
  - flux2
  - text-encoder
  - fp8
  - mistral
---

# FLUX.2 Text Encoder (FP8)

A combined repository of the text encoding components for FLUX.2, with FP8 quantization reducing VRAM requirements from ~48 GB to ~24 GB.

## Components

## Usage

```python
from transformers import AutoProcessor, Mistral3ForConditionalGeneration

model = Mistral3ForConditionalGeneration.from_pretrained(
    "TensorTemplar/flux2-text-encoder-fp8",
    local_files_only=True,  # set to False to download from the Hub
)
processor = AutoProcessor.from_pretrained(
    "TensorTemplar/flux2-text-encoder-fp8",
    use_fast=False,
)
```

## Purpose

This repo exists to simplify FLUX.2 deployment by combining all necessary text encoder components into a single download. The encoder is used to extract intermediate hidden states (layers 10/20/30) that condition FLUX.2 image generation.
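As a rough sketch of the layer-extraction step: with `output_hidden_states=True`, a `transformers` forward pass returns one hidden-state tensor per layer, from which the three conditioning layers can be selected. The snippet below illustrates the selection on dummy tensors (the layer count and dimensions are placeholders, not the real model's); with the model loaded as above you would instead take `outputs.hidden_states` from `model(**inputs, output_hidden_states=True)`.

```python
import torch

# Dummy stand-in for outputs.hidden_states: a tuple with one entry for the
# input embeddings plus one per transformer layer (sizes are illustrative).
num_layers, batch, seq, dim = 40, 1, 16, 64
hidden_states = tuple(torch.randn(batch, seq, dim) for _ in range(num_layers + 1))

# Select the intermediate layers used for conditioning and concatenate them
# along the feature dimension (one plausible way to combine them).
selected = torch.cat([hidden_states[i] for i in (10, 20, 30)], dim=-1)
print(selected.shape)  # torch.Size([1, 16, 192])
```

How the three layers are actually combined (concatenated, stacked, or passed separately) is determined by the FLUX.2 pipeline; the example only shows how to index them out of the forward-pass output.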

## Attribution