Majuli 3.1
By Tripplet AI (Tripplet Artificial General Intelligence Research Institute)
Majuli 3.1 is a 27B-parameter multimodal language model built on the Gemma 3 architecture and optimized for creative writing, roleplay, and general-purpose instruction following.
Model Details
- Parameters: 28.8B
- Architecture: Gemma 3 (Gemma3ForConditionalGeneration)
- Context Length: 131,072 tokens
- Hidden Size: 5376
- Layers: 62
- Attention Heads: 32 (16 KV heads)
- Vision Encoder: SigLIP (896px, 27 layers)
- Languages: English, Russian
- Precision: bfloat16
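The 32-head / 16-KV-head layout listed above is grouped-query attention (GQA): each key/value head is shared by two query heads, halving the KV cache relative to full multi-head attention. A minimal sketch of the head-sharing mechanism, using toy sequence length and head dimension (not the model's real values):

```python
import torch

# Toy GQA: 32 query heads share 16 KV heads, so each KV head
# serves group = 32 // 16 = 2 query heads. seq and head_dim are
# illustrative toy sizes, not Majuli's actual dimensions.
n_q_heads, n_kv_heads, seq, head_dim = 32, 16, 4, 8
group = n_q_heads // n_kv_heads

q = torch.randn(n_q_heads, seq, head_dim)
k = torch.randn(n_kv_heads, seq, head_dim)
v = torch.randn(n_kv_heads, seq, head_dim)

# Expand each KV head to cover its group of query heads.
k_exp = k.repeat_interleave(group, dim=0)  # (32, seq, head_dim)
v_exp = v.repeat_interleave(group, dim=0)  # (32, seq, head_dim)

attn = torch.softmax(q @ k_exp.transpose(-2, -1) / head_dim**0.5, dim=-1)
out = attn @ v_exp
print(out.shape)  # one output per query head
```

Only the 16 distinct K/V projections are stored in the cache; the expansion to 32 heads happens at attention time.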
Key Features
- Long context support up to 128K tokens
- Multimodal capabilities (image + text)
- Hybrid attention with sliding window (1024) and full attention layers
- Optimized for creative and roleplay tasks
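The hybrid attention noted above interleaves layers whose causal mask is limited to the most recent 1,024 positions with layers that attend over the full context. A toy sketch of what a sliding-window causal mask looks like, with a window of 3 instead of 1,024 for readability:

```python
def sliding_window_mask(seq_len, window):
    # mask[i][j] is True when position i may attend to position j:
    # causal (j <= i) and within the last `window` positions.
    return [[(j <= i) and (i - j < window) for j in range(seq_len)]
            for i in range(seq_len)]

mask = sliding_window_mask(6, 3)
for row in mask:
    print("".join("x" if allowed else "." for allowed in row))
# Prints:
# x.....
# xx....
# xxx...
# .xxx..
# ..xxx.
# ...xxx
```

A full-attention layer is simply the same mask with `window >= seq_len`, i.e. an ordinary lower-triangular causal mask.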
Usage
```python
from transformers import AutoProcessor, AutoModelForImageTextToText

model = AutoModelForImageTextToText.from_pretrained("tripplet-research/majuli-3.1")
processor = AutoProcessor.from_pretrained("tripplet-research/majuli-3.1")

messages = [
    {"role": "user", "content": "Hello, tell me about yourself."}
]

# tokenize=True and return_dict=True are needed so the result can be
# unpacked into model.generate(**inputs).
inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
)
output = model.generate(**inputs, max_new_tokens=512)
print(processor.decode(output[0], skip_special_tokens=True))
```
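The usage example above is text-only. Since the model also accepts images, a multimodal prompt can be sketched with the Gemma 3 chat format, where `content` becomes a list of typed parts (the image URL below is a placeholder, not from this card):

```python
# Hypothetical multimodal message for an image-text-to-text model:
# "content" is a list of parts instead of a plain string.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/photo.jpg"},
            {"type": "text", "text": "Describe this image."},
        ],
    }
]
```

The resulting `messages` list is passed to `processor.apply_chat_template` exactly as in the text-only example; the processor downloads and preprocesses the image for the SigLIP vision encoder.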
About Tripplet AI
Tripplet Artificial General Intelligence Research Institute is dedicated to advancing the frontiers of artificial general intelligence through open research and model development.
License
Apache 2.0
Base Model
- OddTheGreat/Mars_27B_V.1