Form Generator - Gemma 3 270M (Android-Ready)
A fine-tuned Gemma 3 270M model that generates form definitions in JSON format.
Optimized for Android deployment with MediaPipe.
Model Description
This model was trained WITHOUT quantization, for compatibility with TFLite conversion and Android deployment.
- Base Model: google/gemma-3-270m-it
- Training Method: LoRA (no quantization)
- Dataset: bhismaperkasa/form_dinamis (10,000 samples)
- Language: Bahasa Indonesia
- Task: Form definition generation (JSON output)
Key Features
- Android-ready: can be converted to TFLite
- Fully merged model: no PEFT adapter required for inference
- High quality: ~93.5% accuracy on the validation set
- Production-ready: tested for mobile deployment
Training Details
Hyperparameters
- Epochs: 3
- Batch Size: 2
- Learning Rate: 5e-5
- Max Length: 1024 tokens
- LoRA Rank: 16
- LoRA Alpha: 32
- NO Quantization (for Android compatibility); see the configuration sketch just below this list
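For reference, a minimal LoRA training setup consistent with these hyperparameters might look like the sketch below (the target modules, dropout, and output directory are assumptions, not values stated in this card):

```python
# Hedged sketch: a LoRA + Trainer configuration matching the hyperparameters above.
# target_modules, lora_dropout, and output_dir are assumptions, not published values.
from transformers import AutoModelForCausalLM, TrainingArguments
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("google/gemma-3-270m-it")

lora_config = LoraConfig(
    r=16,                 # LoRA rank (from this card)
    lora_alpha=32,        # LoRA alpha (from this card)
    lora_dropout=0.05,    # assumption
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumption
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)

training_args = TrainingArguments(
    output_dir="gemma-3-270m-form-generator",  # assumption
    num_train_epochs=3,              # from this card
    per_device_train_batch_size=2,   # from this card
    learning_rate=5e-5,              # from this card
    bf16=True,                       # bfloat16 training, no 4/8-bit quantization
)
```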
Performance
- Eval Loss: ~0.23
- Accuracy: ~93.5%
- Training Time: ~40 minutes (RTX 4090)
Usage
For Server/Desktop (Python)
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load model
model = AutoModelForCausalLM.from_pretrained(
    "bhismaperkasa/gemma-3-270m-form-generator-fp16",
    torch_dtype=torch.bfloat16,
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("bhismaperkasa/gemma-3-270m-form-generator-fp16")

# Generate form
messages = [
    {"role": "system", "content": "You are a helpful assistant that generates form definitions in JSON format."},
    {"role": "user", "content": "buatkan form login dengan email dan password"}
]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(text, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=512,
    temperature=0.7,
    do_sample=True
)

# Decode only the newly generated tokens so the prompt is not echoed back
generated = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:],
    skip_special_tokens=True
).strip()
print(generated)
```
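Since the model is trained to answer with a JSON object, the decoded text can be parsed directly; the snippet below is a hedged follow-up to the example above (the fallback extraction assumes any extra text only surrounds a single `{...}` block):

```python
import json

# Continues from the example above: `generated` holds the decoded model output.
try:
    form_definition = json.loads(generated)
except json.JSONDecodeError:
    # Fallback: extract the first {...} block if extra text surrounds the JSON
    start, end = generated.find("{"), generated.rfind("}") + 1
    form_definition = json.loads(generated[start:end])

print(form_definition.get("title"))
```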
For Android (MediaPipe)
Step 1: Convert to TFLite
```bash
# Clone training repo
git clone https://huggingface.co/bhismaperkasa/gemma-3-270m-form-generator-fp16
cd gemma-3-270m-form-generator-fp16

# Convert (requires ai-edge-torch)
python convert_to_tflite.py --model_path ./
```
Step 2: Integrate into Android
See full Android integration guide in training repository.
Android Performance
| Device | Init Time | Inference Time | Memory |
|---|---|---|---|
| Flagship (2024) | 2-3s | 1-2s | ~200MB |
| Mid-range (2023) | 3-5s | 2-4s | ~200MB |
| Budget (2022) | 5-8s | 4-8s | ~250MB |
Example Outputs
Input
buatkan form pendaftaran event dengan nama, email, dan nomor telepon
Output
```json
{
  "id": "form_event_registration_001",
  "title": "Form Pendaftaran Event",
  "category": "registration",
  "formDefinition": {
    "sections": [
      {
        "sectionId": "section_1",
        "title": "Informasi Peserta",
        "fields": [
          {
            "fieldId": "nama_lengkap",
            "label": "Nama Lengkap",
            "fieldType": "TEXT",
            "required": true
          },
          {
            "fieldId": "email",
            "label": "Email",
            "fieldType": "EMAIL",
            "required": true
          },
          {
            "fieldId": "nomor_telepon",
            "label": "Nomor Telepon",
            "fieldType": "PHONE",
            "required": true
          }
        ]
      }
    ]
  }
}
```
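A client app can walk this structure to render the form; the sketch below continues from the parsing snippet in the Usage section and mirrors the key names shown in this example output (they are not a formally documented schema):

```python
# Iterate sections and fields of the generated form definition.
for section in form_definition["formDefinition"]["sections"]:
    print(f"Section: {section['title']}")
    for field in section["fields"]:
        status = "required" if field.get("required") else "optional"
        print(f"  - {field['label']} ({field['fieldType']}, {status})")
```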
Technical Notes
Why NO Quantization?
This model was trained without quantization (4-bit/8-bit) because:
- TFLite Compatibility: quantized models are difficult to convert to TFLite
- Merge Quality: merging a quantized LoRA adapter often corrupts the output weights
- Android Optimization: TFLite applies its own quantization (int8) during conversion
Model Size
- PyTorch (bfloat16): ~500MB
- TFLite (float32): ~250MB
- TFLite (int8): ~130MB (recommended for Android)
Dataset
Trained on bhismaperkasa/form_dinamis
- Size: 10,000 form examples
- Language: Bahasa Indonesia
- Format: Conversational (messages format)
- Split: 90% train / 10% validation
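A hedged sketch of loading this dataset with the `datasets` library (the card only states the 90/10 ratio, so the split below is recreated locally and the split names are assumptions):

```python
from datasets import load_dataset

# Load the form-generation dataset from the Hugging Face Hub
dataset = load_dataset("bhismaperkasa/form_dinamis")

# Assumption: a single "train" split is published; recreate the 90/10 split locally.
splits = dataset["train"].train_test_split(test_size=0.1, seed=42)
train_ds, eval_ds = splits["train"], splits["test"]

print(train_ds[0])  # each example uses the conversational "messages" format
```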
Training Repository
Full training code, conversion scripts, and Android integration guide:
- Training scripts (nohup support for RunPod)
- TFLite conversion
- Android integration examples
- Complete documentation
License
Apache 2.0 (following Gemma base model license)
Citation
```bibtex
@misc{form-generator-gemma-android,
  author       = {bhismaperkasa},
  title        = {Form Generator - Gemma 3 270M (Android-Ready)},
  year         = {2024},
  publisher    = {Hugging Face},
  howpublished = {\url{https://huggingface.co/bhismaperkasa/gemma-3-270m-form-generator-fp16}}
}
```
Support
For issues, questions, or contributions, please visit the training repository or open an issue.
Ready for production Android deployment!