---
license: apache-2.0
base_model: google/functiongemma-270m-it
tags:
- tool-calling
- smart-home
- function-calling
- distil-labs
language:
- en
pipeline_tag: text-generation
library_name: transformers
---
# Distil-Home-Assistant-FunctionGemma

A fine-tuned FunctionGemma model (Gemma3 architecture) for multi-turn intent classification and slot extraction in an on-device smart home controller. Trained via knowledge distillation from a 120B teacher model, it delivers 96.71% tool call accuracy while running on-device for private, low-latency smart home control.
For the safetensors version of the Qwen3-based model, see `distil-labs/distil-home-assistant-qwen3`.
## Results
| Model | Tool Call Accuracy | ROUGE |
|---|---|---|
| GPT-oss-120B (teacher) | 92.11% | 98.53% |
| FunctionGemma (base) | 38.82% | 74.32% |
| This model (tuned) | 96.71% | 99.32% |
The fine-tuned 270M model exceeds its 120B teacher on tool call accuracy, while the untuned base model fails on most calls. Fine-tuning is essential for reliable multi-turn smart home conversations.
## Quick Start

### Using Transformers
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("distil-labs/distil-home-assistant-functiongemma")
tokenizer = AutoTokenizer.from_pretrained("distil-labs/distil-home-assistant-functiongemma")

TOOLS = [
    {"type": "function", "function": {"name": "toggle_lights", "description": "Turn lights on or off in a specified room", "parameters": {"type": "object", "properties": {"room": {"type": "string", "enum": ["living_room", "bedroom", "kitchen", "bathroom", "office", "hallway"]}, "state": {"type": "string", "enum": ["on", "off"]}}, "required": [], "additionalProperties": False}}},
    {"type": "function", "function": {"name": "set_thermostat", "description": "Set the temperature for heating or cooling", "parameters": {"type": "object", "properties": {"temperature": {"type": "integer", "minimum": 60, "maximum": 80}, "mode": {"type": "string", "enum": ["heat", "cool", "auto"]}}, "required": [], "additionalProperties": False}}},
    {"type": "function", "function": {"name": "lock_door", "description": "Lock or unlock a door", "parameters": {"type": "object", "properties": {"door": {"type": "string", "enum": ["front", "back", "garage", "side"]}, "state": {"type": "string", "enum": ["lock", "unlock"]}}, "required": [], "additionalProperties": False}}},
    {"type": "function", "function": {"name": "get_device_status", "description": "Query the current state of a device or room", "parameters": {"type": "object", "properties": {"device_type": {"type": "string", "enum": ["lights", "thermostat", "door", "all"]}, "room": {"type": "string"}}, "required": [], "additionalProperties": False}}},
    {"type": "function", "function": {"name": "set_scene", "description": "Activate a predefined scene", "parameters": {"type": "object", "properties": {"scene": {"type": "string", "enum": ["movie_night", "bedtime", "morning", "away", "party"]}}, "required": [], "additionalProperties": False}}},
    {"type": "function", "function": {"name": "intent_unclear", "description": "Use when the user's intent cannot be determined", "parameters": {"type": "object", "properties": {"reason": {"type": "string", "enum": ["ambiguous", "off_topic", "incomplete", "unsupported_device"]}}, "required": [], "additionalProperties": False}}},
]

messages = [
    {"role": "system", "content": "You are a tool-calling model working on:\n<task_description>You are an on-device smart home controller. Given a natural language command from the user, call the appropriate smart home function. If the user does not specify a required value (e.g. which room or what temperature), omit that parameter from the function call. Maintain context across conversation turns to resolve pronouns and sequential commands.</task_description>\n\nRespond to the conversation history by generating an appropriate tool call that satisfies the user request. Generate only the tool call according to the provided tool schema, do not generate anything else. Always respond with a tool call.\n\n"},
    {"role": "user", "content": "Turn off the living room lights"},
]

text = tokenizer.apply_chat_template(
    messages, tools=TOOLS, tokenize=False, add_generation_prompt=True,
)
inputs = tokenizer(text, return_tensors="pt")
# Greedy decoding: transformers expects do_sample=False rather than temperature=0
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
# <tool_call>
# {"name": "toggle_lights", "arguments": {"room": "living_room", "state": "off"}}
# </tool_call>
```
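The model emits its JSON payload wrapped in `<tool_call>` tags, so downstream code needs to extract and parse it. A minimal helper sketch (`parse_tool_call` is illustrative, not part of transformers or this repository):

```python
import json
import re

def parse_tool_call(generated: str):
    """Extract and parse the JSON payload from a <tool_call>...</tool_call> block.

    Returns the parsed dict, or None if no tool call is found.
    """
    match = re.search(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", generated, re.DOTALL)
    if match is None:
        return None
    return json.loads(match.group(1))

# Example on the output shown above
raw = '<tool_call>\n{"name": "toggle_lights", "arguments": {"room": "living_room", "state": "off"}}\n</tool_call>'
call = parse_tool_call(raw)
```

Returning `None` on a missing match lets the caller fall back to an `intent_unclear`-style response instead of raising on malformed output.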
### Using Ollama
```bash
huggingface-cli download distil-labs/distil-home-assistant-functiongemma --local-dir distil-model
cd distil-model
ollama create distil-home-assistant-functiongemma -f Modelfile
ollama run distil-home-assistant-functiongemma
```
### Using with the Demo App

This model powers the Smart Home Controller demo, a text-based orchestrator that pairs an SLM with deterministic dialogue management for smart home control.
## Model Details
| Property | Value |
|---|---|
| Base Model | FunctionGemma (Gemma3) |
| Architecture | Gemma3ForCausalLM |
| Context Length | 32,768 tokens |
| Precision | bfloat16 |
| Training Data | 50 seed conversations, synthetically expanded |
| Teacher Model | GPT-oss-120B |
| Task | Multi-turn tool calling (closed book) |
## Training
This model was trained using the Distil Labs platform:
- Seed Data: 50 hand-written multi-turn conversations covering 6 smart home functions with 2-5 user turns per conversation
- Synthetic Expansion: Expanded to thousands of examples using a 120B teacher model
- Fine-tuning: Multi-turn tool calling distillation on FunctionGemma (Gemma3)
## What the Model Does
The model acts as a function caller for a smart home controller. Given a user command and conversation history, it outputs a structured tool call:
```
User:  "Turn off the living room lights"
Model: {"name": "toggle_lights", "arguments": {"room": "living_room", "state": "off"}}

User:  "The kitchen too"
Model: {"name": "toggle_lights", "arguments": {"room": "kitchen", "state": "off"}}

User:  "Lock the front door"
Model: {"name": "lock_door", "arguments": {"door": "front", "state": "lock"}}

User:  "Can you order me a pizza?"
Model: {"name": "intent_unclear", "arguments": {"reason": "off_topic"}}
```
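Resolving follow-ups like "The kitchen too" depends on passing the full conversation history back to the model on every turn. A sketch of that bookkeeping (the history format matches the Quick Start; `add_turn` is a hypothetical helper, and in a real loop the assistant content would come from model generation rather than hard-coded strings):

```python
# Start from the system prompt used in Quick Start (abbreviated here)
messages = [{"role": "system", "content": "You are a tool-calling model ..."}]

def add_turn(messages, user_text, tool_call_json):
    """Append a user turn and the model's tool-call reply to the history."""
    messages.append({"role": "user", "content": user_text})
    messages.append({"role": "assistant", "content": tool_call_json})
    return messages

add_turn(messages, "Turn off the living room lights",
         '{"name": "toggle_lights", "arguments": {"room": "living_room", "state": "off"}}')
# With the first exchange in context, the model can resolve "The kitchen too"
add_turn(messages, "The kitchen too",
         '{"name": "toggle_lights", "arguments": {"room": "kitchen", "state": "off"}}')
```

On each new turn the whole `messages` list is re-templated with `apply_chat_template` before generation, which is what gives the model the context to resolve pronouns and sequential commands.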
## Supported Functions
The model handles 6 smart home operations:
| Function | Description |
|---|---|
| `toggle_lights` | Turn lights on or off in a room |
| `set_thermostat` | Set temperature and heating/cooling mode |
| `lock_door` | Lock or unlock a door |
| `get_device_status` | Query device state |
| `set_scene` | Activate a predefined scene |
| `intent_unclear` | Fallback when the intent cannot be determined |
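Because the intent taxonomy is bounded, a parsed tool call can be routed to a handler with a plain dictionary lookup. A sketch of such a dispatcher (the handler functions and their return strings are illustrative, not part of any library):

```python
# Illustrative handlers; a real controller would talk to device APIs instead
def toggle_lights(room=None, state=None):
    return f"lights {state or 'toggled'} in {room or 'unspecified room'}"

def intent_unclear(reason=None):
    return f"could not handle request ({reason})"

# One entry per supported function; remaining handlers omitted for brevity
HANDLERS = {"toggle_lights": toggle_lights, "intent_unclear": intent_unclear}

def dispatch(call: dict) -> str:
    """Route a parsed tool call like {"name": ..., "arguments": {...}} to its handler."""
    handler = HANDLERS.get(call["name"])
    if handler is None:
        return intent_unclear(reason="unsupported_device")
    return handler(**call.get("arguments", {}))

result = dispatch({"name": "toggle_lights", "arguments": {"room": "kitchen", "state": "off"}})
```

Keyword defaults of `None` mirror the schema's empty `required` lists: when the model omits an unspecified slot, the handler still runs and can prompt for the missing value.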
## Use Cases
- On-device smart home controllers with privacy-first design
- Text-based smart home chatbots with structured intent routing
- Edge deployment for local smart home hubs
- Any multi-turn tool calling task with bounded intent taxonomy
## Limitations
- Trained on English smart home intents only
- Covers 6 specific smart home functions — not a general-purpose tool caller
- Temperature range is fixed to 60-80°F
## License
This model is released under the Apache 2.0 license.
## Citation
```bibtex
@misc{distil-home-assistant-functiongemma,
  author = {Distil Labs},
  title = {Distil-Home-Assistant-FunctionGemma: A Fine-tuned SLM for Smart Home Control},
  year = {2025},
  publisher = {Hugging Face},
  url = {https://huggingface.co/distil-labs/distil-home-assistant-functiongemma}
}
```