Mascarade PlatformIO

Fine-tuned TinyLlama-1.1B-Chat model specialized in PlatformIO embedded development workflows.

Part of the Mascarade ecosystem — an agentic LLM orchestration system with domain-specific fine-tuned models for embedded systems and electronics.

Training details

| Parameter | Value |
|---|---|
| Base model | TinyLlama/TinyLlama-1.1B-Chat-v1.0 |
| Method | LoRA (PEFT), merged into full weights |
| LoRA rank (r) | 16 |
| LoRA alpha | 32 |
| LoRA dropout | 0.05 |
| Target modules | q_proj, k_proj, v_proj, o_proj |
| Epochs | 2 |
| Training steps | 20 |
| Dataset | ShareGPT format, domain-specific PlatformIO examples |
| GPU | Quadro P2000 (5 GB VRAM) |
| Framework | Hugging Face Transformers + PEFT |
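The dataset row above refers to the ShareGPT conversation schema. A minimal sketch of what one such record looks like (the content shown is a hypothetical illustration, not an actual training example):

```python
import json

# Hypothetical ShareGPT-format record (illustrative content only).
record = {
    "conversations": [
        {"from": "human", "value": "How do I set the upload port in platformio.ini?"},
        {"from": "gpt", "value": "Add upload_port = /dev/ttyUSB0 to the [env] section."},
    ]
}

# ShareGPT records alternate human/gpt turns; a quick sanity check:
roles = [turn["from"] for turn in record["conversations"]]
assert roles == ["human", "gpt"]

print(json.dumps(record, indent=2))
```

Each conversation is a list of turns tagged with a `from` role, which training frameworks map onto the base model's chat template.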

Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("electron-rare/mascarade-platformio")
tokenizer = AutoTokenizer.from_pretrained("electron-rare/mascarade-platformio")

messages = [{"role": "user", "content": "How do I configure platformio.ini for an STM32 board with custom upload protocol?"}]
# add_generation_prompt=True appends the assistant turn marker so the model answers
# instead of continuing the user message
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
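For context on the example prompt above, a PlatformIO answer in this domain typically centres on a `platformio.ini` environment like the following sketch (board, tool, and port values are illustrative assumptions, not model output):

```ini
; Hypothetical STM32 environment using PlatformIO's custom upload protocol
[env:nucleo_f401re]
platform = ststm32
board = nucleo_f401re
framework = arduino
; upload_protocol = custom delegates flashing to upload_command;
; $SOURCE expands to the built firmware image
upload_protocol = custom
upload_command = openocd -f interface/stlink.cfg -f target/stm32f4x.cfg -c "program $SOURCE verify reset; shutdown"
```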

Related models

| Model | Domain | Base |
|---|---|---|
| mascarade-iot | IoT general | Qwen2.5-Coder-1.5B |
| mascarade-esp32 | ESP32 microcontrollers | TinyLlama-1.1B |
| mascarade-spice | SPICE circuit simulation | TinyLlama-1.1B |

Datasets

All training datasets are published under the clemsail namespace on Hugging Face.

🇪🇺 EU AI Act transparency

This adapter is a LoRA fine-tune released under the EU AI Act framework (Regulation (EU) 2024/1689). Compliance metadata:

| Field | Value |
|---|---|
| Provider | L'Électron Rare (clemsail / electron-rare) |
| Role under AI Act | GPAI provider for this adapter |
| Base model | TinyLlama/TinyLlama-1.1B-Chat-v1.0 — see upstream provenance |
| Adapter type | LoRA / PEFT — adapter weights only; base unchanged |
| Training data origin | L'Électron Rare proprietary technical corpus + curated public docs |
| License | Apache-2.0 (adapter). Upstream base licence applies separately. |
| Intended use | Assistance with PlatformIO build-system workflows |
| Out of scope | Healthcare diagnosis, legal advice, autonomous safety-critical decisions, generation of malicious code |
| Risk classification | Limited risk — Article 50 transparency obligations apply |
| Copyright respect | Training data does not include scraped copyrighted material. Opt-out signals (robots.txt, ai.txt) are honoured for web-sourced data. |
| Full provenance | https://github.com/L-electron-Rare/eu-kiki/tree/main/docs/provenance |
| Contact | postmaster@saillant.cc — biased output reports, copyright concerns, etc. |

⚠️ You are using an AI model. Outputs may be inaccurate, biased or fabricated. Do not act on them without independent verification, especially in regulated domains.
