---
datasets:
- lamm-mit/Bioinspired3D
language:
- en
base_model:
- meta-llama/Llama-3.2-3B-Instruct
---
# Bioinspired3D

A fine-tuned version of meta-llama/Llama-3.2-3B-Instruct that uses LoRA adapters to generate Blender Python code for bioinspired 3D models.
## Abstract
Generative AI has made rapid progress in text, image, and video synthesis, yet text-to-3D modeling for scientific design remains particularly challenging due to limited controllability and high computational cost. Most existing 3D generative methods rely on meshes, voxels, or point clouds, which can be costly to train and difficult to control. We introduce Bioinspired123D, a lightweight and modular code-as-geometry pipeline that generates fabricable 3D structures directly through parametric programs rather than dense visual representations. At the core of Bioinspired123D is Bioinspired3D, a compact language model finetuned to translate natural language design cues into Blender Python scripts encoding smooth, biologically inspired geometries. We curate a domain-specific dataset of over 4,000 bioinspired and geometric design scripts spanning helical, cellular, and tubular motifs with parametric variability. The dataset is expanded and validated through an automated LLM-driven, Blender-based quality control pipeline. Bioinspired3D is then embedded in a graph-based agentic framework that integrates multimodal retrieval-augmented generation and a vision–language model critic to iteratively evaluate, critique, and repair generated scripts. We evaluate performance on a new benchmark for 3D geometry script generation and show that Bioinspired123D demonstrates a near fourfold improvement over its unfinetuned base model, while also outperforming substantially larger state-of-the-art language models despite using far fewer parameters and compute. By prioritizing code-as-geometry representations, Bioinspired123D enables compute-efficient, controllable, and interpretable text-to-3D generation, lowering barriers to AI-driven scientific discovery in materials and structural design.
## What’s in this repo (Hugging Face)
This Hugging Face release contains Bioinspired3D only: a LoRA adapter that you load on top of the base model to generate Blender Python scripts from natural-language prompts.
For the full Bioinspired123D agentic framework (retrieval + VLM critic + iterative repair), see the GitHub repo: https://github.com/lamm-mit/Bioinspired123D
## Usage
### Install

```bash
pip install -U transformers accelerate peft torch
```
### Load the base model + LoRA adapter

```python
import torch
from peft import PeftModel
from transformers import AutoTokenizer, AutoModelForCausalLM

BASE_MODEL = "meta-llama/Llama-3.2-3B-Instruct"
LORA_ADAPTER = "rachelkluu/bioinspired3D"

# Set this to your preferred device, e.g. "cuda:0" or "cpu"
DEVICE_3D = "cuda:0"

bio3d_tok = AutoTokenizer.from_pretrained(BASE_MODEL)
base_model = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL,
    torch_dtype=torch.float16,
    device_map={"": DEVICE_3D},
)
bio3d_model = PeftModel.from_pretrained(base_model, LORA_ADAPTER)
bio3d_model.eval()


def format_input(prompt: str) -> str:
    """Wrap a user prompt in the Llama 3 chat template."""
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        "You are a helpful assistant<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{prompt}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )
```
### Load utility functions

```python
import re


def extract_blender_code(model_out: str) -> str:
    """Pull the last fenced Python block out of the model output,
    falling back to everything from the final `import bpy` onward."""
    matches = list(re.finditer(r"```python\s*(.*?)```", model_out, flags=re.DOTALL))
    if matches:
        return matches[-1].group(1).strip()
    pos = model_out.rfind("import bpy")
    return model_out[pos:].strip() if pos != -1 else model_out.strip()


def clean_blender_code(text: str) -> str:
    """Strip stray fences and control characters, and ensure the
    script starts with `import bpy`."""
    if not text:
        return "import bpy"
    code = text.strip()
    code = code.replace("```python", "").replace("```", "")
    code = re.sub(r"[\x00-\x08\x0b-\x1f]", "", code)
    if not code.lstrip().startswith("import bpy"):
        code = "import bpy\n" + code
    return code
```
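As a quick sanity check, the two helpers can be exercised on a mock model output; this self-contained sketch repeats compact versions of them so it runs on its own (`mock_out` is an illustrative string, not real model output):

```python
import re


def extract_blender_code(model_out: str) -> str:
    # Prefer the last fenced Python block; otherwise fall back to
    # everything from the final "import bpy" onward.
    matches = list(re.finditer(r"```python\s*(.*?)```", model_out, flags=re.DOTALL))
    if matches:
        return matches[-1].group(1).strip()
    pos = model_out.rfind("import bpy")
    return model_out[pos:].strip() if pos != -1 else model_out.strip()


def clean_blender_code(text: str) -> str:
    # Drop leftover fences and control characters; prepend the bpy import
    # if the model omitted it.
    if not text:
        return "import bpy"
    code = text.strip().replace("```python", "").replace("```", "")
    code = re.sub(r"[\x00-\x08\x0b-\x1f]", "", code)
    if not code.lstrip().startswith("import bpy"):
        code = "import bpy\n" + code
    return code


mock_out = (
    "Here is the script:\n"
    "```python\n"
    "bpy.ops.mesh.primitive_cube_add(size=2)\n"
    "```"
)
code = clean_blender_code(extract_blender_code(mock_out))
print(code)  # → "import bpy\nbpy.ops.mesh.primitive_cube_add(size=2)"
```

Even when the model forgets the `import bpy` line, the cleaned script still starts with it, so it can be handed to Blender unchanged.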
### Generate Blender code from a natural-language prompt

```python
prompt = """Write Blender code to make a cellular structure."""

formatted = format_input(prompt)
inputs = bio3d_tok(formatted, return_tensors="pt").to(bio3d_model.device)

with torch.no_grad():
    outputs = bio3d_model.generate(
        **inputs,
        max_new_tokens=2048,
        do_sample=True,
        temperature=0.1,
        top_p=0.9,
    )

raw = bio3d_tok.decode(outputs[0], skip_special_tokens=True)
raw_code = extract_blender_code(raw)
blender_code = clean_blender_code(raw_code)
print(blender_code)
```
## Prompting tips

- **Input:** natural-language design intent (for example: “tubular structure with noisy placement”, “helical material with cylindrical fibers”, “smoothed cellular structure”).
- **Output:** a Blender Python script (intended to be executed in Blender) that constructs the requested geometry.
- To encourage explicit reasoning, append a variant of “Think step by step.” to your prompt, e.g. "Write Blender code to make a tubular structure with z-aligned tubules. Think step by step."
## Notes

- This adapter is meant to be used with the specified base model (meta-llama/Llama-3.2-3B-Instruct).
- Treat generated scripts like any untrusted code: run them in a sandboxed environment and validate the resulting geometry before use.
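The generated script imports `bpy`, so it must run inside Blender rather than a plain Python interpreter. A minimal sketch of headless execution, assuming Blender is installed and on your `PATH` (the file names here are illustrative):

```python
import subprocess
from pathlib import Path

# Save the generated script to disk (illustrative placeholder content;
# in practice write the `blender_code` string produced above).
script_path = Path("generated_structure.py")
script_path.write_text("import bpy\n# ... generated geometry code ...\n")

# Blender's CLI: --background runs without a GUI, --python executes the script.
cmd = ["blender", "--background", "--python", str(script_path)]
print(" ".join(cmd))

# Uncomment once Blender is available; check=True raises if the script errors.
# subprocess.run(cmd, check=True)
```

Running headless like this makes it easy to batch-validate many generated scripts, which is how sandboxed execution fits into an automated quality-control loop.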
## Citation

If you use Bioinspired3D or the broader Bioinspired123D framework in your work, please cite:

```bibtex
@article{luu2026bioinspired123d,
  title={Bioinspired123D: Generative 3D Modeling System for Bioinspired Structures},
  author={Luu, Rachel K. and Buehler, Markus J.},
  year={2026}
}
```