---
language:
- en
license: cc-by-nc-4.0
base_model:
- google/flan-t5-large
---

# EPlus-LLM


<!-- Centered logo -->
<div align="center">
  <img src="https://huggingface.co/EPlus-LLM/EPlus-LLMv1/resolve/main/v1_platform_logo.png?raw=true" width="80%" alt="EPlus-LLM" />
</div>


<hr>


<!-- Badge styling and responsive layout -->
<style>
  .badge-container {
    display: flex;
    flex-wrap: wrap;
    justify-content: center;
    align-items: center;
    gap: 6px;
    margin-top: 10px;
    margin-bottom: 10px;
  }
  .badge-container a img {
    height: 28px;
    transition: transform 0.2s ease;
  }
  .badge-container a:hover img {
    transform: scale(1.05);
  }
  @media (max-width: 500px) {
    .badge-container a img {
      height: 24px;
    }
  }
</style>

<!-- Badge container -->
<div class="badge-container">
  <a href="https://huggingface.co/EPlus-LLM" target="_blank">
    <img alt="Hugging Face" src="https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-EPlus--LLM-ffc107?color=ffc107&logoColor=white"/>
  </a>
  <a href="https://colab.research.google.com/github/Gangjiang1/EPlus-LLM/blob/main/v1/EPlus-LLM_inference.ipynb" target="_blank">
    <img alt="Open In Colab" src="https://colab.research.google.com/assets/colab-badge.svg"/>
  </a>
  <a href="https://www.linkedin.com/in/gang-jiang-46b990273" target="_blank" style="margin: 2px;">
    <img alt="LinkedIn" src="https://img.shields.io/badge/LinkedIn-Connect-0A66C2?style=flat&logo=linkedin&logoColor=white" style="display: inline-block; vertical-align: middle;"/>
  </a>
  <a href="https://www.sciencedirect.com/science/article/pii/S0306261924008146" target="_blank">
    <img alt="Paper" src="https://img.shields.io/badge/Paper-EPlus--LLM-red?logo=arxiv&logoColor=white"/>
  </a>
  <a href="https://huggingface.co/EPlus-LLM/EPlus-LLMv2/resolve/main/figs/qr.png?raw=true" target="_blank">
    <img alt="WeChat" src="https://img.shields.io/badge/WeChat-Gang%20Jiang-brightgreen?logo=wechat&logoColor=white"/>
  </a>
  <a href="LICENSE" target="_blank">
    <img alt="License" src="https://img.shields.io/badge/License-CC%20BY--NC%204.0-blue.svg" style="display: inline-block; vertical-align: middle;"/>
  </a>
</div>

**Natural Language Interface for Automated Building Energy Modeling via LLMs**
*A prototype project exploring fine-tuned large language models for automating building energy modeling from natural-language input.*


<div align="center">
  <img src="https://huggingface.co/EPlus-LLM/EPlus-LLMv1/resolve/main/EPlus-LLM_graphic.png" alt="Illustration of EPlus-LLM for automated building energy modeling" width="700"/>
</div>


## 🎉 News
- ⚡️ [2025/01/01]: A prompting-based method for automated building energy modeling has been released.
  [Paper here](https://doi.org/10.1016/j.energy.2025.134548).
- 🔥 [2024/05/16]: We first implemented natural language-based automated building energy modeling by fine-tuning a large language model (LLM).
  [Paper here](https://doi.org/10.1016/j.apenergy.2024.123431).

## 🚀 Key Features
- Scalability: Auto-generates EnergyPlus models, including varying geometry sizes and internal loads.
- Accuracy & Efficiency: Achieves 100% modeling accuracy while reducing manual modeling time by over 95%.
- Interaction & Automation: A user-friendly human-AI interface for seamless model creation and customization.


## 🏗️ Target Users
The platform is designed for engineers, architects, and researchers working on building performance, sustainability, and resilience. It is especially useful during early-stage conceptual design, when modeling decisions have the greatest impact.


## 🚀 Quick Start

The code snippet below shows how to load EPlus-LLM and auto-generate a building energy model.

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Gangjiang1/EPlus-LLM/blob/main/v1/EPlus-LLM_inference.ipynb)

```python
# ⚠️ A GPU is recommended for inference.
# ⚠️ EnergyPlus version 9.6 is required to run the generated model successfully.
# ⚠️ Download the v1_nextpart.idf file from the EPlus-LLM repo and place it in your current working directory.
import torch
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
)

# Load the EPlus-LLM model
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-large")
model = AutoModelForSeq2SeqLM.from_pretrained("EPlus-LLM/EPlus-LLMv1"
    # , force_download=True  # Uncomment if you cannot download the model
)

# Generation config
generation_config = model.generation_config
generation_config.max_new_tokens = 2000
generation_config.temperature = 0.1
generation_config.top_p = 0.1
generation_config.num_return_sequences = 1
generation_config.pad_token_id = tokenizer.eos_token_id
generation_config.eos_token_id = tokenizer.eos_token_id

# Provide your input here: a description of the desired building.
# For more details, please refer to the paper: https://doi.org/10.1016/j.apenergy.2024.123431
input_text = "Simulate a building that is 30.00 meters long, 15.00 meters wide, and 3.50 meters high. The window-to-wall ratio is 0.28. The occupancy rate is 8.00 m2/people, the lighting level is 6.00 W/m2, and the equipment power consumption is 8.80 W/m2."
inputs = tokenizer(input_text, return_tensors="pt", truncation=False)
generated_ids = model.generate(input_ids=inputs.input_ids,
                               attention_mask=inputs.attention_mask,
                               generation_config=generation_config)
generated_output = tokenizer.decode(generated_ids[0], skip_special_tokens=True)
# Post-process the output: the model uses "_" for spaces and "|" for line breaks.
generated_output = generated_output.replace("_", " ").replace("|", "\n")

# Load the remaining part of the IDF file and append the generated output.
file_path = "v1_nextpart.idf"  # File is in the repo; please download it first.
output_path = "v1_final.idf"
with open(file_path, "r", encoding="utf-8") as file:
    nextpart = file.read()
final_text = nextpart + "\n\n" + generated_output
with open(output_path, "w", encoding="utf-8") as f:
    f.write(final_text)

# The building energy model is written to an IDF file
print(f"Building Energy Model Auto-Generated: {output_path}")
```
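
The model expects descriptions in the fixed template shown in the snippet above. As a convenience, the seven building parameters can be formatted into that template programmatically. The helper below is our own illustrative sketch, not part of the EPlus-LLM codebase:

```python
def build_prompt(length_m, width_m, height_m, wwr,
                 occupancy_m2_per_person, lighting_w_per_m2, equipment_w_per_m2):
    """Format building parameters into the description template used by EPlus-LLM.

    Illustrative helper; the template mirrors the example input in the Quick Start.
    """
    return (
        f"Simulate a building that is {length_m:.2f} meters long, "
        f"{width_m:.2f} meters wide, and {height_m:.2f} meters high. "
        f"The window-to-wall ratio is {wwr:.2f}. "
        f"The occupancy rate is {occupancy_m2_per_person:.2f} m2/people, "
        f"the lighting level is {lighting_w_per_m2:.2f} W/m2, "
        f"and the equipment power consumption is {equipment_w_per_m2:.2f} W/m2."
    )

# Reproduces the example description from the Quick Start snippet
prompt = build_prompt(30.0, 15.0, 3.5, 0.28, 8.0, 6.0, 8.8)
print(prompt)
```

The returned string can be passed to the tokenizer in place of the hand-written description; the resulting `v1_final.idf` can then be simulated with EnergyPlus 9.6 as usual.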


## 📝 Citation

If you find our work helpful, please consider citing our papers.

```
@article{jiang2024EPlus-LLM,
  author  = {Gang Jiang and Zhihao Ma and Liang Zhang and Jianli Chen},
  title   = {EPlus-LLM: A large language model-based computing platform for automated building energy modeling},
  journal = {Applied Energy},
  volume  = {367},
  pages   = {123431},
  year    = {2024},
  month   = {Aug},
  doi     = {10.1016/j.apenergy.2024.123431}
}

@article{jiang2025prompting,
  author  = {Gang Jiang and Zhihao Ma and Liang Zhang and Jianli Chen},
  title   = {Prompt engineering to inform large language models in automated building energy modeling},
  journal = {Energy},
  volume  = {316},
  pages   = {134548},
  year    = {2025},
  month   = {Feb},
  doi     = {10.1016/j.energy.2025.134548}
}

@article{jiang2025EPlus-LLMv2,
  author  = {Gang Jiang and Jianli Chen},
  title   = {Efficient fine-tuning of large language models for automated building energy modeling in complex cases},
  journal = {Automation in Construction},
  volume  = {175},
  pages   = {106223},
  year    = {2025},
  month   = {Jul},
  doi     = {10.1016/j.autcon.2025.106223}
}
```