---
language:
- en
license: cc-by-nc-4.0
base_model:
- google/flan-t5-large
---

# EPlus-LLM

<!-- Center the logo -->
<div align="center">
  <img src="https://huggingface.co/EPlus-LLM/EPlus-LLMv1/resolve/main/v1_platform_logo.png?raw=true" width="80%" alt="EPlus-LLM v2" />
</div>

<hr>

<!-- Badge styling + responsive layout -->
<style>
  .badge-container {
    display: flex;
    flex-wrap: wrap;
    justify-content: center;
    align-items: center;
    gap: 6px;
    margin-top: 10px;
    margin-bottom: 10px;
  }
  .badge-container a img {
    height: 28px;
    transition: transform 0.2s ease;
  }
  .badge-container a:hover img {
    transform: scale(1.05);
  }
  @media (max-width: 500px) {
    .badge-container a img {
      height: 24px;
    }
  }
</style>

<!-- Badge container -->
<div class="badge-container">
  <a href="https://huggingface.co/EPlus-LLM" target="_blank">
  <img alt="Hugging Face" src="https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-EPlus--LLM-ffc107?color=ffc107&logoColor=white"/>
  </a>
  <a href="https://colab.research.google.com/github/Gangjiang1/EPlus-LLM/blob/main/v1/EPlus-LLM_inference.ipynb" target="_blank">
    <img alt="Open In Colab" src="https://colab.research.google.com/assets/colab-badge.svg"/>
  </a>
  <a href="https://www.linkedin.com/in/gang-jiang-46b990273" target="_blank" style="margin: 2px;">
    <img alt="LinkedIn" src="https://img.shields.io/badge/🤖LinkedIn-Connect-0A66C2?style=flat&logo=linkedin&logoColor=white" style="display: inline-block; vertical-align: middle;"/>
  </a>
  <a href="https://www.sciencedirect.com/science/article/pii/S0306261924008146" target="_blank">
    <img alt="Paper" src="https://img.shields.io/badge/Paper-EPlus--LLM-red?logo=arxiv&logoColor=white"/>
  </a>
  <a href="https://huggingface.co/EPlus-LLM/EPlus-LLMv2/resolve/main/figs/qr.png?raw=true" target="_blank">
    <img alt="WeChat" src="https://img.shields.io/badge/WeChat-Gang%20Jiang-brightgreen?logo=wechat&logoColor=white"/>
  </a>
  <a href="LICENSE" target="_blank">
  <img alt="License" src="https://img.shields.io/badge/License-CC%20BY--NC%204.0-blue.svg" style="display: inline-block; vertical-align: middle;"/>
  </a>
</div>

**Natural Language Interface for Automated Building Energy Modeling via LLMs**  
*A prototype project exploring the use of fine-tuned large language models to automate building energy modeling from natural language input.*

<div align="center">
  <img src="https://huggingface.co/EPlus-LLM/EPlus-LLMv1/resolve/main/EPlus-LLM_graphic.png" alt="Illustration of EPlus-LLMv2 for Auto-building energy modeling" width="700"/>
</div>


## 🎉 News
- ⚡️ [2025/01/01]: A prompting-based method for auto-building energy modeling has been released.
[Paper here](https://doi.org/10.1016/j.energy.2025.134548).
- 🔥 [2024/05/16]: We first successfully implemented natural language-based auto-building energy modeling by fine-tuning a large language model (LLM).
[Paper here](https://doi.org/10.1016/j.apenergy.2024.123431).

## 🚀 Key Features
- Scalability: Auto-generates EnergyPlus models, including varying geometry sizes and internal loads.
- Accuracy & Efficiency: Achieves 100% modeling accuracy while reducing manual modeling time by over 95%.
- Interaction & Automation: A user-friendly human-AI interface for seamless model creation and customization.

## 🏗️ Target Users
This platform is designed for engineers, architects, and researchers working in building performance, sustainability, and resilience. It is especially useful during early-stage conceptual design, when modeling decisions have the greatest impact.

## 🚀 Quick Start

The following code snippet shows how to load EPlus-LLM and auto-generate a building energy model.  
  
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Gangjiang1/EPlus-LLM/blob/main/v1/EPlus-LLM_inference.ipynb)

```python
# ⚠️ A GPU is recommended for faster inference.
# ⚠️ Please make sure your EnergyPlus version is 9.6 so the generated model simulates successfully.
# ⚠️ Download the v1_nextpart.idf file from the EPlus-LLM repo and place it in your current working directory.
import torch
from transformers import (
    AutoModelForSeq2SeqLM, 
    AutoTokenizer,
)

# Load the EPlus-LLM model
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-large")
model = AutoModelForSeq2SeqLM.from_pretrained("EPlus-LLM/EPlus-LLMv1"
                                              # , force_download=True # If you cannot download the model
                                              )

# Generation config
generation_config = model.generation_config
generation_config.max_new_tokens = 2000
generation_config.temperature = 0.1
generation_config.top_p = 0.1
generation_config.num_return_sequences = 1
generation_config.pad_token_id = tokenizer.eos_token_id
generation_config.eos_token_id = tokenizer.eos_token_id

# Provide your input here: a description of the desired building.
# For more details, please refer to the paper: https://doi.org/10.1016/j.apenergy.2024.123431
prompt = "Simulate a building that is 30.00 meters long, 15.00 meters wide, and 3.50 meters high. The window-to-wall ratio is 0.28. The occupancy rate is 8.00 m2/people, the lighting level is 6.00 W/m2, and the equipment power consumption is 8.80 W/m2."
inputs = tokenizer(prompt, return_tensors="pt", truncation=False)
generated_ids = model.generate(input_ids=inputs.input_ids,
                               attention_mask=inputs.attention_mask,
                               generation_config=generation_config)
generated_output = tokenizer.decode(generated_ids[0], skip_special_tokens=True)
generated_output = generated_output.replace("_", " ")
generated_output = generated_output.replace("|", "\n")

# Append the generated output to the remaining part of the IDF file.
file_path = "v1_nextpart.idf" # Provided in the EPlus-LLM repo; download it first.
output_path = "v1_final.idf"
with open(file_path, 'r', encoding='utf-8') as file:
    nextpart = file.read()
final_text = nextpart + "\n\n" + generated_output
with open(output_path, 'w', encoding='utf-8') as f:
    f.write(final_text)
    
# The building energy model is saved as an IDF file
print(f"Building Energy Model Auto-Generated: {output_path}")
```
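The model expects prompts in the fixed template it was fine-tuned on (see the paper for details). A small helper along these lines (hypothetical, not part of the repo) can produce well-formed descriptions for batch runs:

```python
def build_prompt(length_m: float, width_m: float, height_m: float,
                 wwr: float, occupancy: float, lighting: float,
                 equipment: float) -> str:
    """Format a building description in the template EPlus-LLM was trained on.

    occupancy is in m2/person; lighting and equipment are in W/m2.
    """
    return (
        f"Simulate a building that is {length_m:.2f} meters long, "
        f"{width_m:.2f} meters wide, and {height_m:.2f} meters high. "
        f"The window-to-wall ratio is {wwr:.2f}. "
        f"The occupancy rate is {occupancy:.2f} m2/people, "
        f"the lighting level is {lighting:.2f} W/m2, "
        f"and the equipment power consumption is {equipment:.2f} W/m2."
    )

# Reproduces the example prompt from the snippet above
print(build_prompt(30, 15, 3.5, 0.28, 8, 6, 8.8))
```

Each numeric field is rendered with two decimals to match the example above; whether the model is robust to other number formats has not been verified here.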

## 📝 Citation

If you find our work helpful, please consider citing our papers.

```bibtex
@article{jiang2024EPlus-LLM,
  author    = {Gang Jiang and Zhihao Ma and Liang Zhang and Jianli Chen},
  title     = {EPlus-LLM: A large language model-based computing platform for automated building energy modeling},
  journal   = {Applied Energy},
  volume    = {367},
  pages     = {123431},
  year      = {2024},
  month     = {Aug},
  doi       = {10.1016/j.apenergy.2024.123431}
}

@article{jiang2025prompting,
  author    = {Gang Jiang and Zhihao Ma and Liang Zhang and Jianli Chen},
  title     = {Prompt engineering to inform large language models in automated building energy modeling},
  journal   = {Energy},
  volume    = {316},
  pages     = {134548},
  year      = {2025},
  month     = {Feb},
  doi       = {10.1016/j.energy.2025.134548}
}

@article{jiang2025EPlus-LLMv2,
  author    = {Gang Jiang and Jianli Chen},
  title     = {Efficient fine-tuning of large language models for automated building energy modeling in complex cases},
  journal   = {Automation in Construction},
  volume    = {175},
  pages     = {106223},
  year      = {2025},
  month     = {Jul},
  doi       = {10.1016/j.autcon.2025.106223}
}
```