---
language:
- en
license: cc-by-nc-4.0
base_model:
- google/flan-t5-large
---

# EPlus-LLM

**Natural Language Interface for Automated Building Energy Modeling via LLMs**  
*A prototype project exploring the use of fine-tuned large language models to automate building energy modeling from natural language input.*

<div align="center">
  <img src="https://huggingface.co/EPlus-LLM/EPlus-LLMv1/resolve/main/EPlus-LLM_graphic.png" alt="Illustration of EPlus-LLMv2 for Auto-building energy modeling" width="700"/>
</div>


## 🎉 News
- โšก๏ธ [2025/01/01]: A prompting-based method for auto-building energy modeling has been released.
[Paper here](https://doi.org/10.1016/j.energy.2025.134548).
- ๐Ÿ”ฅ [2024/05/016]: We first successfully implement natural language-based auto-building modeling by fine-tuning a large language model (LLM).
[Paper here](https://doi.org/10.1016/j.apenergy.2024.123431).

## 🚀 Key Features
- Scalability: Auto-generates EnergyPlus models, including varying geometry sizes and internal loads.
- Accuracy & Efficiency: Achieves 100% modeling accuracy while reducing manual modeling time by over 95%.
- Interaction & Automation: A user-friendly human-AI interface for seamless model creation and customization.

## ๐Ÿ—๏ธ Target Users
This current platform is designed for engineers, architects, and researchers working in building performance, sustainability, and resilience. It is especially useful during early-stage conceptual design when modeling decisions have the greatest impact.

## 🚀 Quick Start

The following code snippet shows how to load EPlus-LLM and auto-generate a building energy model.  
  
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Gangjiang1/EPlus-LLM/blob/main/v1/EPlus-LLM_inference.ipynb)
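The natural-language input follows a fixed sentence pattern (see the example inside the snippet below). A hypothetical helper for composing such prompts from numeric parameters; the function name and signature are illustrative and not part of the released code:

```python
def build_prompt(length_m: float, width_m: float, height_m: float,
                 wwr: float, occ_m2_per_person: float,
                 light_w_m2: float, equip_w_m2: float) -> str:
    # Hypothetical convenience wrapper; the phrasing mirrors the example
    # input used in the Quick Start snippet.
    return (
        f"Simulate a building that is {length_m:.2f} meters long, "
        f"{width_m:.2f} meters wide, and {height_m:.2f} meters high. "
        f"The window-to-wall ratio is {wwr:.2f}. "
        f"The occupancy rate is {occ_m2_per_person:.2f} m2/people, "
        f"the lighting level is {light_w_m2:.2f} W/m2, "
        f"and the equipment power consumption is {equip_w_m2:.2f} W/m2."
    )

print(build_prompt(30, 15, 3.5, 0.28, 8, 6, 8.8))
```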

```python
# โš ๏ธ Please make sure you have adequate GPU memory.
# โš ๏ธ Please make sure your EnergyPlus version is 9.6 for successful running.

import torch
from transformers import (
    AutoModelForSeq2SeqLM, 
    AutoTokenizer,
)

# Path to the remaining (fixed) part of the IDF file, and the final output path
file_path = "v1_nextpart.idf"
output_path = "v1_final.idf"

# Load the EPlus-LLM model
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-large")
model = AutoModelForSeq2SeqLM.from_pretrained("EPlus-LLM/EPlus-LLMv1")

# Generation config
generation_config = model.generation_config
generation_config.max_new_tokens = 2000
generation_config.temperature = 0.1
generation_config.top_p = 0.1
generation_config.num_return_sequences = 1
generation_config.pad_token_id = tokenizer.eos_token_id
generation_config.eos_token_id = tokenizer.eos_token_id

# Provide your input here: a natural-language description of the desired building.
# For details on the input format, please refer to the paper: https://doi.org/10.1016/j.apenergy.2024.123431
prompt = "Simulate a building that is 30.00 meters long, 15.00 meters wide, and 3.50 meters high. The window-to-wall ratio is 0.28. The occupancy rate is 8.00 m2/people, the lighting level is 6.00 W/m2, and the equipment power consumption is 8.80 W/m2."
inputs = tokenizer(prompt, return_tensors="pt", truncation=False)
generated_ids = model.generate(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    generation_config=generation_config,
)
generated_output = tokenizer.decode(generated_ids[0], skip_special_tokens=True)
# The model emits the IDF on one line: "_" stands for a space, "|" for a newline
generated_output = generated_output.replace("_", " ")
generated_output = generated_output.replace("|", "\n")

with open(file_path, 'r', encoding='utf-8') as file:
    nextpart = file.read()
final_text = nextpart + "\n\n" + generated_output
with open(output_path, 'w', encoding='utf-8') as f:
    f.write(final_text)
    
# The auto-generated building energy model has been written to an IDF file
print(f"Building Energy Model Auto-Generated: {output_path}")
```
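The two `replace` calls in the snippet undo the model's output serialization: the fine-tuned model emits the entire IDF on a single line, with underscores standing in for spaces and pipes for line breaks. A minimal sketch of that decoding step on a made-up fragment (the fragment is illustrative, not real model output):

```python
def decode_idf(serialized: str) -> str:
    # Undo the one-line serialization used in the Quick Start snippet:
    # "_" encodes a space, "|" encodes a newline.
    return serialized.replace("_", " ").replace("|", "\n")

# Hypothetical serialized fragment, for illustration only
fragment = "Version,|__9.6;|Timestep,|__4;"
print(decode_idf(fragment))
```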

## ๐Ÿ“ Citation

If you find our work helpful, please consider citing it.

```bibtex
@article{jiang2024EPlus-LLM,
  author    = {Gang Jiang and Zhihao Ma and Liang Zhang and Jianli Chen},
  title     = {EPlus-LLM: A large language model-based computing platform for automated building energy modeling},
  journal   = {Applied Energy},
  volume    = {367},
  pages     = {123431},
  year      = {2024},
  month     = {Aug},
  doi       = {10.1016/j.apenergy.2024.123431}
}

@article{jiang2025prompting,
  author    = {Gang Jiang and Zhihao Ma and Liang Zhang and Jianli Chen},
  title     = {Prompt engineering to inform large language models in automated building energy modeling},
  journal   = {Energy},
  volume    = {316},
  pages     = {134548},
  year      = {2025},
  month     = {Feb},
  doi       = {10.1016/j.energy.2025.134548}
}

@article{jiang2025EPlus-LLMv2,
  author    = {Gang Jiang and Jianli Chen},
  title     = {Efficient fine-tuning of large language models for automated building energy modeling in complex cases},
  journal   = {Automation in Construction},
  volume    = {175},
  pages     = {106223},
  year      = {2025},
  month     = {Jul},
  doi       = {10.1016/j.autcon.2025.106223}
}
```