Update README.md

A prototype project exploring the use of fine-tuned large language models to automate building energy modeling.
[Paper here](https://doi.org/10.1016/j.apenergy.2024.123431).
## 🔑 Key Features
- Scalability: Auto-generates EnergyPlus models, including varying geometry sizes and internal loads.
- Accuracy & Efficiency: Achieves 100% modeling accuracy while reducing manual modeling time by over 95%.
- Interaction & Automation: A user-friendly human-AI interface for seamless model creation and customization.
- Flexible Design Scenarios:
  - ✅ Geometry: square, L-, T-, U-, and hollow-square-shaped buildings
  - ✅ Roof types: flat, gable, hip; customizable attic/ridge height
  - ✅ Orientation & windows: custom WWR, window placement, facade-specific controls
  - ✅ Walls & materials: thermal properties, insulation types
  - ✅ Internal loads: lighting, equipment, occupancy, infiltration/ventilation, schedules, heating/cooling setpoints
  - ✅ Thermal zoning: configurable multi-zone layouts with core & perimeter zones
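To make the scenario coverage concrete, a natural-language request to the platform might look like the following sketch; the wording and all numeric values are illustrative assumptions, not taken from the repository.

```python
# Hypothetical modeling request touching several of the supported
# design dimensions listed above; every value is illustrative.
prompt = (
    "Create an L-shaped building, 30 m by 20 m, with a gable roof "
    "(ridge height 3 m), a window-to-wall ratio of 0.4 on the south facade, "
    "a lighting load of 10 W/m2, and core-and-perimeter thermal zoning."
)
print(prompt)
```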

## 🏗️ Target Users

This platform is designed for engineers, architects, and researchers working in building performance, sustainability, and resilience. It is especially useful during early-stage conceptual design, when modeling decisions have the greatest impact.

## 🚀 Quick Start

Here is a code snippet showing how to load the EPlus-LLM and auto-generate building energy models:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-32B-Instruct"

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Configure generation after the model is loaded (low temperature/top_p
# for near-deterministic output).
generation_config = model.generation_config
generation_config.max_new_tokens = 1300
generation_config.temperature = 0.1
generation_config.top_p = 0.1
generation_config.num_return_sequences = 1
generation_config.pad_token_id = tokenizer.eos_token_id
generation_config.eos_token_id = tokenizer.eos_token_id

prompt = "Give me a short introduction to large language model."
messages = [
    {"role": "system", "content": "You are Qwen, created by Alibaba Cloud. You are a helpful assistant."},
    {"role": "user", "content": prompt}
]
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

# Note: passing max_new_tokens here overrides the value set on
# generation_config above.
generated_ids = model.generate(
    **model_inputs,
    max_new_tokens=512
)
# Strip the prompt tokens so only newly generated tokens are decoded.
generated_ids = [
    output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]

response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
```
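A natural follow-up is to persist the decoded text for simulation. A minimal sketch, assuming `response` holds the decoded output from the snippet above; both the placeholder value and the file name are illustrative assumptions:

```python
# Minimal sketch: write generated model text to disk.
# "response" stands in for the decoded output above; the placeholder
# value and the file name are illustrative, not prescribed by the repo.
response = "Version, 23.1;"
output_path = "generated_model.idf"
with open(output_path, "w", encoding="utf-8") as f:
    f.write(response)
```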

## 📁 Repository Structure

This repository contains v2 and v1 of EPlus-LLM, along with implementation details.

```
cd v2
python EPlus-LLM/v2/Inference.py
```

## 📖 Citation

If you find our work helpful, feel free to cite us.

```
@article{jiang2024prompt,
  author  = {Gang Jiang and Zhihao Ma and Liang Zhang and Jianli Chen},
  title   = {Prompt engineering to inform large language models in automated building energy modeling},
  journal = {Applied Energy},
  volume  = {367},
  pages   = {123431},
  year    = {2024},
  month   = {Aug},
  doi     = {10.1016/j.apenergy.2024.123431}
}

@article{jiang2025prompt,
  author  = {Gang Jiang and Zhihao Ma and Liang Zhang and Jianli Chen},
  title   = {Prompt engineering to inform large language models in automated building energy modeling},
  journal = {Energy},
  volume  = {316},
  pages   = {134548},
  year    = {2025},
  month   = {Feb},
  doi     = {10.1016/j.energy.2025.134548}
}
```