---

license: mit
datasets:
- sxiong/SWAP_v2
language:
- en
base_model:
- meta-llama/Meta-Llama-3-8B-Instruct
---


# **Model Card for SWAP_LLM_v2**

**SWAP_LLM_v2** is a suite of supervised fine-tuned models developed for **multi-step reasoning** with large language models (LLMs).
The framework comprises two primary components: a **generator** and a **discriminator**.


## **Model Details**

### **Generator**

* **Base Model:** `meta-llama/Meta-Llama-3-8B-Instruct`
* **LoRA Configuration:**

  * `lora_alpha`: 32
  * `r`: 16
  * `target_modules`: `["up_proj", "down_proj", "gate_proj", "q_proj", "k_proj", "v_proj", "o_proj"]`
  * `bias`: `"none"`


For additional information and implementation details, please refer to the [SWAP GitHub repository](https://github.com/xiongsiheng/SWAP).


## Citation
```bibtex
@inproceedings{xiong-etal-2025-deliberate,
    title = "Deliberate Reasoning in Language Models as Structure-Aware Planning with an Accurate World Model",
    author = "Xiong, Siheng and
      Payani, Ali and
      Yang, Yuan and
      Fekri, Faramarz",
    editor = "Che, Wanxiang and
      Nabende, Joyce and
      Shutova, Ekaterina and
      Pilehvar, Mohammad Taher",
    booktitle = "Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = jul,
    year = "2025",
    address = "Vienna, Austria",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2025.acl-long.1540/",
    doi = "10.18653/v1/2025.acl-long.1540",
    pages = "31900--31931",
    ISBN = "979-8-89176-251-0"
}
```