andstor committed on
Commit 014c50a · verified · 1 Parent(s): 484e23e

Update README.md

Files changed (1):
  1. README.md +50 -59

README.md CHANGED
@@ -32,11 +32,28 @@ pretty_name: PEFT Unit Test Generation Experiments
  size_categories:
  - n<1K
  ---
- # Dataset Card for Dataset Name
-
- <!-- Provide a quick summary of the dataset. -->
-
- This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).

  ## Dataset Details

@@ -138,7 +155,7 @@ This dataset card aims to be a base template for new datasets. It has been gener
  </table>

  #### Model-specific Hyperparameters
- <table >
  <thead>
  <tr>
  <th>Hyperparameter</th>
@@ -149,60 +166,34 @@ This dataset card aims to be a base template for new datasets. It has been gener
  </thead>
  <tbody>
  <tr>
- <td style="vertical-align: middle;">Targeted attention modules</td>
- <td style="vertical-align: middle;">LoRA, (IA)<sup>3</sup></td>
- <td>
- codegen-350M-multi<br>
- Salesforce/codegen2-1B_P<br>
- Salesforce/codegen2-3_7B_P<br>
- Salesforce/codegen2-7B_P<br>
- Salesforce/codegen2-16B_P<br>
- meta-llama/CodeLlama-7b-hf<br>
- bigcode/starcoderbase<br>
- bigcode/starcoder2-3b<br>
- bigcode/starcoder2-7b<br>
- bigcode/starcoder2-15b
- </td>
- <td>
- qkv_proj<br>
- qkv_proj<br>
- qkv_proj<br>
- qkv_proj<br>
- qkv_proj<br>
- q_proj, v_proj<br>
- c_attn<br>
- q_proj, v_proj<br>
- q_proj, v_proj<br>
- q_proj, v_proj
- </td>
- </tr>
- <tr>
- <td style="vertical-align: middle;">Targeted feedforward modules</td>
- <td style="vertical-align: middle;">(IA)<sup>3</sup></td>
- <td>
- codegen-350M-multi<br>
- Salesforce/codegen2-1B_P<br>
- Salesforce/codegen2-3_7B_P<br>
- Salesforce/codegen2-7B_P<br>
- Salesforce/codegen2-16B_P<br>
- meta-llama/CodeLlama-7b-hf<br>
- bigcode/starcoderbase<br>
- bigcode/starcoder2-3b<br>
- bigcode/starcoder2-7b<br>
- bigcode/starcoder2-15b
- </td>
- <td>
- fc_out<br>
- fc_out<br>
- fc_out<br>
- fc_out<br>
- fc_out<br>
- down_proj<br>
- mlp.c_proj<br>
- q_proj, c_proj<br>
- q_proj, c_proj<br>
- q_proj, c_proj
- </td>
- </tr>
  </tbody>
  </table>
 
  size_categories:
  - n<1K
  ---
+ # PEFT Unit Test Generation Experiments
+
+ ## Dataset description
+
+ The **PEFT Unit Test Generation Experiments** dataset contains metadata and details about a set of trained models used for generating unit tests with parameter-efficient fine-tuning (PEFT) methods. It covers models from multiple namespaces and of various sizes, trained with different tuning methods, providing a comprehensive resource for unit test generation research.
+
+ ## Dataset Structure
+
+ ### Data Fields
+
+ Each example in the dataset corresponds to a specific trained model variant and includes the following features:
+
+ | Feature Name | Description |
+ |-------------------|-------------|
+ | `model_type` | The type or architecture of the base model (e.g., codegen, starcoder). |
+ | `namespace` | The organization or group that created or published the base model (e.g., Salesforce, meta-llama). |
+ | `model_name` | The specific name or identifier of the model. |
+ | `training_method` | The training method used (e.g., full fine-tuning, or a PEFT method such as LoRA or (IA)³). |
+ | `model_size` | The size of the model, typically measured in number of parameters (e.g., 350M, 7B). |
+ | `trainable_params` | The number of trainable parameters for the specific tuning method applied, given the hyperparameters listed in [Training Hyperparameters](#training-hyperparameters). |
+ | `url` | A direct link to the model weights or relevant resource. |
+ | `doi` | The digital object identifier associated with the model. |
+
  ## Dataset Details
 
 
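For illustration, a single record shaped like the Data Fields table above might look like the following sketch. All values here are invented placeholders, not rows drawn from the actual dataset, and the `uses_peft` helper is a hypothetical convenience function:

```python
# Hypothetical example row; field names follow the Data Fields table,
# but every value is invented for illustration only.
record = {
    "model_type": "codegen",
    "namespace": "Salesforce",
    "model_name": "codegen-350M-multi",
    "training_method": "lora",
    "model_size": "350M",
    "trainable_params": 1310720,          # placeholder count
    "url": "https://huggingface.co/example",  # placeholder link
    "doi": "10.0000/example",                 # placeholder DOI
}

def uses_peft(row: dict) -> bool:
    # Assumption: full fine-tuning is labeled "full-fine-tuning";
    # any other training_method is treated as parameter-efficient.
    return row["training_method"] != "full-fine-tuning"
```

A filter like `[r for r in rows if uses_peft(r)]` would then select only the PEFT-trained variants.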
  </table>

  #### Model-specific Hyperparameters
+ <table>
  <thead>
  <tr>
  <th>Hyperparameter</th>

  </thead>
  <tbody>
  <tr>
+ <td rowspan="10" style="vertical-align: middle;">Targeted attention modules</td>
+ <td rowspan="10" style="vertical-align: middle;">LoRA, (IA)<sup>3</sup></td>
+ <td>codegen-350M-multi</td>
+ <td>qkv_proj</td>
+ </tr>
+ <tr><td>Salesforce/codegen2-1B_P</td><td>qkv_proj</td></tr>
+ <tr><td>Salesforce/codegen2-3_7B_P</td><td>qkv_proj</td></tr>
+ <tr><td>Salesforce/codegen2-7B_P</td><td>qkv_proj</td></tr>
+ <tr><td>Salesforce/codegen2-16B_P</td><td>qkv_proj</td></tr>
+ <tr><td>meta-llama/CodeLlama-7b-hf</td><td>q_proj, v_proj</td></tr>
+ <tr><td>bigcode/starcoderbase</td><td>c_attn</td></tr>
+ <tr><td>bigcode/starcoder2-3b</td><td>q_proj, v_proj</td></tr>
+ <tr><td>bigcode/starcoder2-7b</td><td>q_proj, v_proj</td></tr>
+ <tr><td>bigcode/starcoder2-15b</td><td>q_proj, v_proj</td></tr>
+ <tr>
+ <td rowspan="10" style="vertical-align: middle;">Targeted feedforward modules</td>
+ <td rowspan="10" style="vertical-align: middle;">(IA)<sup>3</sup></td>
+ <td>codegen-350M-multi</td>
+ <td>fc_out</td>
+ </tr>
+ <tr><td>Salesforce/codegen2-1B_P</td><td>fc_out</td></tr>
+ <tr><td>Salesforce/codegen2-3_7B_P</td><td>fc_out</td></tr>
+ <tr><td>Salesforce/codegen2-7B_P</td><td>fc_out</td></tr>
+ <tr><td>Salesforce/codegen2-16B_P</td><td>fc_out</td></tr>
+ <tr><td>meta-llama/CodeLlama-7b-hf</td><td>down_proj</td></tr>
+ <tr><td>bigcode/starcoderbase</td><td>mlp.c_proj</td></tr>
+ <tr><td>bigcode/starcoder2-3b</td><td>q_proj, c_proj</td></tr>
+ <tr><td>bigcode/starcoder2-7b</td><td>q_proj, c_proj</td></tr>
+ <tr><td>bigcode/starcoder2-15b</td><td>q_proj, c_proj</td></tr>
  </tbody>
  </table>
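The per-model target-module table above can be transcribed into a lookup, e.g. for building PEFT configurations programmatically. The dictionary names and the `ia3_target_modules` helper below are illustrative sketches, not part of the dataset; only the model names and module lists come from the table:

```python
# Attention modules targeted by LoRA and (IA)^3, per the table above.
ATTENTION_TARGET_MODULES = {
    "codegen-350M-multi": ["qkv_proj"],
    "Salesforce/codegen2-1B_P": ["qkv_proj"],
    "Salesforce/codegen2-3_7B_P": ["qkv_proj"],
    "Salesforce/codegen2-7B_P": ["qkv_proj"],
    "Salesforce/codegen2-16B_P": ["qkv_proj"],
    "meta-llama/CodeLlama-7b-hf": ["q_proj", "v_proj"],
    "bigcode/starcoderbase": ["c_attn"],
    "bigcode/starcoder2-3b": ["q_proj", "v_proj"],
    "bigcode/starcoder2-7b": ["q_proj", "v_proj"],
    "bigcode/starcoder2-15b": ["q_proj", "v_proj"],
}

# Feedforward modules targeted by (IA)^3, per the table above.
FEEDFORWARD_TARGET_MODULES = {
    "codegen-350M-multi": ["fc_out"],
    "Salesforce/codegen2-1B_P": ["fc_out"],
    "Salesforce/codegen2-3_7B_P": ["fc_out"],
    "Salesforce/codegen2-7B_P": ["fc_out"],
    "Salesforce/codegen2-16B_P": ["fc_out"],
    "meta-llama/CodeLlama-7b-hf": ["down_proj"],
    "bigcode/starcoderbase": ["mlp.c_proj"],
    "bigcode/starcoder2-3b": ["q_proj", "c_proj"],
    "bigcode/starcoder2-7b": ["q_proj", "c_proj"],
    "bigcode/starcoder2-15b": ["q_proj", "c_proj"],
}

def ia3_target_modules(model_name: str) -> list[str]:
    # (IA)^3 targets both attention and feedforward modules,
    # so combine the two lists for a given base model.
    return ATTENTION_TARGET_MODULES[model_name] + FEEDFORWARD_TARGET_MODULES[model_name]
```

For example, `ia3_target_modules("meta-llama/CodeLlama-7b-hf")` yields `["q_proj", "v_proj", "down_proj"]`, which could then be passed as the `target_modules` argument of a PEFT configuration.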