selbainu24 committed
Commit 99286c6 · 1 Parent(s): e22236d

Upload model

Files changed (2):
  1. README.md +24 -0
  2. adapter_model.safetensors +3 -0
README.md CHANGED
@@ -202,6 +202,28 @@ The following `bitsandbytes` quantization config was used during training:
 - bnb_4bit_use_double_quant: False
 - bnb_4bit_compute_dtype: bfloat16
 
+The following `bitsandbytes` quantization config was used during training:
+- load_in_8bit: False
+- load_in_4bit: True
+- llm_int8_threshold: 6.0
+- llm_int8_skip_modules: None
+- llm_int8_enable_fp32_cpu_offload: False
+- llm_int8_has_fp16_weight: False
+- bnb_4bit_quant_type: nf4
+- bnb_4bit_use_double_quant: False
+- bnb_4bit_compute_dtype: bfloat16
+
+The following `bitsandbytes` quantization config was used during training:
+- load_in_8bit: False
+- load_in_4bit: True
+- llm_int8_threshold: 6.0
+- llm_int8_skip_modules: None
+- llm_int8_enable_fp32_cpu_offload: False
+- llm_int8_has_fp16_weight: False
+- bnb_4bit_quant_type: nf4
+- bnb_4bit_use_double_quant: False
+- bnb_4bit_compute_dtype: bfloat16
+
 The following `bitsandbytes` quantization config was used during training:
 - load_in_8bit: False
 - load_in_4bit: True
@@ -235,5 +257,7 @@ The following `bitsandbytes` quantization config was used during training:
 - PEFT 0.4.0
 - PEFT 0.4.0
 - PEFT 0.4.0
+- PEFT 0.4.0
+- PEFT 0.4.0
 
 - PEFT 0.4.0
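The settings in the diff above are the fields of a `transformers` `BitsAndBytesConfig`. As a minimal, dependency-free sketch (plain Python dict, no `torch`/`transformers` required), here is the same config collected into one place, with a small helper that reproduces the `- key: value` bullet format PEFT appends to README.md:

```python
# Sketch only: the bitsandbytes settings from the README diff, as a plain dict.
bnb_config = {
    "load_in_8bit": False,
    "load_in_4bit": True,
    "llm_int8_threshold": 6.0,
    "llm_int8_skip_modules": None,
    "llm_int8_enable_fp32_cpu_offload": False,
    "llm_int8_has_fp16_weight": False,
    "bnb_4bit_quant_type": "nf4",
    "bnb_4bit_use_double_quant": False,
    "bnb_4bit_compute_dtype": "bfloat16",
}

def render_readme_block(config: dict) -> str:
    """Render the config in the same '- key: value' style seen in the diff."""
    header = ("The following `bitsandbytes` quantization config "
              "was used during training:")
    lines = [header] + [f"- {k}: {v}" for k, v in config.items()]
    return "\n".join(lines)
```

Note how the repeated blocks in the diff come about: PEFT re-appends this rendered block on each save, which is why the README accumulates identical copies.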
adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ee5b7cf71ac4841ca53d9bebe2b21adb26cb14286d4c3ddeb5b2f09d14e6ea86
+size 16794200
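The three lines added for adapter_model.safetensors are a Git LFS pointer file, not the weights themselves: the real binary is stored out of band and identified by its SHA-256 and byte size. A small illustrative parser (hypothetical helper name, stdlib only) for such a pointer:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split the 'key value' lines of a git-lfs pointer; 'size' becomes an int."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    fields["size"] = int(fields["size"])
    return fields

# The exact pointer content added in this commit.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:ee5b7cf71ac4841ca53d9bebe2b21adb26cb14286d4c3ddeb5b2f09d14e6ea86
size 16794200
"""
info = parse_lfs_pointer(pointer)
```

After downloading the actual file, one could check it against `info["oid"]` (hash of the contents) and `info["size"]` (expected byte count) to verify the transfer.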