sedrickkeh committed (verified)
Commit 72e286f · Parent(s): 0f4d375

Model save
README.md CHANGED
@@ -1,10 +1,9 @@
 ---
 library_name: transformers
-license: llama3
-base_model: meta-llama/Meta-Llama-3-8B
+license: llama3.1
+base_model: meta-llama/Llama-3.1-8B
 tags:
 - llama-factory
-- full
 - generated_from_trainer
 model-index:
 - name: OH_original_wo_airoboros
@@ -16,9 +15,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 # OH_original_wo_airoboros
 
-This model is a fine-tuned version of [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) on the mlfoundations-dev/OH_original_wo_airoboros dataset.
+This model is a fine-tuned version of [meta-llama/Llama-3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.6117
+- Loss: 0.6065
 
 ## Model description
 
@@ -56,9 +55,9 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:------:|:----:|:---------------:|
-| 0.6059        | 0.9977 | 323  | 0.6115          |
-| 0.5543        | 1.9985 | 647  | 0.6036          |
-| 0.5012        | 2.9931 | 969  | 0.6117          |
+| 0.604         | 0.9977 | 323  | 0.6096          |
+| 0.5577        | 1.9985 | 647  | 0.6017          |
+| 0.5096        | 2.9931 | 969  | 0.6065          |
 
 
 ### Framework versions
generation_config.json CHANGED
@@ -1,8 +1,8 @@
 {
+  "_from_model_config": true,
   "bos_token_id": 128000,
   "do_sample": true,
   "eos_token_id": 128001,
-  "max_length": 4096,
   "temperature": 0.6,
   "top_p": 0.9,
   "transformers_version": "4.45.2"
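To make the generation_config.json change concrete, here is a minimal sketch (using only the stdlib `json` module) that parses the updated file as shown in this commit and checks the two edits: `max_length` was dropped, so this file no longer caps generation at 4096 tokens, and `_from_model_config` was added, marking the config as derived from the model config:

```python
import json

# The updated generation_config.json from this commit.
new_config = json.loads("""
{
  "_from_model_config": true,
  "bos_token_id": 128000,
  "do_sample": true,
  "eos_token_id": 128001,
  "temperature": 0.6,
  "top_p": 0.9,
  "transformers_version": "4.45.2"
}
""")

# "max_length": 4096 was removed in this commit.
assert "max_length" not in new_config
# "_from_model_config": true was added in this commit.
assert new_config["_from_model_config"] is True
```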
model-00001-of-00004.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:f0b7fa4b3c25ac7fa66bac93061bb806c7618ea505690ed94f0ffab3a1efa01e
+oid sha256:ed94dc883cedbecc5325b66518ebef3054326ce33d80030f1eb5efc2fd8fbb14
 size 4976698672
model-00002-of-00004.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:1e884f05c095ad7062f8d2b8986a91ec67c47b95b4b2ad2d90ad33a568a3ffef
+oid sha256:84df5837519e305f0e1ac09b83b49a4e1193a9e913228118d7cc018eed36bcc0
 size 4999802720
model-00003-of-00004.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:db33c24d105b2c608e6843c2d1b2fd99ecb1f9544fb8516c5074041aaf8beaf2
+oid sha256:01596a4b72dd7b3308d10c2cb062aab13068e57aa1b588158b10ebea95262981
 size 4915916176
model-00004-of-00004.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:0cbc7430f63815ac63e0716f209a025e202f4d902ae1337c422ded96cdedb029
+oid sha256:38884f78f3166d4d15206485f42ac587d9fc6cf3252c079ee0bb41b135264426
 size 1168138808
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:ddbf9bcf83f94a500c98a0572385cb2d0e74084c7d1815e5478395d5dd429d2b
+oid sha256:f33c6f8dc7327ac28f65872ce166d3aa009527bb1ee114ccadc892a2a0601893
 size 7160
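The `.safetensors` and `training_args.bin` entries in this commit are Git LFS pointer files: the repository stores only a `sha256` object id and a byte size, not the weights themselves. A downloaded shard can be checked against its pointer. Below is a minimal sketch of such a check; `verify_lfs_pointer` is a hypothetical helper written for illustration, not part of Git LFS or any library:

```python
import hashlib
import os

def verify_lfs_pointer(local_path, expected_oid, expected_size):
    """Check a downloaded file against the oid/size fields of its
    Git LFS pointer. Returns True only if both match."""
    # Cheap check first: the pointer records the exact byte size.
    if os.path.getsize(local_path) != expected_size:
        return False
    # The oid is the SHA-256 of the file contents; hash in chunks
    # so multi-gigabyte shards do not need to fit in memory.
    h = hashlib.sha256()
    with open(local_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == expected_oid
```

For example, after downloading `model-00004-of-00004.safetensors` from this commit, one would call it with the pointer values above: `verify_lfs_pointer(path, "38884f78f3166d4d15206485f42ac587d9fc6cf3252c079ee0bb41b135264426", 1168138808)`.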