kaytoo2022 committed
Commit 9ec9dcb · 1 Parent(s): a8f4ee4

Training in progress epoch 0

Files changed (3)
  1. README.md +4 -6
  2. tf_model.h5 +1 -1
  3. tokenizer.json +6 -1
README.md CHANGED
@@ -15,9 +15,9 @@ probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Train Loss: 2.3173
-- Validation Loss: 2.1511
-- Epoch: 2
+- Train Loss: 3.2303
+- Validation Loss: 2.6701
+- Epoch: 0
 
 ## Model description
 
@@ -43,9 +43,7 @@ The following hyperparameters were used during training:
 
 | Train Loss | Validation Loss | Epoch |
 |:----------:|:---------------:|:-----:|
-| 2.4391 | 2.2175 | 0 |
-| 2.3682 | 2.1773 | 1 |
-| 2.3173 | 2.1511 | 2 |
+| 3.2303 | 2.6701 | 0 |
 
 
 ### Framework versions
tf_model.h5 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:ce9674de6e4f5e40aa61d1fbc11d4d1cc12d7b0db9dbf547b214a62104e2bbf5
+oid sha256:2425e863be7437e9927d3aac2c94d04511ecdcc90c5bb243a6d8243053a91984
 size 1188285040
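The tf_model.h5 entry above is a Git LFS pointer file rather than the weights themselves: it records the spec version, the SHA-256 of the actual blob, and its size in bytes (the oid changes in this commit while the size stays the same). A minimal stdlib-only sketch of parsing that pointer format, using the new oid from the diff:

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into a dict of its key/value lines."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# Pointer contents as they appear after this commit.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:2425e863be7437e9927d3aac2c94d04511ecdcc90c5bb243a6d8243053a91984
size 1188285040
"""

fields = parse_lfs_pointer(pointer)
print(fields["size"])  # 1188285040
```

This is only an illustration of the pointer layout; in practice Git LFS resolves the oid to the real file during checkout.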
tokenizer.json CHANGED
@@ -1,6 +1,11 @@
 {
   "version": "1.0",
-  "truncation": null,
+  "truncation": {
+    "direction": "Right",
+    "max_length": 128,
+    "strategy": "LongestFirst",
+    "stride": 0
+  },
   "padding": null,
   "added_tokens": [
     {
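The tokenizer.json change enables truncation: inputs longer than 128 tokens are cut from the right, using the "longest first" strategy for paired inputs, with no overlapping stride. A small stdlib-only sketch that reads the added config block and applies a toy right-truncation (the `truncate_right` helper is hypothetical, not the tokenizers library's actual implementation):

```python
import json

# Truncation block added to tokenizer.json in this commit (copied from the diff).
truncation_config = json.loads("""
{
    "direction": "Right",
    "max_length": 128,
    "strategy": "LongestFirst",
    "stride": 0
}
""")

def truncate_right(ids, max_length):
    """Toy illustration: keep the first max_length token ids, drop the rest."""
    return ids[:max_length]

ids = list(range(200))  # pretend token ids for an over-long input
truncated = truncate_right(ids, truncation_config["max_length"])
print(len(truncated))  # 128
```

With the tokenizers library itself, the equivalent runtime call would be along the lines of `tokenizer.enable_truncation(max_length=128)`; this commit simply bakes those settings into the serialized tokenizer file.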