Training in progress epoch 0
- README.md +6 -8
- config.json +41 -1
- tf_model.h5 +2 -2
README.md
CHANGED
@@ -15,10 +15,10 @@ probably proofread and complete it, then remove this comment. -->
 This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Train Loss:
-- Validation Loss: 3.
-- Train Accuracy: 0.
-- Epoch:
+- Train Loss: 3.6955
+- Validation Loss: 3.6968
+- Train Accuracy: 0.0111
+- Epoch: 0

 ## Model description

@@ -37,16 +37,14 @@ More information needed
 ### Training hyperparameters

 The following hyperparameters were used during training:
-- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps':
+- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 450, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
 - training_precision: float32

 ### Training results

 | Train Loss | Validation Loss | Train Accuracy | Epoch |
 |:----------:|:---------------:|:--------------:|:-----:|
-| 3.
-| 2.9958 | 3.0164 | 0.0 | 1 |
-| 2.9949 | 3.0166 | 0.0 | 2 |
+| 3.6955 | 3.6968 | 0.0111 | 0 |

 ### Framework versions
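The learning-rate entry in the optimizer dict above is a Keras `PolynomialDecay` schedule with `initial_learning_rate=2e-05`, `decay_steps=450`, `end_learning_rate=0.0`, `power=1.0`, and `cycle=False`. With `power=1.0` that is simply a linear ramp to zero. A minimal pure-Python sketch of the schedule (no TensorFlow dependency; the function name is illustrative, not part of the card):

```python
def polynomial_decay_lr(step, initial_lr=2e-05, decay_steps=450,
                        end_lr=0.0, power=1.0):
    """Mirror of Keras PolynomialDecay with cycle=False: the step is
    clamped to decay_steps, then the rate interpolates from
    initial_lr down to end_lr along (1 - step/decay_steps) ** power."""
    step = min(step, decay_steps)
    fraction = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * fraction ** power + end_lr

# With power=1.0 this is a straight linear decay:
print(polynomial_decay_lr(0))    # 2e-05 at the first step
print(polynomial_decay_lr(225))  # 1e-05 halfway through
print(polynomial_decay_lr(450))  # 0.0 once decay_steps is reached
```

Note that `decay_steps=450` pins the schedule to this specific run length; retraining for more epochs would need a matching new value.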
config.json
CHANGED
@@ -28,28 +28,68 @@
     "16": "February",
     "17": "pencil ",
     "18": "proof sign 1",
-    "19": "secret "
+    "19": "secret ",
+    "20": "distracted",
+    "21": "bookcase, bookshelf",
+    "22": "one day cricket, 50 overs format sign and explanation",
+    "23": "greedy ",
+    "24": "Allahabad Bank",
+    "25": "form sign 1",
+    "26": "market",
+    "27": "Bengali",
+    "28": "strength",
+    "29": "truck ",
+    "30": "day",
+    "31": "Usury",
+    "32": "beetroot",
+    "33": "fry, frying",
+    "34": "accept",
+    "35": "network",
+    "36": "paste",
+    "37": "identity card ",
+    "38": "none, nothing ",
+    "39": "Peru"
   },
   "initializer_range": 0.02,
   "label2id": {
     "4 four": 14,
+    "Allahabad Bank": 24,
+    "Bengali": 27,
     "February": 16,
     "Huawei Mobile": 15,
     "India": 0,
+    "Peru": 39,
     "Shiv Sena": 1,
+    "Usury": 31,
+    "accept": 34,
     "affiliate": 4,
+    "beetroot": 32,
+    "bookcase, bookshelf": 21,
     "dartboard": 12,
+    "day": 30,
+    "distracted": 20,
     "excuse": 10,
+    "form sign 1": 25,
+    "fry, frying": 33,
+    "greedy ": 23,
+    "identity card ": 37,
     "knife ": 2,
     "level sign 1": 7,
     "live in relationship": 6,
+    "market": 26,
     "musician ": 3,
+    "network": 35,
+    "none, nothing ": 38,
     "nurse ": 9,
+    "one day cricket, 50 overs format sign and explanation": 22,
+    "paste": 36,
     "pencil ": 17,
     "proof sign 1": 18,
     "raise ": 11,
     "secret ": 19,
+    "strength": 28,
     "swing ": 8,
+    "truck ": 29,
     "update news, information": 13,
     "washroom": 5
   },
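This commit grows `id2label` from 20 to 40 classes and mirrors each new entry in `label2id`; the two maps must stay exact inverses (`id2label` keys are JSON strings, `label2id` values are ints). A small sketch over a hypothetical subset of the entries above shows the round trip, and why the trailing spaces in labels like `"secret "` are significant:

```python
# Hypothetical subset of the expanded id2label map from this commit.
id2label = {
    "19": "secret ",
    "20": "distracted",
    "24": "Allahabad Bank",
    "39": "Peru",
}

# label2id is the exact inverse: swap keys/values, cast ids to int.
label2id = {label: int(idx) for idx, label in id2label.items()}

assert label2id["Peru"] == 39
# Trailing whitespace in labels such as "secret " is part of the key
# and must round-trip unchanged, or lookups will silently miss.
assert label2id["secret "] == 19
assert id2label[str(label2id["distracted"])] == "distracted"
```

Normalizing the stray trailing/embedded spaces in the label strings would avoid such lookup pitfalls, but would also change the class names the checkpoint was saved with.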
tf_model.h5
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:85cd92aae227004bed151925df841e91f547a6d6f924990fa4561973be7d309b
+size 268071880
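Because `tf_model.h5` is tracked with Git LFS, the diff touches only the three-line pointer file, not the ~268 MB weight blob itself: the pointer records the spec version, the content's sha256 oid, and its size in bytes. A minimal sketch of parsing such a pointer into its fields (the helper name is illustrative):

```python
def parse_lfs_pointer(text):
    """Split a Git LFS pointer file (spec v1) into its key/value
    lines: version, oid, size."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:85cd92aae227004bed151925df841e91f547a6d6f924990fa4561973be7d309b
size 268071880
"""

fields = parse_lfs_pointer(pointer)
print(fields["oid"])        # algorithm-prefixed content digest
print(int(fields["size"]))  # payload size in bytes (~256 MiB here)
```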