Upload model
Browse files
- README.md +32 -40
- pytorch_model.bin +1 -1
README.md
CHANGED

@@ -16,40 +16,32 @@ metrics:
 - recall
 - f1
 widget:
-- text:
-
-
-
-    of
-
-
-
-
-- text:
-
-
-
-
-
-
-    "(uncertain date), and the "Tantrasara "by Krishnananda Agamavagisha (late 16th
-    century).
-- text: A united opposition of fourteen political parties organized into the National
-    Opposition Union (Unión Nacional Oppositora, UNO) with the support of the United
-    States National Endowment for Democracy.
-- text: Lockheed said the U.S. Navy may also buy an additional 340 trainer aircraft
-    to replace its T34C trainers made by the Beech Aircraft Corp. unit of Raytheon
-    Corp.
+- text: In 2005, Shankel signed with Warner Chappell Music and while pursuing his
+    own projects created another joint venture, Shankel Songs and signed Ben Glover,
+    "Billboard "'s Christian writer of the Year, 2010, Joy Williams of The Civil Wars,
+    and, whom he also produced.
+- text: In 2002, Rodríguez moved to Mississippi and to the NASA Stennis Space Center
+    as the Director of Center Operations and as a member of the Senior Executive Service
+    where he managed facility construction, security and other programs for 4,500
+    Stennis personnel.
+- text: American Motors included Chinese officials as part of the negotiations establishing
+    Beijing Jeep (now Beijing Benz).
+- text: La Señora () is a popular Spanish television period drama series set in the
+    1920s, produced by Diagonal TV for Televisión Española that was broadcast on La
+    1 of Televisión Española from 2008 to 2010.
+- text: 'Not only did the Hungarian Ministry of Foreign Affairs approve Radio Free
+    Europe''s new location, but the Ministry of Telecommunications did something even
+    more amazing: "They found us four phone lines in central Budapest," says Geza
+    Szocs, a Radio Free Europe correspondent who helped organize the Budapest location.'
 pipeline_tag: token-classification
 co2_eq_emissions:
-  emissions: 67.
+  emissions: 67.93561835707102
   source: codecarbon
   training_type: fine-tuning
   on_cloud: false
   cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
   ram_total_size: 31.777088165283203
-  hours_used: 0.
+  hours_used: 0.52
   hardware_used: 1 x NVIDIA GeForce RTX 3090
 base_model: prajjwal1/bert-small
 model-index:
@@ -65,13 +57,13 @@ model-index:
       split: test
     metrics:
     - type: f1
-      value: 0.
+      value: 0.7547025470254703
       name: F1
    - type: precision
-      value: 0.
+      value: 0.7617641715116279
       name: Precision
     - type: recall
-      value: 0.
+      value: 0.7477706438380596
       name: Recall
 ---

@@ -105,8 +97,8 @@ This is a [SpanMarker](https://github.com/tomaarsen/SpanMarkerNER) model trained
 ### Metrics
 | Label   | Precision | Recall | F1     |
 |:--------|:----------|:-------|:-------|
-| **all** | 0.
-| ORG | 0.
+| **all** | 0.7618    | 0.7478 | 0.7547 |
+| ORG     | 0.7618    | 0.7478 | 0.7547 |

 ## Uses

@@ -118,7 +110,7 @@ from span_marker import SpanMarkerModel
 # Download from the 🤗 Hub
 model = SpanMarkerModel.from_pretrained("tomaarsen/span-marker-bert-small-orgs")
 # Run inference
-entities = model.predict("
+entities = model.predict("American Motors included Chinese officials as part of the negotiations establishing Beijing Jeep (now Beijing Benz).")
 ```

 ### Downstream Use
@@ -173,7 +165,7 @@ trainer.save_model("tomaarsen/span-marker-bert-small-orgs-finetuned")
 | Entities per sentence | 0 | 0.7865 | 39 |

 ### Training Hyperparameters
-- learning_rate:
+- learning_rate: 0.0001
 - train_batch_size: 128
 - eval_batch_size: 128
 - seed: 42
@@ -185,16 +177,16 @@ trainer.save_model("tomaarsen/span-marker-bert-small-orgs-finetuned")
 ### Training Results
 | Epoch  | Step | Validation Loss | Validation Precision | Validation Recall | Validation F1 | Validation Accuracy |
 |:------:|:----:|:---------------:|:--------------------:|:-----------------:|:-------------:|:-------------------:|
-| 0.5720 | 600 | 0.
-| 1.1439 | 1200 | 0.
-| 1.7159 | 1800 | 0.
-| 2.2879 | 2400 | 0.
-| 2.8599 | 3000 | 0.
+| 0.5720 | 600  | 0.0076          | 0.7642               | 0.6630            | 0.7100        | 0.9656              |
+| 1.1439 | 1200 | 0.0070          | 0.7705               | 0.7139            | 0.7411        | 0.9699              |
+| 1.7159 | 1800 | 0.0067          | 0.7837               | 0.7231            | 0.7522        | 0.9709              |
+| 2.2879 | 2400 | 0.0070          | 0.7768               | 0.7517            | 0.7640        | 0.9725              |
+| 2.8599 | 3000 | 0.0068          | 0.7877               | 0.7374            | 0.7617        | 0.9718              |

 ### Environmental Impact
 Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).
 - **Carbon Emitted**: 0.068 kg of CO2
-- **Hours Used**: 0.
+- **Hours Used**: 0.52 hours

 ### Training Hardware
 - **On Cloud**: No
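As a sanity check on the metric values added above: F1 should equal the harmonic mean of precision and recall, and the card's "0.068 kg of CO2" should match `co2_eq_emissions.emissions` (reported in grams by CodeCarbon) converted to kilograms and rounded. A quick check in plain Python, with values copied from the diff; the variable names are ours, not part of the card:

```python
# Values copied from the updated model card metadata above.
precision = 0.7617641715116279
recall = 0.7477706438380596
reported_f1 = 0.7547025470254703
emissions_g = 67.93561835707102  # co2_eq_emissions.emissions, in grams

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
assert abs(f1 - reported_f1) < 1e-6

# The prose figure is grams / 1000, rounded to three decimals.
assert round(emissions_g / 1000, 3) == 0.068
```

Both assertions hold, so the model-index values and the prose summary are internally consistent.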
pytorch_model.bin
CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:b4daa10ec601a3d8f804bad331c8c9b0b90846ea0e8bc44779c1b3405b163306
 size 115096015
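The card's inference snippet stops at `entities = model.predict(...)`. As a sketch of what downstream filtering could look like — assuming, as in recent SpanMarkerNER releases, that `predict` on a single sentence returns a list of dicts with `span`, `label`, `score`, and character-offset keys (verify against the SpanMarkerNER docs for your version) — the predictions below are hand-written stand-ins for the card's example sentence, not real model output:

```python
# Hand-written stand-ins shaped like SpanMarker's assumed `model.predict(sentence)`
# output; the scores and character offsets here are illustrative only.
predictions = [
    {"span": "American Motors", "label": "ORG", "score": 0.98,
     "char_start_index": 0, "char_end_index": 15},
    {"span": "Beijing Jeep", "label": "ORG", "score": 0.55,
     "char_start_index": 84, "char_end_index": 96},
]

def confident_orgs(entities, threshold=0.9):
    """Keep the text of ORG spans whose confidence clears the threshold."""
    return [e["span"] for e in entities
            if e["label"] == "ORG" and e["score"] >= threshold]

print(confident_orgs(predictions))                 # ['American Motors']
print(confident_orgs(predictions, threshold=0.5))  # ['American Motors', 'Beijing Jeep']
```

Thresholding on `score` is a common way to trade the card's ~0.76 precision against its ~0.75 recall for a given application.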