hmm404/siamese1.0
README.md
CHANGED
@@ -4,72 +4,74 @@ tags:
 - sentence-similarity
 - feature-extraction
 - generated_from_trainer
-- dataset_size:
-- loss:
 base_model: sentence-transformers/all-mpnet-base-v2
 widget:
-- source_sentence:
   sentences:
-  - Most countries have an army, including civilized ones. Citizens in the army that
-    kill others in the line of army duties are not punished.
-- source_sentence: Ants directly "milk" aphids for their honeydew), so while saying
-    no other animal drinks another's milk is true, other animals do actively drink
-    other animals secretions in a symbiotic way.
   sentences:
   sentences:
-    Christian is thus to not participate either.
   sentences:
   sentences:
-  - In no other area of medicine would it be acceptable to try and tout non-legitimate
-    treatments as viable options.
 pipeline_tag: sentence-similarity
 library_name: sentence-transformers
 ---
 
 # SentenceTransformer based on sentence-transformers/all-mpnet-base-v2
 
-This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2)
 
 ## Model Details
@@ -79,9 +81,7 @@ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [s
 - **Maximum Sequence Length:** 384 tokens
 - **Output Dimensionality:** 768 dimensions
 - **Similarity Function:** Cosine Similarity
-- **Training
-  - train
-  - test
 <!-- - **Language:** Unknown -->
 <!-- - **License:** Unknown -->
@@ -119,9 +119,9 @@ from sentence_transformers import SentenceTransformer
 model = SentenceTransformer("sentence_transformers_model_id")
 # Run inference
 sentences = [
-    "
-    '
-    '
 ]
 embeddings = model.encode(sentences)
 print(embeddings.shape)
@@ -171,53 +171,29 @@ You can finetune this model on your own dataset.
 
 ## Training Details
 
-### Training
 
-####
 
-*
-*
-* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
 * Approximate statistics based on the first 1000 samples:
-  | |
-  | type | string | string |
-  | details | <ul><li>min: 3 tokens</li><li>mean:
 * Samples:
-  | <code>
-  | <code>
-  | <code>
-* Loss: [<code>
 ```json
 {
-  "distance_metric": "
-  "
 ```
-
-#### test
-
-* Dataset: test
-* Size: 6,471 training samples
-* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
-* Approximate statistics based on the first 1000 samples:
-  | | anchor | positive | negative |
-  |:--------|:---------|:---------|:---------|
-  | type | string | string | string |
-  | details | <ul><li>min: 7 tokens</li><li>mean: 27.97 tokens</li><li>max: 116 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 34.26 tokens</li><li>max: 124 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 35.75 tokens</li><li>max: 134 tokens</li></ul> |
-* Samples:
-  | anchor | positive | negative |
-  |:-------|:---------|:---------|
-  | <code>Theft only applies to instances involving the individual loss of property that then unjustly enriches another. Tax does not unjustly enrich any individual because it benefits \(and is necessary for\) society as a whole. Thus it is merely the just payment of a debt that all individuals owe to each other.</code> | <code>Taxation is required for the government to act in the economy, whether that be to provide infrastructure or essential services such as welfare.</code> | <code>Such a collective act can still meet the definition of theft; for instance if a government authorises its agents to take items from another country \(as in cases of historic looting\). Just because a collective benefits from an action does not necessarily justify that action.</code> |
-  | <code>Admitting more refugees can have adverse effects on host economies. US economic interests in Europe could subsequently be negatively impacted.</code> | <code>Admitting more refugees could increase the population of low skilled laborers. By the logic of the Stopler-Samuelson theorem, this would drive wages down for citizens of the host country.</code> | <code>Approximately half of the jobs available to workers in Europe can be automated mckinsey.com. Since technological advancement has greater potential for job displacement than refugee migration, policy makers should be wary of citing economic reasons against expanding refugee intake.</code> |
-  | <code>Private universities have historically been institutions of innovation in education, forcing them into an unfair marketplace would be a mistake.</code> | <code>Public universities are controlled by the state making politically controversial courses of study more difficult.</code> | <code>Plenty of innovation in education comes from public universities, they make up many of the examples of innovation cited in this report.</code> |
-* Loss: [<code>TripletLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#tripletloss) with these parameters:
-  ```json
-  {
-      "distance_metric": "TripletDistanceMetric.COSINE",
-      "triplet_margin": 0.3
 }
 ```
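The removed configuration above trained with TripletLoss using cosine distance and a margin of 0.3. For reference, a minimal NumPy sketch of the standard triplet objective; the distance values below are illustrative, not taken from this model's training run:

```python
import numpy as np

def triplet_loss(d_ap, d_an, margin=0.3):
    """Standard triplet loss: require the anchor-negative distance to exceed
    the anchor-positive distance by at least `margin`; otherwise penalize."""
    d_ap = np.asarray(d_ap, dtype=float)
    d_an = np.asarray(d_an, dtype=float)
    return float(np.maximum(0.0, d_ap - d_an + margin).mean())

# Two illustrative triplets (cosine distances); the first already satisfies
# the margin, so only the second contributes to the loss.
print(triplet_loss(d_ap=[0.1, 0.5], d_an=[0.9, 0.6], margin=0.3))  # ≈ 0.1
```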
@@ -347,30 +323,42 @@ You can finetune this model on your own dataset.
 ### Training Logs
 | Epoch | Step | Training Loss |
 |:------:|:-----:|:-------------:|
-(removed log rows; epoch, step, and loss values are truncated in this view)
 
 ### Framework Versions
@@ -399,15 +387,17 @@ You can finetune this model on your own dataset.
 }
 ```
 
-####
 ```bibtex
-@
 }
 ```
 - sentence-similarity
 - feature-extraction
 - generated_from_trainer
+- dataset_size:49273
+- loss:ContrastiveLoss
 base_model: sentence-transformers/all-mpnet-base-v2
 widget:
+- source_sentence: Evil is separate from the creator and results from creations after
+    they are created
   sentences:
+  - The Creation of an omniscient and omnipotent God, after it's created, remains
+    in "his" control by definition. The time dimension is part of the world \(God's
+    creation\), so the consequences to the creation as time goes by are part of the
+    creation itself. There can be no separation. If evil exists, it must be part of
+    such a God's plan.
+  - There aren't sufficient grounds for believing that the Bible contains divine revelation.
+  - Higher occurrence rate of informal or 'donkey' voting.
+- source_sentence: Liquid democracy will slow down governments even more, as all citizens
+    have to ponder about each and every issue and on top, if they conclude that they
+    do not want to vote, then about all the many possible experts for each and every
+    issue.
   sentences:
+  - Association with someone like Comey doesn't make a strong case for bias. Comey
+    himself maintained his objectivity long beyond what most people would be capable
+    of, until he was directly dragged into a necessarily political position by the
+    President himself during the private talk.
+  - Looking at the partial participation, growth and continued existence of both small
+    niches and huge groups on Reddit, Twitter, Facebook and tons of others, verified
+    accounts and admins on these systems are probably the best examples of a delegate
+    system. We can conclude that people will participate in the parts that interest
+    them and delegate the rest, resulting in the desired speed-up of democracy.
+  - There is some chicken-or-egg here. One might argue a welfare state leads to less
+    poverty, but another would argue welfare states are a luxury only affordable by
+    wealthy countries.
+- source_sentence: Renewable energy is a better option for replacing fossil fuels
+    than nuclear.
   sentences:
+  - Pulling out supports the impression he frequently conveys to his base that he's
+    deadly serious about America First). They seem to approve.
+  - Renewable energy sources are cheaper than nuclear.
+  - Abolishing inheritance might lead to children not treating their parents as well
+    as they would have otherwise.
+- source_sentence: Throughout the play, Claudius indicates that he does not fully
+    believe that Hamlet is mad.
   sentences:
+  - Claudius admits that Hamlet's behaviour is more indicative of depression or melancholy
+    than madness "What he spake, though it lack'd form a little. Was not like madness."
+    \(Act 3, Scene i\)
+  - AI is a software.
+  - If people think that their car will kill them, it will not stop them from hacking
+    their car, even if it lowered their car's security. People tend to be less focused
+    on downstream consequences when dealing with immediate problems. Further, they
+    could justify it by saying that the probability that they would even be hacked
+    is incredibly low, and the marginal loss of security is worth it.
+- source_sentence: As race is socially constructed, it means different things to different
+    people. As a result, there is no clear line which courts can use to differentiate
+    what is and isn't whitewashing.
   sentences:
+  - For some, it would be whitewashing to have a lighter skinned black person portraying
+    a person who had a darker skin tone in real life. For others, this would not be
+    the case.
+  - Some goods may only be actualizable by allowing the possibility of evil.
+  - Energy for EVs has to be produced in often environmentally unfriendly power plants,
+    such as coal or gas.
 pipeline_tag: sentence-similarity
 library_name: sentence-transformers
 ---
 
 # SentenceTransformer based on sentence-transformers/all-mpnet-base-v2
 
+This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
 
 ## Model Details
 - **Maximum Sequence Length:** 384 tokens
 - **Output Dimensionality:** 768 dimensions
 - **Similarity Function:** Cosine Similarity
+<!-- - **Training Dataset:** Unknown -->
 <!-- - **Language:** Unknown -->
 <!-- - **License:** Unknown -->
 model = SentenceTransformer("sentence_transformers_model_id")
 # Run inference
 sentences = [
+    "As race is socially constructed, it means different things to different people. As a result, there is no clear line which courts can use to differentiate what is and isn't whitewashing.",
+    'For some, it would be whitewashing to have a lighter skinned black person portraying a person who had a darker skin tone in real life. For others, this would not be the case.',
+    'Energy for EVs has to be produced in often environmentally unfriendly power plants, such as coal or gas.',
 ]
 embeddings = model.encode(sentences)
 print(embeddings.shape)
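Since the card lists Cosine Similarity as the similarity function, the three encoded vectors can be compared pairwise once you have them. A minimal NumPy sketch; the random `(3, 768)` array below is a stand-in for the embeddings returned by `model.encode`, so no model download is needed to follow the arithmetic:

```python
import numpy as np

# Stand-in for model.encode(sentences): three 768-dimensional vectors.
embeddings = np.random.default_rng(0).normal(size=(3, 768))

# Cosine similarity: L2-normalize each row, then take pairwise dot products.
norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
normalized = embeddings / norms
similarities = normalized @ normalized.T

print(similarities.shape)  # (3, 3); each diagonal entry is 1.0
```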
 
 ## Training Details
 
+### Training Dataset
 
+#### Unnamed Dataset
 
+* Size: 49,273 training samples
+* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
 * Approximate statistics based on the first 1000 samples:
+  | | sentence1 | sentence2 | label |
+  |:--------|:----------|:----------|:------|
+  | type | string | string | int |
+  | details | <ul><li>min: 3 tokens</li><li>mean: 30.31 tokens</li><li>max: 126 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 35.61 tokens</li><li>max: 180 tokens</li></ul> | <ul><li>0: ~54.80%</li><li>1: ~45.20%</li></ul> |
 * Samples:
+  | sentence1 | sentence2 | label |
+  |:----------|:----------|:------|
+  | <code>Pope Francis has tried to encourage Catholic priests to give communion to some divorced and remarried couples, or to families where unmarried parents are cohabiting. These statements have contributed to division among Catholics.</code> | <code>Pope Francis has stated that the church should apologise for its "blessing of many weapons", specifically against women, children, the LGBT community, and the poor.</code> | <code>1</code> |
+  | <code>Large areas of Earth are either not economically viable or are outright inhospitable without expensive long term terraforming projects to render them viable for proper habitation. As the population grows, supporting said population becomes exponentially more expensive per square meter.</code> | <code>Most likely people will retreat inland instead of creating 'water worlds', because it is easier and populations are well-established there.</code> | <code>1</code> |
+  | <code>Environmental and social wellbeing are ultimately in the best interests of businesses. Given enough freedom, businesses don't need to compromise society or the environment to thrive. Preserving these things will help them in the long run.</code> | <code>Nobody starts a business intending it to fail. They all must utilize sustainable business models. Any damage caused would be avoided through the government allowing companies what they need, while encouraging socially beneficial business practices, and enforcing individual liberty in the case a business attempts to infringe upon it. That’s as opposed to simply punishing those who step outside some arbitrary boundary with taxes and regulations, which kill otherwise perfectly good businesses.</code> | <code>1</code> |
+* Loss: [<code>ContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#contrastiveloss) with these parameters:
 ```json
 {
+    "distance_metric": "SiameseDistanceMetric.COSINE_DISTANCE",
+    "margin": 0.5,
+    "size_average": true
 }
 ```
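The contrastive loss configured above (Hadsell et al., 2006) pulls similar pairs together by their squared distance and pushes dissimilar pairs apart only while they sit inside the margin. A minimal NumPy sketch of one common formulation with these parameters (cosine distance, margin 0.5, size-averaged); the distance and label values are illustrative, not drawn from this model's data:

```python
import numpy as np

def contrastive_loss(distances, labels, margin=0.5):
    """One common contrastive-loss formulation (Hadsell et al., 2006):
    similar pairs (label=1) are penalized by squared distance; dissimilar
    pairs (label=0) are penalized only while closer than the margin."""
    d = np.asarray(distances, dtype=float)
    y = np.asarray(labels, dtype=float)
    per_pair = 0.5 * (y * d**2 + (1 - y) * np.maximum(0.0, margin - d) ** 2)
    return per_pair.mean()  # size_average=true -> mean over the batch

# Two similar pairs and two dissimilar pairs (cosine distances).
print(contrastive_loss([0.1, 0.4, 0.6, 0.2], [1, 1, 0, 0], margin=0.5))  # ≈ 0.0325
```

Note how the dissimilar pair at distance 0.6 contributes nothing: it already lies beyond the 0.5 margin.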
 ### Training Logs
 | Epoch | Step | Training Loss |
 |:------:|:-----:|:-------------:|
+| 0.0812 | 500 | 0.0309 |
+| 0.1623 | 1000 | 0.0281 |
+| 0.2435 | 1500 | 0.0279 |
+| 0.3247 | 2000 | 0.0272 |
+| 0.4058 | 2500 | 0.0257 |
+| 0.4870 | 3000 | 0.0272 |
+| 0.5682 | 3500 | 0.0265 |
+| 0.6494 | 4000 | 0.0261 |
+| 0.7305 | 4500 | 0.0252 |
+| 0.8117 | 5000 | 0.0255 |
+| 0.8929 | 5500 | 0.0261 |
+| 0.9740 | 6000 | 0.0254 |
+| 1.0552 | 6500 | 0.023 |
+| 1.1364 | 7000 | 0.0223 |
+| 1.2175 | 7500 | 0.0219 |
+| 1.2987 | 8000 | 0.0209 |
+| 1.3799 | 8500 | 0.0213 |
+| 1.4610 | 9000 | 0.0205 |
+| 1.5422 | 9500 | 0.0209 |
+| 1.6234 | 10000 | 0.0205 |
+| 1.7045 | 10500 | 0.0205 |
+| 1.7857 | 11000 | 0.0201 |
+| 1.8669 | 11500 | 0.0203 |
+| 1.9481 | 12000 | 0.0196 |
+| 2.0292 | 12500 | 0.0171 |
+| 2.1104 | 13000 | 0.0148 |
+| 2.1916 | 13500 | 0.0139 |
+| 2.2727 | 14000 | 0.0145 |
+| 2.3539 | 14500 | 0.0138 |
+| 2.4351 | 15000 | 0.0138 |
+| 2.5162 | 15500 | 0.0144 |
+| 2.5974 | 16000 | 0.0134 |
+| 2.6786 | 16500 | 0.0138 |
+| 2.7597 | 17000 | 0.014 |
+| 2.8409 | 17500 | 0.0133 |
+| 2.9221 | 18000 | 0.0136 |
 
 
 ### Framework Versions
 }
 ```
 
+#### ContrastiveLoss
 ```bibtex
+@inproceedings{hadsell2006dimensionality,
+  author={Hadsell, R. and Chopra, S. and LeCun, Y.},
+  booktitle={2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)},
+  title={Dimensionality Reduction by Learning an Invariant Mapping},
+  year={2006},
+  volume={2},
+  number={},
+  pages={1735-1742},
+  doi={10.1109/CVPR.2006.100}
+}
 ```
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:36e88c6340b3eb3a8fdb0db2c1aa25591bc89c86bd3ca79da7bff4b6b7bfea28
 size 437967672

runs/Feb16_03-04-22_fd002087ebe1/events.out.tfevents.1739675063.fd002087ebe1.667.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ab201d847dd13f6250fda40af30ceaddf79f49215a7d43974496d0259fad3ab6
+size 12347

training_args.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:20d502cd8247c5e637c7f4cf273aee02cb9cef7a21046455af267e6b7a831aa7
 size 5560