Rename README.md to Glossa-BART_v2_model_card.md

README.md → Glossa-BART_v2_model_card.md (renamed)

````diff
@@ -22,7 +22,7 @@ metrics:
 
 # BART_SIGN2ENG_refinetuned
 
-**`rrrr66254/
+**`rrrr66254/Glossa-BART_v2`** is a continuation of the previous fine-tuned model [`Glossa-BART`](https://huggingface.co/rrrr66254/BART_SIGN2ENG_finetuned), further trained to improve gloss-to-English translation accuracy using early stopping, improved validation splitting, and more training epochs.
 
 
 
@@ -36,7 +36,7 @@ This model takes **sign language gloss** input and outputs **natural English sen
 - **Developed by:** Dongjun Kim
 - **Model type:** Text2Text Generation, Gloss2Eng
 - **Language(s) (NLP):** English
-- **Finetuned from model:** rrrr66254/
+- **Finetuned from model:** rrrr66254/Glossa-BART
 
 ## Uses
 
@@ -171,7 +171,7 @@ The following automatic evaluation metrics were used to assess output fluency, a
 The model was evaluated on a held-out validation set using BLEU, ROUGE, and BERTScore.
 Below is a comparison between the newly fine-tuned model and the previous version:
 
-| Metric | Previous Model (`
+| Metric            | Previous Model (`Glossa-BART`) | This Model (`Glossa-BART_v2`) |
 |-------------------|---------------------------------------------|-----------------------------|
 | **BLEU-1**        | 0.7063                                      | 0.7258                      |
 | **BLEU-2**        | 0.6175                                      | 0.6552                      |
@@ -257,17 +257,17 @@ Model was trained using Google Colab:
 If you use this model in your work, please cite the Hugging Face model page:
 
 ```bibtex
-@misc{
-  title = {
+@misc{rrrr66254Glossa-BART_v2_2025,
+  title = {Glossa-BART_v2},
   author = {rrrr66254},
   year = {2025},
   publisher = {Hugging Face},
-  howpublished = {\url{https://huggingface.co/rrrr66254/
+  howpublished = {\url{https://huggingface.co/rrrr66254/Glossa-BART_v2}}
 }
 ```
 **APA:**
 
-Kim, J. (2025). *
+Kim, J. (2025). *Glossa-BART_v2* [Computer software]. Hugging Face. https://huggingface.co/rrrr66254/Glossa-BART_v2
 
 ---
 
````
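The BLEU-n figures compared in the diff's metrics table are n-gram precision scores. The card does not include the evaluation script, so as a rough illustration only, here is a minimal sentence-level BLEU-1 sketch (clipped unigram precision with a brevity penalty), assuming plain whitespace tokenization rather than whatever tokenizer the actual evaluation used:

```python
from collections import Counter
import math

def bleu1(candidate: str, reference: str) -> float:
    """Sentence-level BLEU-1 sketch: clipped unigram precision x brevity penalty."""
    cand = candidate.split()
    ref = reference.split()
    if not cand:
        return 0.0
    ref_counts = Counter(ref)
    # Clipped matches: each candidate token counts at most as often
    # as it appears in the reference.
    overlap = sum(min(c, ref_counts[tok]) for tok, c in Counter(cand).items())
    precision = overlap / len(cand)
    # Brevity penalty discourages overly short candidates.
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * precision

print(bleu1("the cat sat on the mat", "the cat sat on the mat"))  # → 1.0
```

Reported BLEU numbers are typically corpus-level and computed with a standard package (e.g. sacreBLEU or NLTK), so scores from this sketch are not directly comparable to the table above.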