Instructions to use KrorngAI/TrorYongASR-tiny with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use KrorngAI/TrorYongASR-tiny with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline(
    "automatic-speech-recognition",
    model="KrorngAI/TrorYongASR-tiny",
    trust_remote_code=True,
)

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "KrorngAI/TrorYongASR-tiny",
    trust_remote_code=True,
    dtype="auto",
)
```
- Notebooks
- Google Colab
- Kaggle
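The pipeline above can also be called on raw audio samples instead of a file path. A minimal sketch, assuming the model loads through the standard ASR pipeline; the one-second silent clip is only a stand-in for real speech, so the transcription it produces is not meaningful:

```python
import numpy as np
from transformers import pipeline

# Build the ASR pipeline for the model (downloads weights on first run).
pipe = pipeline(
    "automatic-speech-recognition",
    model="KrorngAI/TrorYongASR-tiny",
    trust_remote_code=True,
)

# The pipeline accepts a dict of raw samples plus a sampling rate;
# one second of silence at 16 kHz stands in for actual speech here.
audio = np.zeros(16000, dtype=np.float32)
result = pipe({"array": audio, "sampling_rate": 16000})
print(result["text"])
```

Passing `{"array": ..., "sampling_rate": ...}` avoids the ffmpeg dependency that decoding a file path would require.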
Update README.md

README.md (changed):
```diff
@@ -355,9 +355,11 @@ For small variant, the training took around 2 hours on 1 A100 GPU.
 
 ## Acknowledgement
 
-`LightningAI` and `Google Colab` did not specifically sponsor this project.
-
-So, huge thanks to `LightningAI` and `Google Colab`.
+[`LightningAI`](https://lightning.ai) and `Google Colab` did not specifically sponsor this project.
+But both models were trained thanks to their free credits.
+So, huge thanks to [`LightningAI`](https://lightning.ai) and `Google Colab`.
+
+Thanks to the authors of [`PARSeq`](https://github.com/baudm/parseq/tree/main) and [`Whisper`](https://github.com/openai/whisper/tree/main) for their publicly available source code.
 
 ## Model Card Contact
 
```