Instructions to use rootabytes/Rootal-Twi-ASR with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use rootabytes/Rootal-Twi-ASR with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM

base_model = AutoModelForSeq2SeqLM.from_pretrained(
    "katrintomanek/whisper-large-v3-turbo_Akan_standardspeech_specaugment"
)
model = PeftModel.from_pretrained(base_model, "rootabytes/Rootal-Twi-ASR")
```

- Notebooks
- Google Colab
- Kaggle
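Conceptually, the adapter that `PeftModel.from_pretrained` attaches is a LoRA update: each targeted weight matrix W is used as W + (alpha / r) · B·A, where A and B are small low-rank matrices stored in the adapter. Below is a minimal numerical sketch of that formula in pure Python; it is illustrative only and is not the actual `peft` implementation (all names here are made up for the example).

```python
# Illustrative sketch of a LoRA update: W_eff = W + (alpha / r) * (B @ A).
# Tiny hand-rolled matrices; NOT the peft library's internals.

def matmul(X, Y):
    """Plain nested-list matrix multiply."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def lora_effective_weight(W, A, B, alpha, r):
    """Apply a rank-r LoRA delta to a frozen base weight W."""
    delta = matmul(B, A)           # (d_out x r) @ (r x d_in): low-rank update
    scale = alpha / r              # standard LoRA scaling factor
    return [[W[i][j] + scale * delta[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

W = [[1.0, 0.0], [0.0, 1.0]]       # 2x2 frozen base weight
A = [[1.0, 2.0]]                   # r x d_in  = 1x2
B = [[0.5], [0.25]]                # d_out x r = 2x1
W_eff = lora_effective_weight(W, A, B, alpha=2.0, r=1)
print(W_eff)  # [[2.0, 2.0], [0.5, 2.0]]
```

Because the base weights stay frozen, the same base checkpoint can host many such adapters; loading this repo only downloads the small A/B matrices.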
Upload model

Changed file: `adapter_config.json` (+2, −2)
```diff
@@ -32,8 +32,8 @@
   "rank_pattern": {},
   "revision": null,
   "target_modules": [
-    "
-    "
+    "q_proj",
+    "v_proj"
   ],
   "target_parameters": null,
   "task_type": null,
```
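The change above points the adapter's `target_modules` at the attention query and value projections (`q_proj`, `v_proj`), the usual LoRA targets for Whisper-style models. If you want to confirm which modules an adapter targets before loading it, you can read the field straight out of `adapter_config.json` with the standard library; the JSON fragment below is reconstructed from the diff and shown inline for the example.

```python
import json

# adapter_config.json fragment after the change (reconstructed from the diff)
fragment = """
{
  "rank_pattern": {},
  "revision": null,
  "target_modules": ["q_proj", "v_proj"],
  "target_parameters": null,
  "task_type": null
}
"""

config = json.loads(fragment)
print(config["target_modules"])  # ['q_proj', 'v_proj']
```

In practice you would pass the path of the downloaded `adapter_config.json` to `json.load` instead of embedding the text.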