Update README.md
README.md CHANGED
@@ -4,19 +4,9 @@ language:
 - ga
 - en
 base_model:
-- jmcinern/
+- jmcinern/Qomhra
 pipeline_tag: question-answering
 ---
 
 # Model
-
-## Acknowledgements
-I would like to thank my academic supervisor Dr. Barry Devereux of Queen's University Belfast for his sustained guidance on rigorous experimentation. I would also like to thank the Abair (abair.ie) research group for their financial support and Irish language technology expertise. I would like to thank Jerry Sweeney, Managing Director at CloudCIX Limited, for the generous donation of computing resources. Finally, I would like to thank Khan-Tung Tran, PhD Candidate at University College Cork, for open-sourcing benchmarks and training data, and for his guidance.
-## Citation
-
-```bibtex
-@misc{mcinerney2025qomhra,
-  author = {Joseph McInerney},
-  title  = {Qomhrá: A Bilingual Irish–English Large Language Model},
-  year   = {2025}
-}
+An activation-aware quantized (AWQ) version of Qomhrá, focused on retaining Irish and English performance while reducing memory overhead for inference.
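The new description cites reduced memory overhead as the point of the AWQ release. As a rough back-of-envelope sketch of where that saving comes from (the parameter count and per-weight overhead below are illustrative assumptions, not measured figures for Qomhrá):

```python
# Rough weight-storage estimate (illustrative only, not a measurement of
# Qomhra): fp16 stores 16 bits per weight, while 4-bit AWQ stores roughly
# 4 bits plus a small overhead for group scales and zero points.
def weight_memory_gib(n_params: float, bits_per_weight: float) -> float:
    """Approximate weight storage in GiB for n_params parameters."""
    return n_params * bits_per_weight / 8 / 2**30

n = 8e9  # assumed parameter count for illustration
fp16 = weight_memory_gib(n, 16)    # ~14.9 GiB
awq4 = weight_memory_gib(n, 4.25)  # ~4.0 GiB (0.25 bit assumed for scales)
print(f"fp16: {fp16:.1f} GiB, AWQ 4-bit: {awq4:.1f} GiB")
```

Activations, KV cache, and runtime buffers add to this, so the end-to-end saving at inference time is smaller than the raw weight ratio suggests.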