Improve model card: Add `library_name` tag, paper link, and GitHub repository
This PR enhances the model card by:
- Adding the `library_name: transformers` metadata tag. This enables the automated "how to use" widget on the Hub, giving users a quick, verifiable code snippet for interacting with the model. Compatibility with the `transformers` library is evidenced by the model card's existing `Example` snippet, which imports `transformers`, and by `config.json`, which records `"transformers_version": "4.56.0"`.
- Updating the "Paper" section in the content to include a direct link to the paper on Hugging Face (`https://huggingface.co/papers/2509.14008`), improving discoverability.
- Adding a direct link to the GitHub repository (`https://github.com/Hala`) in the "Quick Links" section of the model card's content, making it easier for users to find the associated code.
These changes improve the model card's completeness and usability for the community.
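The frontmatter edit can be sanity-checked before merging. The sketch below is illustrative only (the `top_level_keys` helper and `FRONTMATTER` string are assumptions, a deliberately tiny stdlib-only check rather than a full YAML parser); it confirms the `library_name` key is present at the top level of the proposed metadata block:

```python
def top_level_keys(frontmatter: str) -> list[str]:
    """Collect top-level 'key:' names from a model-card YAML frontmatter
    block. A minimal helper for sanity checks, not a general YAML parser."""
    keys = []
    for line in frontmatter.splitlines():
        # Top-level keys start at column 0; skip list items and comments.
        if line and not line.startswith((" ", "-", "#")) and ":" in line:
            keys.append(line.split(":", 1)[0].strip())
    return keys


# Frontmatter as proposed in this PR.
FRONTMATTER = """\
base_model:
- QCRI/Fanar-1-9B-Instruct
datasets:
- hammh0a/Hala-4.6M-SFT
language:
- ar
license: cc-by-nc-4.0
pipeline_tag: text-generation
library_name: transformers
"""

assert "library_name" in top_level_keys(FRONTMATTER)
print(top_level_keys(FRONTMATTER))
```

With the tag present, the Hub can route the widget and `pipeline_tag` tooling to the `transformers` integration.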
@@ -1,12 +1,13 @@
 ---
-license: cc-by-nc-4.0
+base_model:
+- QCRI/Fanar-1-9B-Instruct
 datasets:
 - hammh0a/Hala-4.6M-SFT
 language:
 - ar
-base_model:
-- QCRI/Fanar-1-9B-Instruct
+license: cc-by-nc-4.0
 pipeline_tag: text-generation
+library_name: transformers
 ---
 
 # Hala: Arabic‑Centric Instruction & Translation Models
@@ -15,7 +16,7 @@ pipeline_tag: text-generation
 <img src="https://i.ibb.co/pvhp1XfJ/halalogo.png" alt="Hala logo" width="550" />
 </p>
 
-**Paper**:
+**Paper**: [Hala Technical Report: Building Arabic‑Centric Instruction & Translation Models at Scale](https://huggingface.co/papers/2509.14008)
 
 **Authors**: Hasan Abed Al Kader Hammoud\*, Mohammad Zbeeb\*, Bernard Ghanem
 
@@ -29,8 +30,9 @@ pipeline_tag: text-generation
 
 ## 🔗 Quick Links
 
-*
-*
+* **Code Repository**: [https://github.com/Hala](https://github.com/Hala)
+* **Models & Data (Hugging Face collection)**: [https://huggingface.co/collections/hammh0a/hala-68bf02b34a14b9f22305ab3a](https://huggingface.co/collections/hammh0a/hala-68bf02b34a14b9f22305ab3a)
+* **Contact**: [hasanabedalkader.hammoud@kaust.edu.sa](mailto:hasanabedalkader.hammoud@kaust.edu.sa)
 
 ---
 
@@ -68,7 +70,7 @@ print(out[0]["generated_text"])
 
 ### ≤2B parameters
 
-| Size | Model Name | Params | AlGhafa | ArabicMMLU | EXAMS | MadinahQA | AraTrust | ArbMMLU‑HT |
+| Size | Model Name | Params | AlGhafa | ArabicMMLU | EXAMS | MadinahQA | AraTrust | ArbMMLU‑HT | Average |
 | ---- | -------------------------------------- | -----: | ------: | ---------: | ----: | --------: | -------: | ---------: | -------: |
 | ≤2B | meta-llama/Llama-3.2-1B | 1B | 33.9 | 26.5 | 21.2 | 25.7 | 37.1 | 23.9 | 28.0 |
 | ≤2B | Qwen/Qwen2-1.5B-Instruct | 1.5B | 53.1 | 49.2 | 35.2 | 45.5 | 68.9 | 37.4 | 48.2 |
@@ -87,8 +89,8 @@ print(out[0]["generated_text"])
 
 ### 7B–9B parameters
 
-| Size | Model Name | Params | AlGhafa | ArabicMMLU | EXAMS | MadinahQA | AraTrust | ArbMMLU‑HT |
-| ---- | ------------------------------------------- | -----: | ------: | ---------: | ----: | --------: | -------: | ---------: |
+| Size | Model Name | Params | AlGhafa | ArabicMMLU | EXAMS | MadinahQA | AraTrust | ArbMMLU‑HT | Average |
+| ---- | ------------------------------------------- | -----: | ------: | ---------: | ----: | --------: | -------: | ---------: | -------: |
 | 7B–9B | CohereForAI/c4ai-command-r7b-arabic-02-2025 | 7B | 74.8 | 59.3 | 65.0 | 63.8 | 80.5 | 50.1 | 65.6 |
 | 7B–9B | JasperV13/Yehia-7B-DPO-Reasoning-preview | 7B | 75.1 | 66.3 | 51.8 | 54.9 | 81.9 | 55.1 | 64.2 |
 | 7B–9B | Navid-AI/Yehia-7B-preview | 7B | 70.8 | 64.9 | 52.1 | 54.4 | 87.5 | 53.4 | 63.9 |