Commit 34ca02a (verified) by axmeeabdhullo
Parent(s): 5cece9d

Update README.md

Files changed (1):
  1. README.md +3 -46

README.md CHANGED
@@ -24,7 +24,7 @@ Axya-Mini is a lightweight, efficient adapter-based language model specifically
  **Model Type:** Adapter-based Fine-tuned Model
  **Base Model:** GPT-2 (openai-community/gpt2)
  **Language:** Dhivehi (ދިވެހި)
- **Framework:** Quetzal (Revolutionary CPU-optimized training library) + Transformers with Adapter-Transformers**Format:** Safetensors
+ **Framework:** Quetzal (Revolutionary CPU-optimized training library)
 
  ## Key Features
 
@@ -75,49 +75,6 @@ pip install quetzal-ai
  **Why Quetzal for Dhivehi?**
  Quetzal is specifically optimized for low-resource languages like Dhivehi, enabling efficient model training and deployment without requiring expensive GPU infrastructure. This makes it ideal for building NLP models for endangered or underrepresented languages.
 
- ## Evaluation Results
-
- | Metric | Score |
- |--------|-------|
- | Accuracy | [To be filled with benchmark results] |
-
- ## Usage
-
- ### Installation
-
- ```bash
- pip install transformers adapter-transformers torch
- ```
-
- ### Quick Start
-
- ```python
- from transformers import AutoTokenizer, AutoModelForCausalLM
- from adapters import AutoAdapterModel
-
- # Load the model and tokenizer
- tokenizer = AutoTokenizer.from_pretrained("axmeeabdhullo/axya-mini")
- model = AutoAdapterModel.from_pretrained("axmeeabdhullo/axya-mini")
-
- # Load the adapter
- model.load_adapter("axmeeabdhullo/axya-mini")
- model.set_active_adapters("axya-mini")
-
- # Generate text
- input_text = "ސްވާލުވި ވާ" # Example Dhivehi text
- inputs = tokenizer(input_text, return_tensors="pt")
- outputs = model.generate(**inputs, max_length=100)
- response = tokenizer.decode(outputs[0], skip_special_tokens=True)
- print(response)
- ```
-
- ### Advanced Usage
-
- ```python
- # For question-answering tasks
- question = "ދިވެހިރާއްޖެ ރାষްޓްރީ ގަވާސް ކެވެ?" # What is Maldives' national language?
- answer = model.generate(...)
- ```
 
  ## Model Performance
 
@@ -147,7 +104,7 @@ If you use this model, please cite:
  @model{axya_mini,
  author = {Abdhullo, Axmee},
  title = {Axya-Mini: Dhivehi Language Question-Answering Model},
- year = {2024},
+ year = {2025},
  publisher = {Hugging Face Model Hub},
  howpublished = {https://huggingface.co/axmeeabdhullo/axya-mini}
  }
@@ -168,7 +125,7 @@ This model is licensed under the Apache License 2.0. See the LICENSE file for de
 
  **Axmee Abdhullo**
  AI/ML Developer specializing in Dhivehi NLP
- [GitHub](https://github.com) | [Hugging Face](https://huggingface.co/axmeeabdhullo)
+ [Hugging Face](https://huggingface.co/axmeeabdhullo)
 
  ## Contact & Support
 