---
license: apache-2.0
datasets:
- databricks/databricks-dolly-15k
language:
- el
library_name: transformers
tags:
- text-generation-inference
pipeline_tag: text-generation
---
|
|
# Model Card for agrimi7.5B-dolly

This model is a supervised fine-tuned (SFT) version of Facebook's XGLM-7.5B, trained on a machine-translated Greek version of the databricks-dolly-15k dataset.

The purpose is to demonstrate that this pretrained model can adapt to an instruction-following mode using a relatively small dataset such as databricks-dolly-15k.

## Model Details

### Model Description

- **Developed by:** [Andreas Loupasakis](https://github.com/alup)
- **Model type:** Causal language model
- **Language(s) (NLP):** Greek (el)
- **License:** Apache-2.0
- **Finetuned from model:** [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B)
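
## How to Get Started with the Model

A minimal usage sketch with `transformers`. The repository id `alup/agrimi7.5B-dolly` and the Dolly-style prompt template below are assumptions not stated in this card; adjust them to the actual published checkpoint and the prompt format used during fine-tuning.

```python
# Hypothetical usage sketch: the repo id and prompt template are assumptions.

def build_prompt(instruction: str) -> str:
    """Wrap a (Greek) instruction in a Dolly-style prompt (assumed format)."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

if __name__ == "__main__":
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "alup/agrimi7.5B-dolly"  # assumed Hugging Face repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    # "What is machine learning?" in Greek
    prompt = build_prompt("Τι είναι η μηχανική μάθηση;")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.9)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```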