---
library_name: transformers
datasets:
- alessioGalatolo/AMAeval
language:
- en
base_model:
- Qwen/Qwen2.5-3B-Instruct
---
# Model Card for AMAEval
This is the classifier used as the dynamic benchmark for [AMAeval](https://github.com/alessioGalatolo/AMAeval). Given appropriate reasoning as input, it outputs a score in \[0,1\].
## Model Details
### Model Description
This is a fine-tuned version of [Qwen/Qwen2.5-3B-Instruct](https://huggingface.co/Qwen/Qwen2.5-3B-Instruct).
### Model Sources
- **Repository:** [https://github.com/alessioGalatolo/AMAeval](https://github.com/alessioGalatolo/AMAeval)
- **Paper:** TBA
## Uses
This model should be used as described in the [AMAeval repository](https://github.com/alessioGalatolo/AMAeval). Do not use the model outside its intended purpose.
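As a minimal sketch of how such a classifier might be queried (not the official evaluation pipeline; the Hub model ID, the sequence-classification head, and the sigmoid score mapping below are all assumptions — follow the AMAeval repository for the canonical procedure):

```python
import math


def logit_to_score(logit: float) -> float:
    """Squash a raw logit into the [0, 1] score range via the sigmoid."""
    return 1.0 / (1.0 + math.exp(-logit))


def score_reasoning(text: str, model_name: str = "alessioGalatolo/AMAEval") -> float:
    """Hypothetical helper: assumes a standard sequence-classification head.

    The actual head type, model ID, and score mapping may differ; the
    AMAeval repository defines the canonical evaluation procedure.
    """
    # Imported lazily so the pure scoring helper above stays dependency-free.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return logit_to_score(logits.squeeze().item())
```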
## Training Details
### Training Data
The dataset used to train this model is available at [alessioGalatolo/AMAeval](https://huggingface.co/datasets/alessioGalatolo/AMAeval).
## Citation
**BibTeX:**
```bibtex
@incollection{galatolo2025amaeval,
  title     = {Beyond Ethical Alignment: Evaluating LLMs as Artificial Moral Assistants},
  author    = {Galatolo, Alessio and Rappuoli, Luca Alberto and Winkle, Katie and Beloucif, Meriem},
  booktitle = {ECAI 2025},
  pages     = {},  % TBA
  year      = {2025},
  publisher = {IOS Press}
}
```