# Funcom-useloss
## Models Description
| Directory | Description |
| --------- | ----------- |
| CodeT5+ | CodeT5+ 220M models |
| llama | Llama models |
| gpt2 | GPT-2 models, including models for ablation studies |
| pretrained | Pretrained purpose-built models for fine-tuning with SIMILE and BLEU loss |
Our [GitHub repo](https://github.com/apcl-research/funcom-useloss) contains the code to reproduce our results.

**CAVEAT**: By the time we released the Llama models, the original repository we had used to fine-tune them had been deleted.