# Funcom-useloss

## Models Description
| Directory | Description |
| --------- | ----------- |
| CodeT5+ | CodeT5+ 220M models |
| llama | Llama models |
| gpt2 | GPT2 models, including models for ablation studies |
| pretrained | Pretrained purpose-built models for fine-tuning with SimiLE and BLEU loss |
Our [GitHub repo](https://github.com/apcl-research/funcom-useloss) contains the code for reproducing our results.
**CAVEAT**: At the time we released the Llama models, the original repo that we used to fine-tune them had been deleted.