Self-trained GPT-2 tiny with around 110M parameters.
The tokenizer is the one from https://huggingface.co/openai-community/gpt2.
The model was trained on around 40B tokens.
Evaluation is currently in progress.
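
Below is a minimal loading sketch using the `transformers` library. The repository ID is a placeholder (the actual model path is not given in this card); the tokenizer can be loaded from `openai-community/gpt2` since that is the vocabulary this model uses.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Placeholder repo ID -- replace with the actual model repository.
repo_id = "your-username/gpt2-tiny"

# The tokenizer matches openai-community/gpt2, so it can be loaded from there.
tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2")
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Generate a short continuation from a prompt.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```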
License
This model is dual-licensed under the Apache 2.0 License and the MIT License; both licenses apply.
Discord Server
Join our Discord server here.