Self-trained GPT-2 tiny, with around 110M parameters. Weights are provided in safetensors format (F32).

The tokenizer is the one from https://huggingface.co/openai-community/gpt2.
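For quick use with the `transformers` library, here is a minimal loading sketch. The tokenizer id comes from the link above; `your-username/gpt2-tiny` is a hypothetical placeholder for this model's repo id, which the card does not state.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# The GPT-2 tokenizer named in this card.
tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2")

# Placeholder repo id -- replace with this model's actual Hub id.
model = AutoModelForCausalLM.from_pretrained("your-username/gpt2-tiny")

# Generate a short continuation from a prompt.
inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```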

The model was trained on around 40B tokens.

Evaluation is currently in progress.

License

This model is dual-licensed under the Apache 2.0 License and the MIT License; the terms of both apply.

Discord Server

Join our Discord server here.
