LightGPT

LightGPT is a generative language model based on the GPT-2 architecture, pre-trained on unpaired light chain antibody sequences from the Observed Antibody Space (OAS) database.
More information can be found in our GitHub repo.
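As a sketch of how the model might be used, the snippet below loads the checkpoint from the Hugging Face Hub with the transformers library and samples a continuation. This assumes the repository ships standard GPT-2 config and tokenizer files; the prompt string is a hypothetical partial light chain sequence, not a documented input format.

```python
# Minimal sketch: load LightGPT and sample an antibody sequence continuation.
# Assumptions: standard GPT-2 tokenizer/config files in the repo; the prompt
# "EIVLT" is an illustrative partial sequence, not a documented convention.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "leaBroe/LightGPT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode a partial sequence and sample up to 50 new tokens.
inputs = tokenizer("EIVLT", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_k=50)
generated = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated)
```

Sampling parameters such as `top_k` are placeholders; tune them to the diversity you need.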

Model size: 85.9M parameters (Safetensors, F32 tensors)

Model tree for leaBroe/LightGPT: 1 fine-tuned model