Use with the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="BueormLLC/CleanGPT")
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("BueormLLC/CleanGPT")
model = AutoModel.from_pretrained("BueormLLC/CleanGPT")
CleanGPT

This is a clean model based on the GPT-2 small architecture. It has not been trained: the weights are randomly initialized.

Why an untrained model?

A model in this form is ready to be trained from scratch at any time. Training your own weights means you are not working on GPT-2 itself, whose dated training data can limit the performance you are able to extract from it.
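Because the released checkpoint is untrained, an equivalent clean model can also be instantiated locally from the GPT-2 small configuration. This is a minimal sketch using the `transformers` library's `GPT2Config` and `GPT2LMHeadModel` (instantiating from a config, rather than `from_pretrained`, yields randomly initialized weights):

```python
from transformers import GPT2Config, GPT2LMHeadModel

# GPT2Config() defaults match the GPT-2 small architecture:
# 12 layers, 12 attention heads, 768 hidden size, 50257-token vocabulary.
config = GPT2Config()

# Constructing the model from a config (not from_pretrained) gives
# randomly initialized weights -- the same "clean", untrained state.
model = GPT2LMHeadModel(config)

print(f"{model.num_parameters():,} parameters")  # roughly 124M, i.e. 0.1B
```

From here the model can be trained from scratch with the standard `Trainer` API or a custom training loop.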

Model details

- Format: Safetensors
- Model size: 0.1B parameters
- Tensor type: F32