```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("SydneyK/dummy")
model = AutoModelForMaskedLM.from_pretrained("SydneyK/dummy")
```
Welcome to my model page!
# Model description

[info including architecture version here]

Original implementation by:

About the model: TL;DR info, training procedure, parameters, and disclaimers.

Disclaimer: This is a complete sandbox model and repo, for my own use only.
# Intended uses & limitations

- Use this for:
- Good for these languages:
- Recommended field and domain application:
# How to use

Here are some examples using `pipeline()`:
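As a minimal sketch of preparing input for a fill-mask pipeline: the example sentence and the BERT-style `[MASK]` token below are assumptions, not part of this repo; in practice the correct mask token comes from the model's tokenizer (`pipe.tokenizer.mask_token`).

```python
# Sketch: construct a fill-mask prompt. "[MASK]" is assumed here
# (BERT-style); RoBERTa-style tokenizers use "<mask>" instead.
# In practice, read the token from pipe.tokenizer.mask_token.
mask_token = "[MASK]"
prompt = f"The iris is a {mask_token} flower."  # hypothetical example sentence
print(prompt)
# Passing this string to pipe(prompt) returns the top token
# predictions for the masked position, each with a score.
```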
# Training data

I trained this on the Iris dataset (not really, though). It contains measurements and characteristics for three different species of iris flowers.
# Training procedure

Preprocessing and post-processing details to be added.

- Epochs: 4
- Batch size: 10
- Learning rate: ??
# Evaluation results

Decision threshold: 0.70
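A sketch of how that threshold could be applied to fill-mask pipeline output; the prediction dicts below are made up for illustration (real ones come from calling `pipe(...)`, which returns a list of dicts with `token_str` and `score` fields).

```python
# Hypothetical fill-mask predictions; the real output of pipe(prompt)
# is a list of dicts containing "token_str" and "score" keys.
predictions = [
    {"token_str": "capital", "score": 0.91},
    {"token_str": "city", "score": 0.05},
    {"token_str": "heart", "score": 0.02},
]

threshold = 0.70  # decision threshold from the evaluation above
# Keep only predictions whose score meets or exceeds the threshold.
accepted = [p["token_str"] for p in predictions if p["score"] >= threshold]
print(accepted)  # ['capital']
```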
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="SydneyK/dummy")
```