Fill-Mask · Transformers · PyTorch · English · structformer · custom_code
How to use from the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="omarmomen/transformer_base_final_2", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("omarmomen/babylm_tokenizer_32k")  # custom tokenizer (see below)
model = AutoModelForMaskedLM.from_pretrained("omarmomen/transformer_base_final_2", trust_remote_code=True, dtype="auto")
```
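As a quick check, the pipeline can be called on a sentence containing the mask placeholder. This is a minimal sketch: the example sentence is illustrative, and `<mask>` is assumed because the model uses a RoBERTa-style tokenizer.

```python
# Rank fillers for the masked position ("<mask>" assumed from the RoBERTa-style tokenizer)
predictions = pipe("The children played in the <mask>.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```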
Model Card for omarmomen/transformer_base_final_2

This model is part of the experiments in the paper "Increasing The Performance of Cognitively Inspired Data-Efficient Language Models via Implicit Structure Building" (https://aclanthology.org/2023.conll-babylm.29/), published at the BabyLM workshop at CoNLL 2023.

omarmomen/transformer_base_final_2 is a baseline vanilla transformer encoder.

The model is pretrained on the BabyLM 10M dataset using a custom-trained RobertaTokenizer (https://huggingface.co/omarmomen/babylm_tokenizer_32k).
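For a lower-level view than the pipeline above, here is a minimal sketch of manual mask prediction with this tokenizer/model pair. The example sentence and top-k value are illustrative, and it assumes the custom model class returns standard masked-LM outputs with a `.logits` field.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("omarmomen/babylm_tokenizer_32k")
model = AutoModelForMaskedLM.from_pretrained("omarmomen/transformer_base_final_2", trust_remote_code=True)

# Encode a sentence with the mask placeholder ("<mask>" for RoBERTa-style tokenizers)
inputs = tokenizer("The cat sat on the <mask>.", return_tensors="pt")

# Forward pass; logits have shape (batch, seq_len, vocab_size)
with torch.no_grad():
    logits = model(**inputs).logits

# Find the mask position and print the top-5 candidate tokens
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top5 = logits[0, mask_pos].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top5.tolist()))
```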

arXiv preprint: https://arxiv.org/abs/2310.20589
