Modified BERT-Base Model for an Open-Ended Conversation Project, MLM (uncased)

Pretrained model on English text using a masked language modeling (MLM) objective. Based on BERT's base-uncased variant, with the intent of fine-tuning and tweaking the model toward a specific goal. This model is uncased: it makes no distinction between english and English.
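A minimal sketch of the token-masking step behind the MLM objective mentioned above. This is illustrative only: real BERT pretraining operates on WordPiece subwords and also sometimes replaces a selected position with a random token or leaves it unchanged; the function name and 15% rate here follow the original BERT recipe but are not taken from this model's code.

```python
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Randomly replace ~mask_prob of the tokens with [MASK], BERT-style.

    Returns the masked sequence and per-position labels: the original
    token where it was masked (the model must predict it), else None
    (the position is not scored by the MLM loss).
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(MASK_TOKEN)
            labels.append(tok)
        else:
            masked.append(tok)
            labels.append(None)
    return masked, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, labels = mask_tokens(tokens)
print(masked)
```

During training, the model sees the masked sequence and is optimized to recover the original tokens at the masked positions.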

Disclaimer: I do not own the rights to the architecture or the base model structure; I modified it according to my needs. The model still derives from bert-base-uncased, taken from Hugging Face's Transformers library.

Dataset used to train Seraphiive/bert-personalized-PreAlpha-uncased