This model takes a scrambled or incoherent sentence as input and returns a meaningful sentence built from the same words, a form of grammar correction, if you will. It was trained on permuted sentences derived from Wikipedia pages as inputs, with the correctly ordered sentences as labels. It is an encoder-decoder model that uses BERT's weights in both its encoder and its decoder.
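As a sketch of how such training pairs can be built (hypothetical; the actual dataset construction pipeline is not shown here), one can permute the words of a sentence and keep the original sentence as the label:

```python
import random


def make_descrambling_pair(sentence, seed=0):
    """Return a (scrambled, original) training pair by permuting the words."""
    words = sentence.split()
    rng = random.Random(seed)  # fixed seed for a reproducible permutation
    scrambled = words[:]
    rng.shuffle(scrambled)
    return " ".join(scrambled), sentence


scrambled, label = make_descrambling_pair("the cat sat on the mat")
```

Because each pair uses exactly the same words, the model only has to learn to recover their order, not to generate new vocabulary.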
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("damilojohn/Bert2BertForTextDescrambling")
model = AutoModelForSeq2SeqLM.from_pretrained("damilojohn/Bert2BertForTextDescrambling")
```
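A minimal inference sketch, assuming the checkpoint loads as a standard seq2seq model; the scrambled input sentence and the generation settings below are illustrative, not part of the model card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("damilojohn/Bert2BertForTextDescrambling")
model = AutoModelForSeq2SeqLM.from_pretrained("damilojohn/Bert2BertForTextDescrambling")

# Example input: the same words as a coherent sentence, in shuffled order.
scrambled = "dog the over jumps fox brown quick the lazy"
inputs = tokenizer(scrambled, return_tensors="pt")

# Decode greedily; max_new_tokens caps the output length.
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The model should return the words rearranged into a grammatical sentence; exact outputs depend on the checkpoint and generation settings.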