RSE-BERT-base-10-rel is trained with 10 relations:

1) entailment
2) contradiction
3) neutral
4) duplicate_question
5) non_duplicate_question
6) paraphrase
7) same_caption
8) qa_entailment
9) qa_not_entailment
10) same_sent

The model is initialized from BERT-base-uncased and can be used to infer all ten relations.
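As a minimal sketch, the ten relation labels above can be kept as an ordered mapping for interpreting the model's relation outputs; note that the index-to-label order here simply follows the listing above and is an assumption, not a documented id scheme.

```python
# Relation labels supported by RSE-BERT-base-10-rel, in the order
# listed above. The numeric ids are an assumption for illustration.
RELATIONS = [
    "entailment",
    "contradiction",
    "neutral",
    "duplicate_question",
    "non_duplicate_question",
    "paraphrase",
    "same_caption",
    "qa_entailment",
    "qa_not_entailment",
    "same_sent",
]

# Lookup tables between relation ids and names.
ID2REL = dict(enumerate(RELATIONS))
REL2ID = {name: i for i, name in enumerate(RELATIONS)}

print(len(RELATIONS))   # 10
print(ID2REL[0])        # entailment
```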