---
language: en
tags:
  - text-classification
  - pytorch
  - roberta
  - self-beliefs
  - multi-class-classification
  - multi-label-classification
license: mit
widget:
  - text: I am the coolest person I know.
---

## Overview

A model fine-tuned from roberta-large on a dataset of human- and LLM-annotated self-beliefs for multi-label classification.

## Training Details

Details on model training, hyper-parameters, and evaluation can be found in "Capturing Self-Beliefs in Natural Language" by Mangalik et al. (2024).

## Inference

A sample way to use this model for classification:

```python
import numpy as np
import torch
from transformers import RobertaForSequenceClassification, RobertaTokenizerFast

huggingface_model = 'sidmangalik/selfBERTa'
model = RobertaForSequenceClassification.from_pretrained(huggingface_model)
tokenizer = RobertaTokenizerFast.from_pretrained(huggingface_model)

texts = ["I am the coolest person I know."]

# Tokenize with padding/truncation to the model's 512-token limit.
inputs = tokenizer(texts, max_length=512, padding="max_length", truncation=True, return_tensors='pt')
with torch.no_grad():
    outputs = model(**inputs)
logits = outputs.logits

# Convert logits to probabilities, then take the highest-probability class per text.
soft_logits = torch.softmax(logits, dim=1).tolist()
predicted_classes = np.argmax(soft_logits, axis=1)
```
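The final softmax/argmax step is independent of the model itself; a minimal sketch of that step with hypothetical logit values (illustrative only, no model download required):

```python
import numpy as np

# Hypothetical logits for two input texts over three classes (values are made up).
logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.5, 0.3]])

# Softmax converts each row of logits into a probability distribution
# (subtracting the row max first for numerical stability).
exp = np.exp(logits - logits.max(axis=1, keepdims=True))
probs = exp / exp.sum(axis=1, keepdims=True)

# The predicted class is the index of the highest probability in each row.
predicted_classes = probs.argmax(axis=1)
print(predicted_classes)  # → [0 1]
```

Each row of `probs` sums to 1, so the argmax over probabilities is the same as the argmax over the raw logits; the softmax is only needed if you want calibrated-looking scores.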