
# BART-perspectives Model

## Overview

The BART-perspectives model is a sequence-to-sequence Transformer model hosted on the Hugging Face Model Hub. Built on top of Facebook's BART-large (specifically the philschmid/bart-large-cnn-samsum finetune), it is designed to extract perspectives from text at scale. For each input, the model analyzes the speaker's identity, their emotions, the object of those emotions, and the reason behind them.
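The four extracted fields can be thought of as one record per perspective. As a minimal sketch, the `Perspective` dataclass below is illustrative only (it is not part of the model's or the `perspectives` library's API); it simply names the fields described above:

```python
from dataclasses import dataclass

@dataclass
class Perspective:
    """One extracted perspective: who feels what, about what, and why."""
    speaker: str  # the speaker's identity
    emotion: str  # the emotion expressed
    object: str   # what the emotion is directed at
    reason: str   # why the speaker feels this way

# Hypothetical example record
p = Perspective(speaker="a new parent", emotion="joy",
                object="their child's first steps",
                reason="a milestone was reached")
print(p.emotion)  # joy
```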

## Usage

It is designed to be used with the `perspectives` library:

```python
from perspectives import DataFrame

# Load a DataFrame from a list of sentences
df = DataFrame(texts=[list of sentences])

# Load the model
df.load_model()

# Extract perspectives
df.get_perspectives()

# Search by perspective fields
df.search(speaker='...', emotion='...')
```

You can also use this model directly with a `text2text-generation` pipeline (BART is a sequence-to-sequence model, so the causal `text-generation` task does not apply):

```python
from transformers import pipeline

# Load the model
generator = pipeline('text2text-generation', model='helliun/bart-perspectives')

# Get the perspective
perspective = generator("Describe the perspective of this text: <your text>", max_length=1024, do_sample=False)
print(perspective[0]['generated_text'])
```

You can also use it with `transformers.AutoTokenizer` and `transformers.AutoModelForSeq2SeqLM`:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("helliun/bart-perspectives")
model = AutoModelForSeq2SeqLM.from_pretrained("helliun/bart-perspectives")

# Tokenize the sentence
inputs = tokenizer.encode("Describe the perspective for this sentence: <your text>", return_tensors='pt')

# Generate with the model
results = model.generate(inputs)

# Decode the first generated sequence (results has shape [batch, seq_len])
decoded = tokenizer.decode(results[0], skip_special_tokens=True)
print(decoded)
```
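The decoded string bundles all four fields together. As a hedged sketch, assuming the model emits labeled `field: value` segments (the label names and delimiter here are assumptions, not documented output behavior), a small helper could split it into a dictionary:

```python
import re

def parse_perspective(text: str) -> dict:
    """Split a decoded perspective string into labeled fields.

    Assumes 'label: value' segments with the labels speaker, emotion,
    object, and reason; this format is an assumption, not documented
    behavior of the model.
    """
    parts = re.split(r"(speaker|emotion|object|reason)\s*:\s*", text)
    it = iter(parts[1:])  # parts[0] is any text before the first label
    return {label: value.strip() for label, value in zip(it, it)}

print(parse_perspective(
    "speaker: the author emotion: relief "
    "object: the release reason: it shipped on time"
))
```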

## Training and Evaluation

The model was fine-tuned on a diverse corpus encompassing multiple subjects and emotion ranges. Please refer to the model's card on Hugging Face for detailed information about specific datasets, metrics, training, and evaluation processes.

## Contributing and Support

Please raise an issue on the Hugging Face Model card if you encounter any problems using the model. Contributions like fine-tuning on additional data or improving the model architecture are always welcome!

## License

The model is open source and free to use under the MIT license.