Dataset: stanfordnlp/imdb
This repository contains two BERT models fine-tuned for sentiment analysis, BERT-1 and BERT-2. Both are based on the bert-base-uncased pre-trained model from Hugging Face's Transformers library.
These models are intended for binary sentiment analysis of English text, classifying input as either positive or negative.
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# The two models live in subfolders of the repo, so pass subfolder= explicitly.
# Load BERT-1
tokenizer_bert1 = AutoTokenizer.from_pretrained("verneylmavt/bert-base-uncased_sentiment-analysis", subfolder="bert-1")
model_bert1 = AutoModelForSequenceClassification.from_pretrained("verneylmavt/bert-base-uncased_sentiment-analysis", subfolder="bert-1")

# Load BERT-2
tokenizer_bert2 = AutoTokenizer.from_pretrained("verneylmavt/bert-base-uncased_sentiment-analysis", subfolder="bert-2")
model_bert2 = AutoModelForSequenceClassification.from_pretrained("verneylmavt/bert-base-uncased_sentiment-analysis", subfolder="bert-2")
```
```python
from transformers import pipeline

# Initialize pipelines
sentiment_pipeline_bert1 = pipeline("sentiment-analysis", model=model_bert1, tokenizer=tokenizer_bert1)
sentiment_pipeline_bert2 = pipeline("sentiment-analysis", model=model_bert2, tokenizer=tokenizer_bert2)

# Sample text
text = "I absolutely loved this product! It exceeded my expectations."

# Get predictions
result_bert1 = sentiment_pipeline_bert1(text)
result_bert2 = sentiment_pipeline_bert2(text)

print("BERT-1 Prediction:", result_bert1)
print("BERT-2 Prediction:", result_bert2)
```
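Under the hood, the pipeline applies a softmax over the model's two output logits and reports the higher-probability class. A minimal sketch of that post-processing step (the logit values below are illustrative, not actual model outputs):

```python
import math

def softmax(logits):
    """Convert raw logits to probabilities that sum to 1."""
    shifted = [x - max(logits) for x in logits]  # subtract max for numerical stability
    exps = [math.exp(x) for x in shifted]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative logits for (negative, positive); real values come from the model.
logits = [-1.2, 2.3]
probs = softmax(logits)
label = "POSITIVE" if probs[1] > probs[0] else "NEGATIVE"
print({"label": label, "score": max(probs)})  # label "POSITIVE", score ≈ 0.97
```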
Training details:
- Learning rate: lr (value unspecified)
- Scheduler: get_linear_schedule_with_warmup
- Epochs: num_epochs = 3
- Weight decay: 0.01, applied to parameters requiring gradients
- Warmup: 10% of total steps
- Gradient clipping: max_norm=1.0
- Early stopping: patience of 2 epochs without improvement in validation loss; with num_epochs = 3, training may stop early

The models are distributed under the same license as the original bert-base-uncased model (Apache License 2.0).
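The early-stopping rule mentioned in the training details (stop after 2 epochs without improvement in validation loss) can be sketched as a small helper. This is an illustrative implementation of the stated rule, not code from this repository:

```python
class EarlyStopping:
    """Stop training after `patience` consecutive epochs without
    validation-loss improvement (patience = 2 in the setup above)."""

    def __init__(self, patience=2):
        self.patience = patience
        self.best_loss = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        """Record one epoch's validation loss; return True if training should stop."""
        if val_loss < self.best_loss:
            self.best_loss = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience

stopper = EarlyStopping(patience=2)
for epoch, loss in enumerate([0.52, 0.48, 0.49, 0.50], start=1):
    if stopper.step(loss):
        print(f"early stop at epoch {epoch}")  # prints "early stop at epoch 4"
        break
```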
Disclaimer: The models are provided "as is" without warranty of any kind. The author is not responsible for any outcomes resulting from the use of these models.
Base model: google-bert/bert-base-uncased