---
license: mit
base_model: cardiffnlp/twitter-roberta-base-sentiment-latest
language:
- en
library_name: transformers
tags:
- roberta
- sentiment-analysis
widget:
- text: This product is really great!
- text: This product is really bad!
---
# Fine-tuned RoBERTa for Sentiment Analysis on Reviews
This model is a fine-tuned version of [cardiffnlp/twitter-roberta-base-sentiment-latest](https://huggingface.co/cardiffnlp/twitter-roberta-base-sentiment-latest) on the [Amazon Reviews dataset](https://www.kaggle.com/datasets/bittlingmayer/amazonreviews) for sentiment analysis.
## Model Details
- **Model Name:** `AnkitAI/reviews-roberta-base-sentiment-analysis`
- **Base Model:** `cardiffnlp/twitter-roberta-base-sentiment-latest`
- **Dataset:** [Amazon Reviews](https://www.kaggle.com/datasets/bittlingmayer/amazonreviews)
- **Fine-tuning:** The base model was fine-tuned with a classification head for binary sentiment classification (positive vs. negative).
## Training
The model was trained using the following parameters:
- **Learning Rate:** 2e-5
- **Batch Size:** 16
- **Weight Decay:** 0.01
- **Evaluation Strategy:** Epoch
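The hyperparameters above map roughly onto a `transformers` `TrainingArguments` configuration. A minimal sketch, assuming the Hugging Face `Trainer` API was used; the output directory and epoch count are illustrative assumptions, not values stated in this card:

```python
from transformers import TrainingArguments

# Sketch of the training configuration implied by the values above.
# `output_dir` and `num_train_epochs` are illustrative assumptions.
training_args = TrainingArguments(
    output_dir="./results",            # assumed
    num_train_epochs=3,                # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    weight_decay=0.01,
    evaluation_strategy="epoch",
)
```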
### Training Details
- **Evaluation Loss:** 0.1049
- **Evaluation Runtime:** 3177.538 seconds
- **Evaluation Samples/Second:** 226.591
- **Evaluation Steps/Second:** 7.081
- **Training Runtime:** 110070.6349 seconds
- **Training Samples/Second:** 78.495
- **Training Steps/Second:** 2.453
- **Training Loss:** 0.0858
- **Evaluation Accuracy:** 97.19%
- **Evaluation Precision:** 97.9%
- **Evaluation Recall:** 97.18%
- **Evaluation F1 Score:** 97.19%
## Usage
You can use this model directly with the Hugging Face `transformers` library:
```python
import torch
from transformers import RobertaForSequenceClassification, RobertaTokenizer

model_name = "AnkitAI/reviews-roberta-base-sentiment-analysis"
model = RobertaForSequenceClassification.from_pretrained(model_name)
tokenizer = RobertaTokenizer.from_pretrained(model_name)

# Example usage
inputs = tokenizer("This product is great!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
predicted_class = outputs.logits.argmax(dim=-1).item()  # 1 = positive, 0 = negative
```
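The model returns raw logits rather than probabilities; applying a softmax over the two classes yields confidence scores. A minimal sketch in plain Python (the logit values here are made up for illustration, not actual model output):

```python
import math

def softmax(logits):
    """Convert a list of raw logits into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative logits: index 0 = negative, index 1 = positive
logits = [-2.1, 3.4]
probs = softmax(logits)
label = "positive" if probs[1] > probs[0] else "negative"
```

In practice you would pass `outputs.logits[0].tolist()` from the snippet above into `softmax` to get per-class probabilities.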
## License
This model is licensed under the [MIT License](LICENSE).