saishshinde15 committed · Commit 625bb2b · verified · 1 Parent(s): 6d729d0

Update README.md

Files changed (1): README.md (+6 −2)
README.md CHANGED

@@ -5,6 +5,8 @@ language:
 - en
 metrics:
 - accuracy
+- code_eval
+library_name: adapter-transformers
 ---
 # Model Card for SentimentTensor
 
@@ -76,14 +78,17 @@ predicted_label = outputs.logits.argmax().item()
 # Example Usage
 
 #Load the model and tokenizer
+
 model = AutoModelForSequenceClassification.from_pretrained("your-model-name")
 tokenizer = AutoTokenizer.from_pretrained("your-tokenizer-name")
 
 #Tokenize text data
+
 text = "This is a great movie!"
 tokenized_input = tokenizer(text, return_tensors="pt")
 
 #Perform sentiment analysis
+
 outputs = model(**tokenized_input)
 predicted_label = outputs.logits.argmax().item()
 
@@ -97,5 +102,4 @@ print(f"Predicted Sentiment: {sentiment_labels[predicted_label]}")
 The SentimentTensor model is based on LSTM architecture, which is well-suited for sequence classification tasks like sentiment analysis. It uses long short-term memory cells to capture dependencies in sequential data.
 
 # Model Card Authors
-Saish Shinde
-
+Saish Shinde
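A note on the last step of the README's example: `outputs.logits.argmax().item()` picks the index of the highest-scoring class, which a label list then maps to a readable string. A minimal sketch of that mapping with a hand-built logits tensor — the values and the Negative/Positive label order are illustrative assumptions, not taken from the actual model:

```python
import torch

# Stand-in for outputs.logits from the model: batch of 1, 2 classes.
# The values here are illustrative only.
logits = torch.tensor([[-1.3, 2.7]])

# argmax picks the index of the highest-scoring class;
# .item() converts the 0-d tensor to a plain Python int.
predicted_label = logits.argmax().item()

# Assumed two-class label order; the real order depends on training.
sentiment_labels = ["Negative", "Positive"]
print(f"Predicted Sentiment: {sentiment_labels[predicted_label]}")
```

For a batch of more than one input, `logits.argmax(dim=-1)` should be used instead, so that one class index is returned per row rather than a single flattened index.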
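The card describes SentimentTensor as LSTM-based, using long short-term memory cells to capture dependencies in sequential data. A minimal sketch of what such a sequence classifier can look like in PyTorch — the layer sizes and structure are illustrative assumptions, since the card does not specify the actual architecture or hyperparameters:

```python
import torch
import torch.nn as nn

# Illustrative LSTM sequence classifier; sizes are placeholders,
# not SentimentTensor's actual hyperparameters.
class LSTMSentimentClassifier(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=32, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)   # (batch, seq_len, embed_dim)
        _, (hidden, _) = self.lstm(embedded)   # hidden: (1, batch, hidden_dim)
        return self.classifier(hidden[-1])     # (batch, num_classes)

model = LSTMSentimentClassifier()
dummy_batch = torch.randint(0, 1000, (2, 10))  # 2 sequences of 10 token ids
logits = model(dummy_batch)
print(logits.shape)  # torch.Size([2, 2])
```

The final hidden state of the LSTM summarizes the whole sequence, so a single linear layer on top suffices to produce per-class logits.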