Instructions to use aequa-tech/flame-it with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use aequa-tech/flame-it with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="aequa-tech/flame-it")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("aequa-tech/flame-it")
model = AutoModelForSequenceClassification.from_pretrained("aequa-tech/flame-it")
```

- Notebooks
- Google Colab
- Kaggle
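A `text-classification` pipeline returns a list of dicts with `label` and `score` keys. A minimal sketch of consuming that output, using a hypothetical result (the actual label names and scores produced by aequa-tech/flame-it are an assumption here, not taken from the model card):

```python
# Hypothetical pipeline output; real labels/scores depend on the model.
results = [{"label": "LABEL_1", "score": 0.91}, {"label": "LABEL_0", "score": 0.09}]

# Pick the highest-scoring label.
top = max(results, key=lambda r: r["score"])
print(top["label"], round(top["score"], 2))
```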
Update README.md

The updated "How to use this model" section of README.md (the change adds a `Python` language hint to the code fence; imports added here so the snippet runs as written):

- Accelerate 0.30.0

# How to use this model:

```Python
from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

model = AutoModelForSequenceClassification.from_pretrained('aequa-tech/flame-it', num_labels=2)
tokenizer = AutoTokenizer.from_pretrained("m-polignano-uniba/bert_uncased_L-12_H-768_A-12_italian_alb3rt0")
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
```
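The README loads the model with `num_labels=2`, so the classification head emits two raw logits per input. A pure-Python sketch of how such logits become the probabilities a pipeline reports, via a standard softmax (the logit values below are made up for illustration):

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Made-up two-class logits, as a num_labels=2 head would produce.
probs = softmax([2.0, -1.0])
print(probs)  # two probabilities summing to 1
```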