How to use WangA/distilbert-base-finetuned-ctrip with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="WangA/distilbert-base-finetuned-ctrip")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("WangA/distilbert-base-finetuned-ctrip")
model = AutoModelForSequenceClassification.from_pretrained("WangA/distilbert-base-finetuned-ctrip")
```
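As a quick sanity check, the pipeline can be called directly on review text. The model card does not document the label mapping or the dataset, so the sample Chinese review below (ctrip is presumably a Ctrip travel/hotel review corpus) and the printed label names are illustrative assumptions, not output taken from the card:

```python
# Minimal usage sketch. The example sentence is a made-up Chinese hotel review;
# the label names this checkpoint returns are undocumented, so treat them as
# opaque ids until you inspect model.config.id2label.
from transformers import pipeline

pipe = pipeline("text-classification", model="WangA/distilbert-base-finetuned-ctrip")

print(pipe("酒店位置很好，房间干净，服务也很周到。"))
# e.g. [{'label': 'LABEL_1', 'score': 0.98}]  -- exact label/score will vary

# Equivalent low-level call via the directly loaded model:
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("WangA/distilbert-base-finetuned-ctrip")
model = AutoModelForSequenceClassification.from_pretrained("WangA/distilbert-base-finetuned-ctrip")

inputs = tokenizer(
    "酒店位置很好，房间干净，服务也很周到。",
    return_tensors="pt", truncation=True, max_length=512,
)
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)
print(probs)  # class probabilities; argmax gives the predicted label id
```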
TextAttack Model Card
This `distilbert` model was fine-tuned using TextAttack. Training ran
for 3 epochs with a batch size of 8,
a maximum sequence length of 512, and an initial learning rate of 3e-05.
Since this was a classification task, the model was trained with a cross-entropy loss.
The best score the model achieved on this task was an eval-set accuracy of 0.9543,
reached after 3 epochs.
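For reference, those hyperparameters map onto a standard Transformers fine-tuning setup. The sketch below is an approximation using the generic `Trainer` API rather than TextAttack's own training pipeline; the base checkpoint (`distilbert-base-uncased`), the binary label count, and the tiny in-memory dataset are all assumptions for illustration:

```python
# Minimal sketch of an equivalent fine-tuning run with the Transformers Trainer.
# It mirrors the reported hyperparameters (3 epochs, batch size 8, max length 512,
# lr 3e-05) but is NOT the actual TextAttack training code. The base checkpoint,
# num_labels=2, and the toy dataset below are assumptions.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2  # assumption: binary sentiment labels
)

# Hypothetical ctrip-style reviews standing in for the real training data.
train_ds = Dataset.from_dict({
    "text": ["酒店位置很好，服务周到。", "房间太吵了，不推荐。"],
    "label": [1, 0],
})

def tokenize(batch):
    # Truncate to the reported maximum sequence length of 512 tokens.
    return tokenizer(batch["text"], truncation=True, max_length=512)

train_ds = train_ds.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="distilbert-base-finetuned-ctrip",
    num_train_epochs=3,              # reported: 3 epochs
    per_device_train_batch_size=8,   # reported: batch size 8
    learning_rate=3e-5,              # reported: initial learning rate 3e-05
)

# AutoModelForSequenceClassification computes cross-entropy internally when
# labels are provided, matching the loss described above.
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    data_collator=DataCollatorWithPadding(tokenizer=tokenizer),
)
trainer.train()
```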
For more information, check out [TextAttack on Github](https://github.com/QData/TextAttack).