How to use srvmishra832/github_issues-dataset-distilbert-base-uncased with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="srvmishra832/github_issues-dataset-distilbert-base-uncased")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("srvmishra832/github_issues-dataset-distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("srvmishra832/github_issues-dataset-distilbert-base-uncased")
```

This model is a fine-tuned version of distilbert-base-uncased on a GitHub issues dataset. It achieves the following results on the evaluation set:
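Because the card describes a multi-label task, per-label probabilities come from a sigmoid over the logits rather than a softmax, with each label accepted independently. A minimal post-processing sketch (the 0.5 threshold and the commented model call are illustrative assumptions, not from the card):

```python
import torch

def logits_to_labels(logits: torch.Tensor, threshold: float = 0.5) -> list:
    """Return indices of labels whose sigmoid probability exceeds the threshold."""
    probs = torch.sigmoid(logits)
    # nonzero(as_tuple=True)[-1] picks the label-dimension indices.
    return (probs >= threshold).nonzero(as_tuple=True)[-1].tolist()

# Hypothetical usage with the tokenizer and model loaded as above:
# inputs = tokenizer("Docs build fails on Windows", return_tensors="pt")
# with torch.no_grad():
#     label_ids = logits_to_labels(model(**inputs).logits)

print(logits_to_labels(torch.tensor([[2.0, -3.0, 0.1]])))  # [0, 2]
```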
The task is multi-label classification of GitHub repository issues. Pull requests and issues with no labels were filtered out of the GitHub issues dataset, and the remainder was split 80/20 into train and test sets.
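The filtering step can be sketched as a predicate over raw issue records. A hedged sketch (the `pull_request` and `labels` field names follow the GitHub REST issue schema and are assumptions about this particular dataset):

```python
def is_labeled_issue(example: dict) -> bool:
    """Keep real issues (not pull requests) that carry at least one label."""
    # In the GitHub API, issues that are pull requests carry a "pull_request" key.
    return example.get("pull_request") is None and len(example.get("labels", [])) > 0

# Hypothetical use with the `datasets` library:
# dataset = dataset.filter(is_labeled_issue)
# splits = dataset.train_test_split(test_size=0.2)  # the 80/20 split from the card

print(is_labeled_issue({"labels": ["bug"], "pull_request": None}))  # True
print(is_labeled_issue({"labels": [], "pull_request": None}))       # False
```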
The following per-epoch training and validation results were recorded:
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 0.3962 | 1.0 | 114 | 0.2513 | 0.9208 | 0.34 | 0.3542 | 0.3269 |
| 0.2008 | 2.0 | 228 | 0.1847 | 0.9436 | 0.4198 | 0.5862 | 0.3269 |
| 0.1633 | 3.0 | 342 | 0.1608 | 0.9544 | 0.5581 | 0.7059 | 0.4615 |
| 0.1468 | 4.0 | 456 | 0.1519 | 0.9580 | 0.6067 | 0.7297 | 0.5192 |
| 0.1385 | 5.0 | 570 | 0.1495 | 0.9580 | 0.6067 | 0.7297 | 0.5192 |
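The F1, precision, and recall columns above can be reproduced from binary prediction matrices; the card does not state the averaging scheme, so micro-averaging is an assumption here. A self-contained sketch:

```python
import numpy as np

def micro_prf(y_true: np.ndarray, y_pred: np.ndarray) -> tuple:
    """Micro-averaged precision, recall, and F1 over a binary label matrix."""
    tp = np.sum((y_true == 1) & (y_pred == 1))  # correctly predicted labels
    fp = np.sum((y_true == 0) & (y_pred == 1))  # spurious labels
    fn = np.sum((y_true == 1) & (y_pred == 0))  # missed labels
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Rows are examples, columns are labels.
y_true = np.array([[1, 0, 1], [0, 1, 0]])
y_pred = np.array([[1, 0, 0], [0, 1, 1]])
print(micro_prf(y_true, y_pred))  # each value is 2/3 here
```

`sklearn.metrics.precision_recall_fscore_support(..., average="micro")` computes the same quantities and is the usual choice in a `compute_metrics` callback.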
Base model: distilbert/distilbert-base-uncased