How to use Shruthi-S/shruthicapstone-bert-qa with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="Shruthi-S/shruthicapstone-bert-qa")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("Shruthi-S/shruthicapstone-bert-qa")
model = AutoModelForQuestionAnswering.from_pretrained("Shruthi-S/shruthicapstone-bert-qa")
```
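The pipeline is called with a question and a context passage; the sample text below is purely illustrative:

```python
# Illustrative inputs; any question/context pair works the same way.
result = pipe(
    question="Where is the Eiffel Tower located?",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
)
print(result["answer"], result["score"])
```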
# shruthicapstone-bert-qa
This model is a fine-tuned version of Shruthi-S/capstone-project-bert-ten on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 5.9574
- Validation Loss: 5.9507
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 0.001, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: mixed_float16
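For reference, a minimal sketch of reproducing this optimizer configuration with the standard tf.keras APIs (the repository does not ship training code, so this is an assumption based on the values listed above):

```python
import tensorflow as tf

# Matches training_precision: mixed_float16
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# Adam configured with the hyperparameters listed above
optimizer = tf.keras.optimizers.Adam(
    learning_rate=0.001,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    jit_compile=True,
)
```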
### Training results
| Train Loss | Validation Loss | Epoch |
|---|---|---|
| 5.9574 | 5.9507 | 0 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.1
- Tokenizers 0.15.1