---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: LKD_Experience_CV3
    results: []
widget:
  - text: >-
      Health A father donated a kidney to save his daughter. Because of his
      donation the physically active 53-yr-old man has been unable to obtain
      private health insurance
    example_title: Example 1
  - text: >-
      lastweektonight A year since John Oliver discussed kidney disease and I'm
      getting ready to donate a kidney
    example_title: Example 2
  - text: >-
      AskReddit [Serious] If you somehow found out that you were a match for a
      total stranger who needed a kidney would you donate one of yours? What are
      your reasons?
    example_title: Example 3
---

# LKD_Experience_CV3

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on a dataset of Reddit comments and posts related to living kidney donation (LKD). It classifies each document as either describing a personal experience with LKD, or merely sharing news (e.g. headlines) or noise/nonsense. The first token of each document is the name of the subreddit where the post was written.

It achieves the following results on the evaluation set:

- Loss: 0.2443
- Accuracy: 0.9244
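Because every training document begins with its subreddit name, inference inputs should be formatted the same way. A minimal sketch of that preprocessing (the helper name and the Hub model id in the comments are assumptions, not taken from this card):

```python
def format_input(subreddit: str, text: str) -> str:
    """Prepend the subreddit name, matching the training data format."""
    return f"{subreddit} {text}"

doc = format_input(
    "AskReddit",
    "If you were a match for a stranger who needed a kidney, would you donate?",
)
print(doc)

# With transformers installed, classification could then look like this
# (the model id below is an assumed repo path -- replace with the real one):
# from transformers import pipeline
# clf = pipeline("text-classification", model="joshnielsen876/LKD_Experience_CV3")
# print(clf(doc))
```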

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
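The hyperparameters above map directly onto `transformers.TrainingArguments`. A sketch of the equivalent configuration (`output_dir` is a placeholder; the Adam betas and epsilon listed above are the Trainer defaults, so they need no explicit argument):

```python
# Keyword arguments mirroring the reported hyperparameters, as they would
# be passed to transformers.TrainingArguments (output_dir is a placeholder).
training_kwargs = dict(
    output_dir="LKD_Experience_CV3",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)

# Typical use, given a model and datasets:
# trainer = Trainer(model=model, args=TrainingArguments(**training_kwargs), ...)
```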

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 48   | 0.4809          | 0.7311   |
| No log        | 2.0   | 96   | 0.3551          | 0.8908   |
| No log        | 3.0   | 144  | 0.2712          | 0.9244   |
| No log        | 4.0   | 192  | 0.2508          | 0.9244   |
| No log        | 5.0   | 240  | 0.2443          | 0.9244   |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.0
- Datasets 2.1.0
- Tokenizers 0.13.2