---
license: apache-2.0
datasets:
  - pietrolesci/pubmed-200k-rct
metrics:
  - accuracy
base_model:
  - openai-community/gpt2
tags:
  - medical
  - biology
  - research
  - pubmed
---

# MedGPT: GPT-2 Fine-Tuned on PubMed RCT

MedGPT is a GPT-2 model fine-tuned on the [`pietrolesci/pubmed-200k-rct`](https://huggingface.co/datasets/pietrolesci/pubmed-200k-rct) dataset. It classifies individual sentences from biomedical abstracts into one of five standard sections:

- Background
- Objective
- Methods
- Results
- Conclusion

This makes the model useful for tasks that need a structured view of scientific literature, such as abstract segmentation or as a preprocessing step for summarization.
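
The snippet below is a minimal inference sketch, assuming the weights were exported with a sequence-classification head; the repo id `devmanpreet/MedGPT` is illustrative, and the label mapping should be read from `model.config.id2label` rather than hard-coded.

```python
# Hedged sketch: assumes a sequence-classification export; the repo id is
# illustrative, not confirmed by this model card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "devmanpreet/MedGPT"  # hypothetical repo id; substitute the real one
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

sentence = "Participants were randomly assigned to receive the drug or placebo."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Read the label mapping from the config instead of assuming an order.
pred = model.config.id2label[logits.argmax(dim=-1).item()]
print(pred)  # e.g. "Methods"
```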

## Training Details

- **Base model:** `gpt2` (124M parameters)
- **Dataset:** `pietrolesci/pubmed-200k-rct`
- **Task:** sentence classification (see the fine-tuning sketch below)
- **Labels:** Background, Objective, Methods, Results, Conclusion
- **Epochs:** 1 (partial training)
- **Loss function:** cross-entropy
- **Optimizer:** AdamW
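
For reference, a fine-tuning setup consistent with these details might look like the sketch below. It is not the exact training script: the batch size, learning rate, sequence length, and dataset column names are assumptions to be checked against the dataset card.

```python
# Minimal fine-tuning sketch matching the listed configuration. Batch size,
# learning rate, max length, and column names ("text", "labels") are assumptions.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    GPT2ForSequenceClassification,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("pietrolesci/pubmed-200k-rct")

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token

model = GPT2ForSequenceClassification.from_pretrained("gpt2", num_labels=5)
model.config.pad_token_id = tokenizer.pad_token_id

def tokenize(batch):
    # "text" is an assumed column name; verify against the dataset card
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="medgpt",
    num_train_epochs=1,              # 1 epoch, as listed above
    per_device_train_batch_size=16,  # illustrative
    learning_rate=5e-5,              # illustrative
)

# Assumes the train split exposes integer ids in a "labels" column.
trainer = Trainer(model=model, args=args, train_dataset=tokenized["train"])
trainer.train()
```

With `Trainer`, the listed optimizer (AdamW) and loss (cross-entropy for a classification head) are the library defaults, so they need no extra configuration.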