Use with the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="bluesky333/medphi2", trust_remote_code=True)
```
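The pipeline object can then be called directly on a prompt. A minimal sketch follows; the question text and the `max_new_tokens` value are illustrative assumptions, not values from the model card:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="bluesky333/medphi2", trust_remote_code=True)

# Hypothetical biomedical question; the model targets English clinical text.
result = pipe("What is the primary function of hemoglobin?", max_new_tokens=64)
print(result[0]["generated_text"])
```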
```python
# Load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("bluesky333/medphi2", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("bluesky333/medphi2", trust_remote_code=True)
```
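With the tokenizer and model loaded, a generation call could look like the sketch below. The prompt text and decoding settings (greedy decoding, `max_new_tokens=64`) are illustrative assumptions, not values prescribed by the model card:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("bluesky333/medphi2", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("bluesky333/medphi2", trust_remote_code=True)

# Hypothetical question-answer style prompt for the biomedical domain.
prompt = "Question: Which enzyme does aspirin irreversibly inhibit?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding; cap the length of the generated answer.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```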
Quick Links

Model Summary

MedPhi-2 is a Phi-2 model (2.7 billion parameters) further trained for the biomedical domain. It was proposed in the MedExQA paper.

🧑‍⚕️ MedExQA

Medical Question Answering Benchmark with Multiple Explanations

📄 Paper • ⏬ Dataset • ⚕️ MedPhi2

Model Details

Model Description

  • Model type: Clinical LLM (Large Language Model)
  • Language(s) (NLP): English
  • License: MIT license
  • Finetuned from model: Phi-2

Citation

BibTeX:

@article{kim2024medexqa,
  title={MedExQA: Medical Question Answering Benchmark with Multiple Explanations},
  author={Kim, Yunsoo and Wu, Jinge and Abdulle, Yusuf and Wu, Honghan},
  journal={arXiv e-prints},
  pages={arXiv--2406},
  year={2024}
}
Safetensors

  • Model size: 3B params
  • Tensor type: BF16