Quantizations of https://huggingface.co/epfl-llm/meditron-7b
From the original README:
Meditron is a suite of open-source medical Large Language Models (LLMs). Meditron-7B is a 7-billion-parameter model adapted to the medical domain from Llama-2-7B through continued pretraining on a comprehensively curated medical corpus, including selected PubMed articles and abstracts, a new dataset of internationally recognized medical guidelines, and general-domain data from RedPajama-v1. Meditron-7B, finetuned on relevant training data, outperforms Llama-2-7B and PMC-Llama on multiple medical reasoning tasks.
Hardware compatibility
- 1-bit
- 2-bit
- 3-bit
- 4-bit
- 5-bit
- 6-bit
- 8-bit
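As a rough guide to choosing among these levels, the on-disk size of a quantized model scales with bits per weight: roughly parameters × bits / 8 bytes. The sketch below estimates sizes for a ~7B-parameter model under that simplifying assumption; actual GGUF files will differ somewhat because of mixed-precision layers, quantization scales, and metadata.

```python
# Ballpark file-size estimate per quantization level for a ~7B-parameter
# model. Assumes uniform bits-per-weight (a simplification: real GGUF
# quants mix precisions and carry extra metadata).
PARAMS = 7_000_000_000  # Meditron-7B parameter count (approximate)

def approx_size_gib(bits_per_weight: float) -> float:
    """Approximate on-disk size in GiB: params * bits / 8 bytes."""
    return PARAMS * bits_per_weight / 8 / 2**30

for bits in (1, 2, 3, 4, 5, 6, 8):
    print(f"{bits}-bit: ~{approx_size_gib(bits):.1f} GiB")
```

By this estimate the 8-bit quant is around 6.5 GiB and the 4-bit quant around half that, which is the usual trade-off between quality and memory footprint.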