Apollo: Lightweight Multilingual Medical LLMs towards Democratizing Medical AI to 6B People
Paper: arXiv:2403.03640
Covering English, Chinese, French, Hindi, Spanish, Arabic so far
👨🏻‍💻 Github • 📃 Paper • 🌐 Demo • 🤗 ApolloCorpus • 🤗 XMedBench
中文 | English
🤗 Apollo-0.5B • 🤗 Apollo-1.8B • 🤗 Apollo-2B • 🤗 Apollo-6B • 🤗 Apollo-7B
Dataset 🤗 ApolloCorpus
String format:

```json
[
  "string1",
  "string2",
  ...
]
```

QA format:

```json
[
  [
    "q1",
    "a1",
    "q2",
    "a2",
    ...
  ],
  ...
]
```
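Records in the QA format above are flat lists of alternating questions and answers. A minimal sketch of flattening such records into (question, answer) pairs — the inline JSON string stands in for a real corpus file, and `to_pairs` is a hypothetical helper, not part of ApolloCorpus:

```python
import json

# Stand-in for a corpus file in the qa format shown above:
# a list of records, each a flat [q1, a1, q2, a2, ...] list.
raw = '[["q1", "a1", "q2", "a2"], ["q3", "a3"]]'
records = json.loads(raw)

def to_pairs(record):
    """Pair up alternating questions (even indices) and answers (odd indices)."""
    return list(zip(record[0::2], record[1::2]))

# Flatten every record into one list of (question, answer) tuples.
pairs = [pair for record in records for pair in to_pairs(record)]
print(pairs)  # [('q1', 'a1'), ('q2', 'a2'), ('q3', 'a3')]
```

Multi-turn dialogues stay intact per record, so turn order within a conversation is preserved.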
Evaluation 🤗 XMedBench
- EN:
- ZH:
- ES: Head_qa
- FR: Frenchmedmcqa
- HI: MMLU_HI
- AR: MMLU_Ara

Waiting for Update
Please use the following citation if you intend to use our dataset for training or evaluation:
```
@misc{wang2024apollo,
    title={Apollo: Lightweight Multilingual Medical LLMs towards Democratizing Medical AI to 6B People},
    author={Xidong Wang and Nuo Chen and Junyin Chen and Yan Hu and Yidong Wang and Xiangbo Wu and Anningzhe Gao and Xiang Wan and Haizhou Li and Benyou Wang},
    year={2024},
    eprint={2403.03640},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
8-bit

```python
# !pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="FreedomIntelligence/Apollo-2B-GGUF",
    filename="Apollo-2B-q8_0.gguf",
)

output = llm(
    "Once upon a time,",
    max_tokens=512,
    echo=True,
)
print(output)