---
license: apache-2.0
task_categories:
  - text-generation
language:
  - zh
tags:
  - medical
size_categories:
  - 1M<n<10M
---

# MedicalQA-1.4M

MedicalQA is a large-scale, high-quality Chinese SFT dataset that integrates multiple sources, designed for injecting medical knowledge into LLMs via SFT or RAG.

Each sample is reviewed by a free LLM (ERNIE-Speed) using our proposed Quality Evaluation Algorithm.

MedArk-KI covers two types of medical knowledge: Traditional Chinese Medicine (TCM) and Western Medicine (WM). It consists of 4 subsets, as shown in the table below:

| Name  | Volume    | Author      | Reviewer    | Type | Source                   | Source Format   |
|-------|-----------|-------------|-------------|------|--------------------------|-----------------|
| DX    | 53,554    | Doctor      | Doctor      | WM   | Web                      | HTML            |
| HT    | 1,358,093 | ChatGPT     | ERNIE-Speed | WM   | Books, Encyclopedia, Web | PDF, TXT, HTML  |
| TCM   | 38,294    | ERNIE 4.0   | ERNIE-Speed | TCM  | Books, KG                | PDF, LaTeX, JSON |
| MB    | 14,543    | ERNIE-Speed | ERNIE-Speed | WM   | Books                    | LaTeX           |
| Total | 1,464,484 |             |             |      |                          |                 |
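For downstream SFT pipelines, records from the four subsets can be handled separately once the dataset is loaded. A minimal sketch of splitting merged records by subset tag; the field names (`subset`, `question`, `answer`) are illustrative assumptions, not the dataset's documented schema:

```python
# Hypothetical merged records; field names are assumptions for illustration,
# not the dataset's documented schema.
samples = [
    {"subset": "DX",  "question": "...", "answer": "..."},
    {"subset": "HT",  "question": "...", "answer": "..."},
    {"subset": "TCM", "question": "...", "answer": "..."},
    {"subset": "MB",  "question": "...", "answer": "..."},
]

def by_subset(records, name):
    """Return only the records belonging to the given subset (DX, HT, TCM, or MB)."""
    return [r for r in records if r["subset"] == name]

print(len(by_subset(samples, "TCM")))  # 1
```

This keeps, for example, doctor-authored DX answers separable from LLM-generated HT answers when mixing training data.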

## 1. DX

A total of 53,554 high-quality Q&A pairs were collected from the DingXiang website, 丁香医生 (https://dxy.com/diseases), and underwent rigorous data cleaning. Each answer was authored by a doctor and reviewed by another doctor, ensuring that all information is accurate and reliable. These Q&A pairs encompass 8 key areas of medical knowledge across 3,412 diseases in 31 departments: disease introduction, symptoms, causes, diagnosis, treatment, lifestyle, prevention, and consultation guidance.

## 2. HT

https://huggingface.co/datasets/FreedomIntelligence/HuatuoGPT2-Pretraining-Instruction

Subsets used:

- Medical_Web_Corpus_cn
- Medical_Encyclopedia_cn
- Medical_Books_cn

## 3. TCM

https://www.dayi.org.cn/


## 4. MB

https://github.com/scienceasdf/medical-books