---
license: apache-2.0
datasets:
- qiaojin/PubMedQA
- MedSwin/PubMedQA-map
- MedSwin/PubmedQA-u
- MedSwin/PubMedQA-l
- MedSwin/HealthCareMagic
- MedSwin/iCliniq
language:
- en
base_model:
- MedSwin/MedSwin-7B-KD
- MedSwin/MedSwin-7B-SFT
pipeline_tag: question-answering
metrics:
- bertscore: 0.8494
tags:
- medical
- merge
---
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
- Developed by: Medical Swinburne University of Technology AI Team
- Funded by: Swinburne University of Technology
- Language(s): English
- License: Apache 2.0
## Merge Details

### Merge Method
This model was merged using the TIES (TrIm, Elect, and Merge) method, with medalpaca-7b as the base.

TIES is a three-step merging method: first, redundant parameters in each fine-tuned model's task vector are trimmed; next, sign conflicts between models are resolved by electing an aggregate sign for each parameter; finally, only the parameters whose signs agree with the elected sign are averaged. This addresses the fact that redundant values and sign disagreements can degrade performance in the merged model.
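The three steps can be sketched in plain Python on flat parameter lists (an illustration only, not mergekit's implementation; the `density` and `weight` arguments mirror the fields in the configuration below):

```python
def trim(task_vector, density):
    """Keep the top `density` fraction of entries by magnitude, zero the rest."""
    k = max(1, int(len(task_vector) * density))
    threshold = sorted((abs(v) for v in task_vector), reverse=True)[k - 1]
    return [v if abs(v) >= threshold else 0.0 for v in task_vector]

def ties_merge(base, models, weights, density):
    """TIES: trim task vectors, elect a sign per parameter, merge agreeing values."""
    # 1) Trim: task vectors (model minus base) reduced to their densest entries.
    tvs = [trim([m[i] - base[i] for i in range(len(base))], density)
           for m in models]
    merged = list(base)
    for i in range(len(base)):
        # 2) Elect: the aggregate sign comes from the weighted sum of task vectors.
        total = sum(w * tv[i] for w, tv in zip(weights, tvs))
        sign = 1.0 if total >= 0 else -1.0
        # 3) Merge: weighted mean over entries that agree with the elected sign;
        #    trimmed (zero) entries never agree, so they are excluded.
        agree = [(w, tv[i]) for w, tv in zip(weights, tvs) if tv[i] * sign > 0]
        if agree:
            wsum = sum(w for w, _ in agree)
            merged[i] += sum(w * v for w, v in agree) / wsum
    return merged
```

Note how a parameter where the two models disagree in sign is resolved toward the higher-weighted model rather than averaged toward zero.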
### Models Merged

The following models (as named in the configuration below) were included in the merge:

- medalpaca-sft
- medalpaca-kd
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: medalpaca-7b
dtype: bfloat16
merge_method: ties
modules:
  default:
    slices:
      - sources:
          - layer_range: [0, 32]
            model: medalpaca-sft
            parameters:
              density: 0.6
              weight: 0.3
          - layer_range: [0, 32]
            model: medalpaca-kd
            parameters:
              density: 0.6
              weight: 0.7
          - layer_range: [0, 32]
            model: medalpaca-7b
```
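Assuming the configuration above is saved as `ties-config.yml`, the merge can be reproduced with mergekit's command-line entry point (a sketch; the model names must resolve to local checkpoints or Hub repositories, and the output directory is arbitrary):

```shell
pip install mergekit
mergekit-yaml ties-config.yml ./merged-model --cuda
```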
Review all model benchmark metrics via the Benchmark Document Preview.