---
library_name: peft
license: other
base_model: Qwen/Qwen2-VL-2B
tags:
- llama-factory
- lora
model-index:
- name: qwen2_2b_lora_expert_generalv2-102400
  results: []
task_categories:
- visual-question-answering
language:
- en
pretty_name: Domain Expert Datasets
size_categories:
- 100K<n<1M
---

<h2 align="center">
Linear Model Merging Unlocks Simple and Scalable Multimodal Data Mixture Optimization

<br>

[![arXiv](https://img.shields.io/badge/arXiv-2602.04937-b31b1b.svg)](https://www.arxiv.org/pdf/2602.04937)
[![🤗 Model (HuggingFace)](https://img.shields.io/badge/Models-HuggingFace-FFD21E.svg?logo=huggingface&logoColor=yellow)](https://huggingface.co/collections/daviBera/mllms-merging-4-dmo)
[![🤗 Dataset (HuggingFace)](https://img.shields.io/badge/Datasets-HuggingFace-FFD21E.svg?logo=huggingface&logoColor=yellow)](https://huggingface.co/datasets/daviBera/experts_datasets-102400)
[![github](https://img.shields.io/badge/github-repo-blue?logo=github)](https://github.com/BerasiDavide/mLLMs_merging_4_DMO)
</h2>

These are the domain-specific datasets from the paper "Linear Model Merging Unlocks Simple and Scalable Multimodal Data Mixture Optimization" ([link](https://www.arxiv.org/pdf/2602.04937)).

Each dataset contains 102,400 VQA samples from one of four domains: General VQA, OCR, Counting & Visual Perception, and Chart Understanding.
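A minimal sketch of downloading one of these domain datasets with the Hugging Face `datasets` library. The repository id matches the dataset badge above; the `split` argument (and any per-domain configuration name) is an assumption — check the dataset viewer for the actual configuration and split names.

```python
# Repository id from the dataset badge above.
REPO_ID = "daviBera/experts_datasets-102400"


def load_domain_dataset(repo_id: str = REPO_ID, split: str = "train"):
    """Fetch one domain-specific VQA dataset from the Hub.

    The split name is an assumption -- verify it in the dataset viewer.
    """
    from datasets import load_dataset  # pip install datasets

    return load_dataset(repo_id, split=split)
```

Calling `load_domain_dataset()` downloads (and caches) the data locally on first use.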

You can find many models trained on mixtures of these datasets in [this Hugging Face collection](https://huggingface.co/collections/daviBera/mllms-merging-4-dmo).
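The card's metadata lists `peft`/`lora` with `Qwen/Qwen2-VL-2B` as the base model, so the models in the collection can presumably be used by attaching a LoRA adapter to that base. The adapter id below is inferred from the `model-index` name in this card and is an assumption — take the exact repository ids from the linked collection.

```python
# Assumed Hub id, inferred from this card's `model-index` name.
ADAPTER_ID = "daviBera/qwen2_2b_lora_expert_generalv2-102400"


def load_expert_model(adapter_id: str = ADAPTER_ID):
    """Load the Qwen2-VL-2B base model and wrap it with a LoRA adapter."""
    from transformers import Qwen2VLForConditionalGeneration  # pip install transformers
    from peft import PeftModel  # pip install peft

    base = Qwen2VLForConditionalGeneration.from_pretrained("Qwen/Qwen2-VL-2B")
    return PeftModel.from_pretrained(base, adapter_id)
```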

### Composition
<img src="https://cdn-uploads.huggingface.co/production/uploads/6597e62f929cd840d808b8c9/c7X6YXUDUcXIRFsnjHv-w.png" width="800">