---
datasets:
- ehartford/dolphin
license: apache-2.0
---

**Base Model:** manojpreveen/mpt-30b-v4

**Tool:** MosaicML's llm-foundry (https://github.com/mosaicml/llm-foundry)

**Dataset:** Entire flan1m-GPT4 dataset

**Config yaml with Model Params:** https://huggingface.co/manojpreveen/mpt-30b-v5/blob/main/mpt-30b_v5.yaml

***Description:*** **mosaicml/mpt-30b** -> finetuned on the entire flan3m-GPT3.5 dataset for 4 epochs -> **iamplus/mpt-30b-v4** -> finetuned on the entire flan1m-GPT4 dataset for 4 epochs -> **iamplus/mpt-30b-v5**

**Prompt Format:**

```
<system>: [system prompt]

<human>: [question]

<bot>:
```
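**Example Usage:**

A minimal inference sketch using the prompt format above. It assumes the finetuned weights are loaded from the **iamplus/mpt-30b-v5** repo via `transformers` (MPT models ship custom modeling code, so `trust_remote_code=True` is required), and the system prompt and question shown are placeholders.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id, taken from the description above.
model_name = "iamplus/mpt-30b-v5"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # 30B parameters: bf16 plus a large GPU (or multi-GPU)
    trust_remote_code=True,      # MPT uses custom modeling code
    device_map="auto",           # requires `accelerate`
)

# Build a prompt in the documented format.
system_prompt = "You are a helpful assistant."  # placeholder
question = "What is MosaicML's llm-foundry?"    # placeholder
prompt = f"<system>: {system_prompt}\n\n<human>: {question}\n\n<bot>:"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)

# Strip the prompt tokens and print only the generated answer.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```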