---
license: apache-2.0
tags:
- knowledge-distillation
- pytorch
- transformers
base_model: unknown
---

# fokan/train-modle2

This model was created using knowledge distillation; the teacher model(s) are not listed in this card.

## Model Description

A distilled model created using multi-modal knowledge distillation.

## Training Details

- **Teacher Models**: 
- **Distillation Strategy**: weighted (sketched below)
- **Training Steps**: 5000
- **Learning Rate**: 0.001
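
The card does not spell out what the "weighted" strategy computes. As a rough illustration only, a weighted multi-teacher distillation objective commonly blends a hard-label loss with a weighted sum of per-teacher soft-label terms; the sketch below assumes that form, and `weighted_distillation_loss`, `teacher_weights`, `temperature`, and `alpha` are illustrative names rather than the platform's actual API.

```python
import torch
import torch.nn.functional as F

def weighted_distillation_loss(student_logits, teacher_logits_list, labels,
                               teacher_weights, temperature=2.0, alpha=0.5):
    """Hypothetical weighted multi-teacher KD loss; not the platform's code."""
    # Hard-label cross-entropy against the ground-truth targets
    hard_loss = F.cross_entropy(student_logits, labels)

    # Soft-label KL divergence against each teacher, combined by weight
    soft_loss = 0.0
    for w, t_logits in zip(teacher_weights, teacher_logits_list):
        soft_loss = soft_loss + w * F.kl_div(
            F.log_softmax(student_logits / temperature, dim=-1),
            F.softmax(t_logits / temperature, dim=-1),
            reduction="batchmean",
        ) * temperature ** 2  # standard T^2 scaling (Hinton et al., 2015)

    return alpha * hard_loss + (1 - alpha) * soft_loss
```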

## Usage

```python
from transformers import AutoModel, AutoTokenizer

# Load the distilled model weights from the Hub
model = AutoModel.from_pretrained("fokan/train-modle2")

# The card pairs the model with the bert-base-uncased tokenizer
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
```
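
A minimal inference sketch follows, assuming the distilled model is a BERT-style encoder (consistent with the bert-base-uncased tokenizer above); the mean-pooling step is an assumed usage pattern, since the card does not specify a task head.

```python
import torch

inputs = tokenizer("Knowledge distillation compresses large models.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden state into a sentence-level embedding
# (an assumption for illustration; no task head is documented).
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)
```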

## Created with

This model was created using the Multi-Modal Knowledge Distillation platform.