---
license: apache-2.0
tags:
  - knowledge-distillation
  - pytorch
  - transformers
base_model: google/medsiglip-448
---

# fokan/train-modle2

This model was created using knowledge distillation from the following teacher model:

- google/medsiglip-448

## Model Description

A distilled student model produced by multi-modal knowledge distillation from google/medsiglip-448, Google's medical-domain SigLIP vision-language model.

## Training Details

- **Teacher Models:** google/medsiglip-448
- **Distillation Strategy:** ensemble
- **Training Steps:** 500
- **Learning Rate:** 0.0001
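
The card does not include the training code, but the recipe above maps onto a standard embedding-matching distillation loop. The sketch below is purely illustrative: `student`, `teacher`, and `loader` are hypothetical placeholders, and the cosine-matching loss is one common choice for distilling embedding models, not necessarily the one used for this checkpoint.

```python
# Illustrative distillation step; all names are hypothetical and do not
# come from the actual training code behind this checkpoint.
import torch
import torch.nn.functional as F

def distillation_step(student, teacher, batch, optimizer):
    """Pull student embeddings toward the frozen teacher's embeddings."""
    with torch.no_grad():
        teacher_emb = teacher(batch)   # frozen teacher forward pass
    student_emb = student(batch)       # trainable student forward pass
    # Cosine-matching loss: reaches 0 when the embeddings align.
    loss = 1.0 - F.cosine_similarity(student_emb, teacher_emb, dim=-1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Hyperparameters from the card: 500 steps at a learning rate of 1e-4.
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-4)
for step, batch in zip(range(500), loader):
    distillation_step(student, teacher, batch, optimizer)
```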

## Usage

```python
from transformers import AutoModel, AutoTokenizer

# Load the distilled student model; the tokenizer comes from the
# teacher checkpoint, google/medsiglip-448.
model = AutoModel.from_pretrained("fokan/train-modle2")
tokenizer = AutoTokenizer.from_pretrained("google/medsiglip-448")
```
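
If the distilled checkpoint preserves the teacher's SigLIP-style interface (an assumption, since the card does not state the student architecture), text embeddings could be computed as follows; `get_text_features` is the standard SigLIP text-encoding method in transformers.

```python
import torch

# Assumes the student exposes SigLIP's text interface (unverified).
inputs = tokenizer(
    ["a chest x-ray", "a brain MRI scan"],
    padding="max_length",  # SigLIP tokenizers expect max-length padding
    return_tensors="pt",
)
with torch.no_grad():
    text_embeddings = model.get_text_features(**inputs)
print(text_embeddings.shape)  # (2, embedding_dim)
```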

## Created with

This model was created using the Multi-Modal Knowledge Distillation platform.