
Quantization made by Richard Erkhov.

Github | Discord | Request more models

Phi-Small-Merge - bnb 4bits
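
The 4-bit weights can be loaded with transformers (with bitsandbytes and accelerate installed). A minimal sketch; the repo id below is a placeholder, not the actual repository name:

from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "RichardErkhov/Phi-Small-Merge-bnb-4bit"  # placeholder; use the real repo id

# For a checkpoint saved in bnb 4-bit, the quantization config ships with
# the model, so from_pretrained restores the quantized weights directly.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

inputs = tokenizer("Hello, world", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))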

Original model description:

base_model:
  - Ejafa/phi-3-mini-128k-instruct-simpo-lr-5e-07-gamma-1.5
  - jpacifico/Chocolatine-3B-Instruct-DPO-Revised
  - Antonio88/TaliML-PHI3-128K-ITA-V.1.0.FINAL
library_name: transformers
tags:
  - mergekit
  - merge

merge

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the SLERP merge method.
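
SLERP (spherical linear interpolation) blends two checkpoints by moving along the arc between their weight vectors rather than the straight line between them, which keeps the interpolated weights close to the scale of the originals. A minimal sketch of the idea in PyTorch, assuming two same-shaped tensors; this is an illustration, not mergekit's exact implementation:

import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    # Measure the angle between the two weight vectors on the unit sphere.
    a_flat = a.flatten().float()
    b_flat = b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    omega = torch.acos(torch.clamp(torch.dot(a_unit, b_unit), -1.0, 1.0))
    sin_omega = torch.sin(omega)
    if sin_omega.abs() < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        mixed = (1.0 - t) * a_flat + t * b_flat
    else:
        mixed = (torch.sin((1.0 - t) * omega) / sin_omega) * a_flat \
              + (torch.sin(t * omega) / sin_omega) * b_flat
    return mixed.reshape(a.shape).to(a.dtype)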

Models Merged

The following models were included in the merge:

  • Ejafa/phi-3-mini-128k-instruct-simpo-lr-5e-07-gamma-1.5
  • Antonio88/TaliML-PHI3-128K-ITA-V.1.0.FINAL

Configuration

The following YAML configuration was used to produce this model:


slices:
  - sources:
    - model: Antonio88/TaliML-PHI3-128K-ITA-V.1.0.FINAL
      layer_range: [0, 32]
  - sources:
    - model: jpacifico/Chocolatine-3B-Instruct-DPO-Revised
      layer_range: [0, 32]
  - sources:
    - model: Ejafa/phi-3-mini-128k-instruct-simpo-lr-5e-07-gamma-1.5
      layer_range: [0, 32]
base_model: jpacifico/Chocolatine-3B-Instruct-DPO-Revised
merge_method: slerp
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
  normalize: false
  int8_mask: true
  density: 0.7
  lambda: 1.1
  epsilon: 0.2
dtype: bfloat16
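
The five-element lists under t are gradients: mergekit expands each list into one interpolation factor per layer, so here self_attn weights start at the base model (t = 0) in the earliest layers and end at the other model (t = 1), mlp weights run in the opposite direction, and all remaining tensors use a constant t = 0.5. A sketch of that expansion, assuming evenly spaced anchors and linear interpolation (numpy used only for illustration):

import numpy as np

anchors = [0, 0.5, 0.3, 0.7, 1]   # self_attn t schedule from the config
num_layers = 32

# Place the anchor values evenly across the layer range and linearly
# interpolate to obtain one t value per layer.
anchor_pos = np.linspace(0.0, 1.0, len(anchors))
layer_pos = np.linspace(0.0, 1.0, num_layers)
t_per_layer = np.interp(layer_pos, anchor_pos, anchors)
print(t_per_layer.round(3))

To reproduce the merge, the YAML above can be saved to a file and run with mergekit's mergekit-yaml command.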