Test98
DeepHermes was passed through https://huggingface.co/spaces/Naphula/model_tools/blob/main/vocab_resizer_mistral24B.py to resize its vocabulary.

karcher still isn't merging:

assert max(tokenizer.vocab.values()) < vocab_size

Trying SLERP with the Magidonia tokenizer instead.
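The failing assertion above is a bounds check: token ids are 0-based row indices into the embedding matrix, so the largest id in the tokenizer vocab must be strictly less than the model's vocab size. A minimal sketch of that check (the `vocab_fits` helper and the toy vocab are hypothetical, not part of the resizer script):

```python
def vocab_fits(vocab: dict[str, int], vocab_size: int) -> bool:
    # The largest token id must be a valid row index into the
    # embedding matrix, i.e. strictly less than vocab_size.
    return max(vocab.values()) < vocab_size

# Toy example: 3 tokens, embedding matrix with 4 rows -> fits.
toy_vocab = {"<s>": 0, "</s>": 1, "hello": 2}
assert vocab_fits(toy_vocab, 4)
# A 2-row embedding matrix cannot hold token id 2 -> assertion would fire.
assert not vocab_fits(toy_vocab, 2)
```

If this check fails, the tokenizer defines ids beyond the model's embedding table, which is exactly the mismatch a vocab resize (or swapping to the other model's tokenizer, as below) is meant to fix.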
base_model: TheDrummer/Magidonia-24B-v4.2.0
architecture: MistralForCausalLM
merge_method: slerp
dtype: bfloat16
slices:
  - sources:
      - model: TheDrummer/Magidonia-24B-v4.2.0
        layer_range: [0, 40]
      - model: NousResearch/DeepHermes-3-Mistral-24B-Preview
        layer_range: [0, 40]
parameters:
  t: 0.5
tokenizer:
  source: TheDrummer/Magidonia-24B-v4.2.0
chat_template: auto