```python
# Load the merged model and its tokenizer directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("merge-crew/da-sv-task-arithmetic")
model = AutoModelForCausalLM.from_pretrained("merge-crew/da-sv-task-arithmetic")
```
# Danish-Swedish Merged Model
This is a merge of the following two models, both based on mistralai/Mistral-7B-v0.1:

- danish-foundation-models/munin-7b-alpha, continued pretraining on Danish data
- timpal0l/Mistral-7B-v0.1-flashback-v2, continued pretraining on Swedish data
## Model Details
- **Merged by:** Dan Saattrup Nielsen
- **Model type:** Decoder model, based on mistralai/Mistral-7B-v0.1
- **Language(s):** Danish and Swedish
- **License:** CC-BY-4.0
- **Merge configuration:**

```python
dict(
    models=[
        dict(
            model="danish-foundation-models/munin-7b-alpha",
            parameters=dict(weight=1.0),
        ),
        dict(
            model="timpal0l/Mistral-7B-v0.1-flashback-v2",
            parameters=dict(weight=1.0),
        ),
    ],
    merge_method="task_arithmetic",
    base_model="mistralai/Mistral-7B-v0.1",
    parameters=dict(int8_mask=True, normalize=True),
    dtype="bfloat16",
)
```
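The `task_arithmetic` merge method builds a "task vector" for each fine-tuned model (its weights minus the base model's weights) and adds a weighted sum of these vectors back onto the base. The sketch below illustrates that rule with toy NumPy tensors standing in for real model parameters; the function name and the normalization behavior (rescaling weights to sum to 1, mirroring the `normalize=True` option above) are illustrative assumptions, not mergekit's actual implementation.

```python
# Minimal sketch of task-arithmetic merging, using toy NumPy tensors
# in place of real model weights (illustration only, not mergekit code).
import numpy as np

def task_arithmetic(base, tuned, weights, normalize=True):
    """Merge fine-tuned weights by adding weighted task vectors to the base.

    Each task vector is (fine-tuned - base); normalize=True rescales the
    weights so they sum to 1, as assumed for the `normalize` option.
    """
    weights = np.asarray(weights, dtype=float)
    if normalize:
        weights = weights / weights.sum()
    task_vectors = [t - base for t in tuned]
    return base + sum(w * tv for w, tv in zip(weights, task_vectors))

# Toy example: one "parameter" per model.
base = np.array([1.0])
danish = np.array([3.0])   # stands in for munin-7b-alpha
swedish = np.array([5.0])  # stands in for the flashback model
merged = task_arithmetic(base, [danish, swedish], weights=[1.0, 1.0])
# With equal normalized weights: 1 + 0.5*(3-1) + 0.5*(5-1) = 4.0
```

With equal weights, as in the configuration above, the merge lands midway between the two fine-tuned models' deviations from the base.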
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="merge-crew/da-sv-task-arithmetic")
```