How to use nwhamed/Merged_Model with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("nwhamed/Merged_Model", dtype="auto")
```
gemma_gpt
gemma_gpt is a merge of the following models made with mergekit:
- [google/gemma-7b](https://huggingface.co/google/gemma-7b)
- [EleutherAI/gpt-neo-2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B)
🧩 Configuration
```json
{
  "models": [
    {
      "model": "google/gemma-7b",
      "parameters": {
        "param1": "value1",
        "param2": "value2"
      }
    },
    {
      "model": "EleutherAI/gpt-neo-2.7B",
      "parameters": {
        "param1": "value1",
        "param2": "value2"
      }
    }
  ]
}
```
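The configuration above lists the source models and their per-model parameters. To give an intuition for what a merge does, the sketch below illustrates a simple linear merge, the most basic of mergekit's merge methods: each parameter in the merged model is a weighted average of the corresponding parameters in the source models. This is a conceptual illustration with toy scalar parameters standing in for tensors, not the actual mergekit implementation or the exact method used for gemma_gpt.

```python
# Conceptual sketch of a linear merge: the merged value of each parameter
# is a weighted average of that parameter across the source models.
# The state dicts and weights here are illustrative, not taken from the
# actual gemma_gpt configuration.

def linear_merge(state_dicts, weights):
    """Merge parameter dicts key by key via a normalized weighted average."""
    total = sum(weights)
    merged = {}
    for key in state_dicts[0]:
        merged[key] = sum(w * sd[key] for sd, w in zip(state_dicts, weights)) / total
    return merged

# Toy "state dicts" with scalar parameters standing in for tensors.
model_a = {"layer.weight": 1.0, "layer.bias": 0.0}
model_b = {"layer.weight": 3.0, "layer.bias": 2.0}

merged = linear_merge([model_a, model_b], weights=[0.5, 0.5])
print(merged)  # {'layer.weight': 2.0, 'layer.bias': 1.0}
```

In practice mergekit operates on full model checkpoints and offers several merge methods beyond linear averaging; the configuration file selects the method and its parameters.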