This model is an experimental version created by merging the first 16 layers and the last 16 layers of TinyLlama/TinyLlama-1.1B-Chat-v1.0, duplicating the overlapping intermediate layers in the process. This approach was taken to explore the potential benefits of combining different parts of the same model's architecture.
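The slicing described above can be sketched as follows. This is a minimal illustration, not the actual merge recipe: it assumes the base model has 22 transformer layers (the published configuration of TinyLlama-1.1B), and simply shows which layer indices end up duplicated when the first 16 and last 16 layers are concatenated.

```python
# Hypothetical sketch of the layer selection described above.
# Assumption: TinyLlama-1.1B-Chat-v1.0 has 22 transformer layers.
TOTAL_LAYERS = 22

first_half = list(range(0, 16))                        # layers 0..15
last_half = list(range(TOTAL_LAYERS - 16, TOTAL_LAYERS))  # layers 6..21

merged = first_half + last_half                        # 32 layers in total
overlap = sorted(set(first_half) & set(last_half))     # layers 6..15 appear twice

print(len(merged))  # 32
print(overlap)      # [6, 7, 8, 9, 10, 11, 12, 13, 14, 15]
```

Because the middle ten layers are repeated, the merged stack grows from 22 to 32 layers, which accounts for the parameter count rising from 1.1B toward the 2B reported below.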

Model size: 2B params
Tensor type: BF16
Format: Safetensors