# OpenSLM Jamba 30M
This is a Jamba model with approximately 30M parameters, trained as part of the OpenSLM project.
## Model Details
- Architecture: Jamba
- Parameters: ~30M
- License: Apache 2.0
- Language: English
## Performance

On evaluation, the model achieved the following:

- Best validation loss: 2.4204
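Assuming the reported loss is the mean token-level cross-entropy in nats (the usual convention for language-model training), it corresponds to a validation perplexity of roughly 11.25:

```python
import math

# Perplexity from the reported best validation loss, assuming the loss
# is mean token-level cross-entropy in nats (an assumption; the model
# card does not state the loss definition).
best_val_loss = 2.4204
perplexity = math.exp(best_val_loss)
print(f"{perplexity:.2f}")  # ≈ 11.25
```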
## Usage
For detailed usage instructions, please refer to the OpenSLM repository.