---
language:
- en
license: apache-2.0
tags:
- nlp
- language-model
- jamba
- openslm
---
# OpenSLM Jamba 30M
This is a Jamba-architecture language model with approximately 30M parameters, trained as part of the [OpenSLM](https://github.com/Ashwin3919/OpenSLM) project.
## Model Details
- **Architecture:** Jamba
- **Parameters:** ~30M
- **License:** Apache 2.0
- **Language:** English
## Performance
During training, the model achieved the following evaluation result:
- **Best Validation Loss:** `2.4204`
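
Assuming the reported loss is the mean cross-entropy in nats (the usual convention for language-model training), it can be converted to a perplexity, which is sometimes easier to interpret:

```python
import math

# Best validation loss reported above; assumed to be mean
# cross-entropy per token in nats.
best_val_loss = 2.4204

# Perplexity is the exponential of the mean cross-entropy.
perplexity = math.exp(best_val_loss)
print(f"{perplexity:.2f}")  # ≈ 11.25
```

Under that assumption, the model's best validation perplexity is roughly 11.25.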
## Usage
For detailed usage instructions, please refer to the [OpenSLM repository](https://github.com/Ashwin3919/OpenSLM).