---
language:
  - en
license: apache-2.0
tags:
  - nlp
  - language-model
  - jamba
  - openslm
---

# OpenSLM Jamba 30M

This is a Jamba-architecture language model with approximately 30M parameters, trained as part of the OpenSLM project.

## Model Details

- **Architecture:** Jamba
- **Parameters:** ~30M
- **License:** Apache 2.0
- **Language:** English

## Performance

On evaluation, the model achieved the following:

- **Best validation loss:** 2.4204
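For context, perplexity is the exponential of the per-token cross-entropy loss, so the best validation loss above corresponds to a perplexity of roughly 11.25 (assuming the loss is reported in nats, which is standard for cross-entropy):

```python
import math

# Perplexity is the exponential of the per-token cross-entropy loss (in nats).
best_val_loss = 2.4204
perplexity = math.exp(best_val_loss)
print(f"Perplexity: {perplexity:.2f}")  # → Perplexity: 11.25
```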

## Usage

For detailed usage instructions, please refer to the OpenSLM repository.
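Until you consult the repository, a minimal loading sketch with the Hugging Face `transformers` library might look like the following. The repo ID `OpenSLM/OpenSLM_Jamba_30M` is an assumption for illustration; substitute the actual model ID from the Hub page:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# NOTE: hypothetical repo ID -- replace with the actual Hub model ID.
model_id = "OpenSLM/OpenSLM_Jamba_30M"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Jamba support requires a recent `transformers` release (v4.39 or later).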