---
language:
- en
license: apache-2.0
library_name: transformers
base_model: LeroyDyer/Mixtral_AI_CyberTron_Coder
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---
# LeroyDyer/Mixtral_AI_CyberTron_Coder AWQ

- Model creator: [LeroyDyer](https://huggingface.co/LeroyDyer)
- Original model: [Mixtral_AI_CyberTron_Coder](https://huggingface.co/LeroyDyer/Mixtral_AI_CyberTron_Coder)
## Model Summary

This Mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's [TRL](https://github.com/huggingface/trl) library.

[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
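Since hosted inference is disabled for this card (`inference: false`), the AWQ checkpoint is meant to be loaded locally. The sketch below is a hypothetical usage example, not part of the original card: it assumes the `transformers` and `autoawq` packages are installed, a CUDA-capable GPU is available, and uses a placeholder repo id (the actual quantized repo id is not stated here).

```python
# Hypothetical usage sketch for loading an AWQ-quantized checkpoint with
# Hugging Face transformers. Assumes `transformers` + `autoawq` are installed.
MODEL_ID = "Mixtral_AI_CyberTron_Coder-AWQ"  # placeholder: substitute the real quantized repo id


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a completion from the AWQ model for a single prompt."""
    # Imported lazily so the sketch can be read without the heavy dependencies.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        device_map="auto",       # place layers on the available GPU(s)
        low_cpu_mem_usage=True,  # stream weights instead of a full CPU copy
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the tokens generated after the prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

AWQ weights are 4-bit, so this model should fit comfortably on a single consumer GPU; for batched serving, an AWQ-aware engine such as vLLM or text-generation-inference is the more common choice.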