---
base_model: Meta/tiny-llama
language: ['en', 'es']
license: apache-2.0
tags: ['text-generation-inference', 'transformers', 'unsloth', 'mistral', 'gguf']
datasets: ['iamtarun/python_code_instructions_18k_alpaca', 'jtatman/python-code-dataset-500k', 'flytech/python-codes-25k', 'Vezora/Tested-143k-Python-Alpaca', 'codefuse-ai/CodeExercise-Python-27k', 'Vezora/Tested-22k-Python-Alpaca', 'mlabonne/Evol-Instruct-Python-26k']
library_name: adapter-transformers
metrics:
- accuracy
- bertscore
- glue
- perplexity
---

# Uploaded model

[<img src="https://github.githubassets.com/assets/GitHub-Mark-ea2971cee799.png" width="100"/><img src="https://github.githubassets.com/assets/GitHub-Logo-ee398b662d42.png" width="100"/>](https://github.com/Agnuxo1)

- **Developed by:** [Agnuxo](https://github.com/Agnuxo1)
- **License:** apache-2.0
- **Finetuned from model:** Agnuxo/Mistral-NeMo-Minitron-8B-Base-Nebulal

This Mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.

[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)

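Since the weights are published in standard Transformers/GGUF formats, they can be loaded like any other causal language model. Below is a minimal sketch using the `transformers` library; the repository ID is a placeholder, not this model's confirmed Hub name.

```python
# Minimal sketch: loading this fine-tuned model for text generation with
# Hugging Face transformers. The repo ID below is a placeholder (assumption),
# not the confirmed Hub name of this model -- replace it accordingly.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Agnuxo/<this-model>"  # placeholder repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
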
| | ## Benchmark Results |
| |
|
| | This model has been fine-tuned for various tasks and evaluated on the following benchmarks: |
| |
|
| | ### accuracy |
| | **Accuracy:** Not Available |
| |
|
| |  |
| |
|
| | ### bertscore |
| | **Bertscore:** Not Available |
| |
|
| |  |
| |
|
| | ### glue |
| | **Glue:** Not Available |
| |
|
| |  |
| |
|
| | ### perplexity |
| | **Perplexity:** Not Available |
| |
|
| |  |
| |
|
| |
|
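No official scores are reported for the metrics above. As an illustration only, a rough perplexity check could be run locally with something like the sketch below, assuming `model` and `tokenizer` were loaded as in the earlier example.

```python
# Illustrative perplexity check (assumes `model` and `tokenizer` from the
# loading sketch above). This is not an official benchmark result.
import torch

text = "def add(a, b):\n    return a + b"
enc = tokenizer(text, return_tensors="pt").to(model.device)

with torch.no_grad():
    loss = model(**enc, labels=enc["input_ids"]).loss  # mean cross-entropy

print(f"Perplexity: {torch.exp(loss).item():.2f}")
```
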
Model Size: 4,124,864 parameters
Required Memory: 0.02 GB

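The size and memory figures above can be reproduced roughly by summing parameter counts and stored weight bytes, as in this sketch (again assuming `model` from the loading example; ~4.1M float32 parameters at 4 bytes each is about 0.02 GB of weights).

```python
# Rough reproduction of the size/memory figures above, assuming `model` is
# already loaded. Memory is estimated from stored weight bytes only
# (float32 = 4 bytes per parameter); activations and overhead are excluded.
num_params = sum(p.numel() for p in model.parameters())
mem_gb = sum(p.numel() * p.element_size() for p in model.parameters()) / 1024**3

print(f"Model Size: {num_params:,} parameters")
print(f"Required Memory: {mem_gb:.2f} GB")
```
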
For more details, visit my [GitHub](https://github.com/Agnuxo1).

Thanks for your interest in this model!