---
base_model: meta-llama/Llama-2-7b-hf
tags:
- LoRA
- bittensor
- gradients
license: apache-2.0
---
# Submission for task `submission_test_speed_check`

- Task ID: `sim-task-readme-speed`
- Repo: `submission_test_speed_check`
- Loss: `4.868357499440511`
- Timestamp: 2025-06-30T23:56:20.301535

---
### LoRA Parameters

- Rank: 8
- Alpha: 16
- Dropout: 0.1
- Epochs: 1
- Learning rate: 1e-4
- Batch size: 1
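
For reference, here is a minimal sketch of how the parameters above map onto a PEFT `LoraConfig` and `TrainingArguments`. This is not the actual training script (which is not published with this card); the output directory is an assumption, and PEFT infers the default target modules for Llama-style models.

```python
# Minimal sketch, assuming the PEFT + transformers stack; not the exact
# training script used for this submission.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, TrainingArguments

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

lora_config = LoraConfig(
    r=8,                # Rank
    lora_alpha=16,      # Alpha
    lora_dropout=0.1,   # Dropout
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)

training_args = TrainingArguments(
    output_dir="lora-out",           # assumed output path
    num_train_epochs=1,              # Epochs
    learning_rate=1e-4,              # Learning rate
    per_device_train_batch_size=1,   # Batch size
)
```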
### Example

**Instruction:**
How do you compute the derivative of \( x^2 \)?

**Response:**
The derivative of \( x^2 \) with respect to \( x \) is \( 2x \).
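
To reproduce an exchange like the one above, the adapter can be loaded on top of the base model with PEFT. A hedged sketch follows; the adapter repo id is inferred from the card header and may need a user or organization prefix.

```python
# Sketch of loading this adapter for inference; the repo id below is an
# assumption taken from the card header.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
model = PeftModel.from_pretrained(base, "submission_test_speed_check")

prompt = "How do you compute the derivative of x^2?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```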
### Training Info

This LoRA adapter was trained on a dataset generated dynamically via RAG (retrieval from a FAISS index) based on the task theme.
The dataset consisted of 35 prompt-response examples related to mathematics and programming.
The adapter was trained on CPU in ~7 minutes and uploaded automatically via the Hugging Face API.
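
A rough sketch of the dataset-generation and upload flow described above. Only the FAISS-based retrieval and the 35-example count come from this card; the embedding model, corpus contents, and repo id are all assumptions.

```python
# Hedged sketch of the RAG-style dataset generation and upload described
# above; embedding model, corpus contents, and repo id are assumptions.
import faiss
from huggingface_hub import HfApi
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedder

# Candidate prompt/response pairs on the task theme (illustrative only).
corpus = [
    ("How do you compute the derivative of x^2?",
     "The derivative of x^2 with respect to x is 2x."),
    # ... more mathematics and programming examples ...
]

# Index the prompt embeddings with FAISS.
prompts = [p for p, _ in corpus]
embeddings = embedder.encode(prompts, convert_to_numpy=True).astype("float32")
index = faiss.IndexFlatL2(embeddings.shape[1])
index.add(embeddings)

def build_dataset(theme: str, k: int = 35):
    """Retrieve the k prompt/response pairs closest to the task theme."""
    query = embedder.encode([theme], convert_to_numpy=True).astype("float32")
    _, ids = index.search(query, k)
    return [corpus[i] for i in ids[0]]

# After training, upload the adapter folder (repo id is an assumption).
HfApi().upload_folder(folder_path="lora-out", repo_id="submission_test_speed_check")
```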
---

### Training Curve

![Training Loss](training_loss.png)
### License and Usage

This adapter inherits the license and intended usage of the base model, Meta Llama 2 (Meta AI license), and is provided here for research purposes only as part of participation in Bittensor Subnet 56.