---
base_model: meta-llama/Llama-2-7b-hf
tags:
- LoRA
- bittensor
- gradients
license: apache-2.0
---
# Submission for task `submission_test_speed_check`
- Task ID: `sim-task-readme-speed`
- Repo: `submission_test_speed_check`
- Loss: `4.868357499440511`
- Timestamp: 2025-06-30T23:56:20.301535
---
### LoRA Parameters
- Rank: 8
- Alpha: 16
- Dropout: 0.1
- Epochs: 1
- Learning rate: 1e-4
- Batch size: 1
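For reference, the hyperparameters above map onto a PEFT `LoraConfig` roughly as follows. This is a minimal sketch, not the actual training script (which is not included in this repo); in particular, `target_modules` is an assumption based on a common choice for Llama-2 attention projections.

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Hyperparameters from the list above; target_modules is an assumption,
# not documented in this repo.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.1,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # confirms only the LoRA weights are trainable
```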
### Example
**Instruction:**
How do you compute the derivative of \( x^2 \)?
**Response:**
The derivative of \( x^2 \) with respect to \( x \) is \( 2x \).
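To reproduce an interaction like the one above, the adapter can be loaded on top of the base model with `peft`. This is a minimal sketch: `<this-repo-id>` is a placeholder for this adapter's Hub id, and the plain-text prompt format is an assumption, since the exact training template is not documented here.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-2-7b-hf"
tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id)

# Load this repo's LoRA weights on top of the frozen base model.
model = PeftModel.from_pretrained(base, "<this-repo-id>")

prompt = "How do you compute the derivative of x^2?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```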
### Training Info
This LoRA adapter was trained on a dataset generated dynamically via retrieval-augmented generation (RAG) over a FAISS index built around the task theme.
The dataset consisted of 35 prompt-response examples covering mathematics and programming.
The adapter was trained on CPU in roughly 7 minutes and uploaded automatically via the Hugging Face Hub API.
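The dataset-generation pipeline itself is not included in this repo. The retrieval step might look roughly like the sketch below; the encoder model, the flat L2 index, and the example corpus are all assumptions made for illustration, not details taken from the actual pipeline.

```python
import faiss
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed encoder

# corpus: theme-related passages assumed to have been indexed ahead of time
corpus = [
    "The derivative of x^n is n*x^(n-1).",
    "A Python list is a mutable sequence type.",
]
embeddings = encoder.encode(corpus, convert_to_numpy=True)

index = faiss.IndexFlatL2(embeddings.shape[1])
index.add(embeddings)

# Retrieve the passages most relevant to the task theme; these would then be
# turned into prompt-response pairs for LoRA fine-tuning.
query = encoder.encode(["mathematics and programming basics"], convert_to_numpy=True)
_, hits = index.search(query, 2)
print([corpus[i] for i in hits[0]])
```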
---
### Training Curve

### License and Usage
This adapter inherits the license and intended usage restrictions of the base model, Meta Llama 2 (Llama 2 Community License),
and is provided for research purposes only as part of participation in Bittensor Subnet 56.