---
base_model: meta-llama/Llama-2-7b-hf
tags:
- LoRA
- bittensor
- gradients
license: apache-2.0
---

# Submission for task `submission_test_parallel_code`

🧠 Fine-tuned using LoRA on a dynamic dataset generated from RAG.

- Task ID: `task-parallel-002`
- Repo: `submission_test_parallel_code`
- Loss: `4.86825688680013`
- Timestamp: `2025-06-30T00:34:27.059690`
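
The LoRA technique named above trains two small low-rank matrices instead of updating the full base weights. A minimal NumPy sketch of the merged-weight arithmetic, with hypothetical dimensions and scaling values (not taken from this submission's adapter config):

```python
import numpy as np

# LoRA: rather than updating the full weight W (d_out x d_in), train two
# low-rank factors B (d_out x r) and A (r x d_in), with rank r << d_in.
# Dimensions and alpha below are illustrative, not this model's settings.
d_out, d_in, r, alpha = 8, 16, 2, 4
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))      # frozen base weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-initialized

# Merging the adapter yields the effective weight used at inference time.
W_eff = W + (alpha / r) * (B @ A)

# Because B starts at zero, the adapter is a no-op before training begins.
assert np.allclose(W_eff, W)
```

Zero-initializing `B` is the standard LoRA choice: it guarantees the fine-tune starts exactly at the base model's behavior and only drifts as the factors are trained.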