---
library_name: transformers
pipeline_tag: text-generation
base_model: deepseek-ai/DeepSeek-R1-0528-Qwen3-8B
tags:
- lora
- peft
- decompilation
- code
---
# decompiler-v3
This repository contains fine-tuned weights (adapters merged into the base model) for **deepseek-ai/DeepSeek-R1-0528-Qwen3-8B**.
- **Task:** idiomatic decompilation (assembly → high-level code)
- **Training:** LoRA/DoRA adapters trained with TRL SFT on custom assembly→Dart/Swift pairs
- **How to load (merged):**
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
repo_id = "raafatabualazm/decompiler-v3"
tok = AutoTokenizer.from_pretrained(repo_id, use_fast=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype="bfloat16", trust_remote_code=True)
```
> Replace the repo id with your own if you fork or rename this repository.
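Once loaded, the model can be prompted with an assembly listing and asked for high-level source. A minimal sketch of such a pipeline is below; the prompt wording and the `build_messages`/`decompile` helpers are illustrative assumptions, not the exact format used in training, and the example assumes the model follows the base model's chat template.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO_ID = "raafatabualazm/decompiler-v3"

def build_messages(asm_text: str) -> list[dict]:
    # Illustrative instruction; the exact prompt used during SFT is an assumption.
    return [{
        "role": "user",
        "content": "Decompile the following assembly into idiomatic high-level code:\n\n" + asm_text,
    }]

def decompile(asm_text: str, max_new_tokens: int = 512) -> str:
    tok = AutoTokenizer.from_pretrained(REPO_ID, use_fast=True)
    model = AutoModelForCausalLM.from_pretrained(
        REPO_ID, torch_dtype="bfloat16", trust_remote_code=True
    )
    # Format the request with the model's chat template and generate greedily.
    inputs = tok.apply_chat_template(
        build_messages(asm_text), add_generation_prompt=True, return_tensors="pt"
    )
    out = model.generate(inputs, max_new_tokens=max_new_tokens, do_sample=False)
    # Strip the prompt tokens and return only the newly generated text.
    return tok.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Greedy decoding (`do_sample=False`) is used here because decompilation benefits from deterministic output; adjust generation settings to taste.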