---
license: apache-2.0
base_model: PrimeIntellect/Qwen3-0.6B
tags:
- text-generation
- chinese
- sft
- qwen3
datasets:
- ivanleomk/reverse-chinese-poems
language:
- zh
pipeline_tag: text-generation
---

# Reverse Chinese Text (SFT)

This model is a fine-tuned version of [PrimeIntellect/Qwen3-0.6B](https://huggingface.co/PrimeIntellect/Qwen3-0.6B), trained on the task of reversing Chinese text character by character.
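
The target behavior is easy to state as code. A one-line reference implementation of the task itself (not the model) looks like this; Python string slicing operates on Unicode code points, so each Chinese character counts as one unit:

```python
def reverse_text(text: str) -> str:
    # Reverse a string character by character.
    return text[::-1]

print(reverse_text("床前明月光"))  # 光月明前床
```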

## Training

- **Base Model:** PrimeIntellect/Qwen3-0.6B
- **Method:** Supervised Fine-Tuning (SFT)
- **Dataset:** [ivanleomk/reverse-chinese-poems](https://huggingface.co/datasets/ivanleomk/reverse-chinese-poems)
- **Training Steps:** 200
- **Learning Rate:** 2e-5
- **Batch Size:** 16
- **Framework:** [Prime-RL](https://github.com/PrimeIntellect-ai/prime-rl)
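
A minimal sketch of how one SFT training example could be constructed from a poem line. The prompt template and message layout here are illustrative assumptions, not taken from the dataset or the Prime-RL training config:

```python
SYSTEM_PROMPT = (
    "You are a text reversal assistant. "
    "Given Chinese text, reverse it character by character."
)

def make_sft_example(line: str) -> dict:
    # Build a chat-format SFT sample: the target completion is the
    # character-by-character reversal of the input line.
    # The user prompt wording below is a hypothetical template.
    return {
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"请反转以下文字：{line}"},
            {"role": "assistant", "content": line[::-1]},
        ]
    }

example = make_sft_example("床前明月光")
print(example["messages"][-1]["content"])  # 光月明前床
```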

## Benchmark Results

Evaluated on 1,000 samples from the test set:

| Model | Character Accuracy | Exact Match Rate |
|-------|--------------------|------------------|
| PrimeIntellect/Qwen3-0.6B (base) | 0.10% | 0.00% |
| **ivanleomk/reverse-chinese-text (SFT)** | **63.55%** | **9.60%** |
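
The card does not spell out the metric definitions; one plausible implementation, assuming character accuracy is computed position-wise over the reference length and exact match is full-string equality, is:

```python
def char_accuracy(pred: str, target: str) -> float:
    # Fraction of positions where the prediction matches the target,
    # measured over the target's length: extra predicted characters
    # are ignored, missing ones count as wrong.
    if not target:
        return 0.0
    correct = sum(p == t for p, t in zip(pred, target))
    return correct / len(target)

def exact_match(pred: str, target: str) -> bool:
    # Prediction must reproduce the target exactly.
    return pred == target

print(char_accuracy("光月明前庄", "光月明前床"))  # 0.8 (4 of 5 match)
print(exact_match("光月明前床", "光月明前床"))    # True
```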

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("ivanleomk/reverse-chinese-text")
tokenizer = AutoTokenizer.from_pretrained("ivanleomk/reverse-chinese-text")

messages = [
    {"role": "system", "content": "You are a text reversal assistant. Given Chinese text, reverse it character by character."},
    {"role": "user", "content": "请反转以下文字：床前明月光"},
]

# add_generation_prompt=True appends the assistant turn marker so the
# model continues with its answer rather than another user turn.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output = model.generate(input_ids, max_new_tokens=100)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
# Expected: 光月明前床
```

## License

Apache 2.0