---
base_model: unsloth/qwen3-vl-8b-instruct-unsloth-bnb-4bit
library_name: peft
model_name: qwen3vl_text_sft
tags:
- base_model:adapter:unsloth/qwen3-vl-8b-instruct-unsloth-bnb-4bit
- lora
- sft
- transformers
- trl
- unsloth
licence: license
pipeline_tag: text-generation
---
# Model Card for qwen3vl_text_sft
This model is a PEFT (LoRA) adapter fine-tuned from [unsloth/qwen3-vl-8b-instruct-unsloth-bnb-4bit](https://huggingface.co/unsloth/qwen3-vl-8b-instruct-unsloth-bnb-4bit).
It was trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="your-username/qwen3vl_text_sft", device="cuda")  # placeholder Hub id: replace with this adapter's repository
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
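Since this repository contains a LoRA adapter rather than merged weights, you can also load it explicitly on top of the 4-bit base model with PEFT. The sketch below is illustrative: `your-username/qwen3vl_text_sft` is a placeholder Hub id for this repository, and it assumes the Qwen3-VL base checkpoint resolves through `AutoModelForImageTextToText`.

```python
from peft import PeftModel
from transformers import AutoModelForImageTextToText, AutoProcessor

base_id = "unsloth/qwen3-vl-8b-instruct-unsloth-bnb-4bit"
adapter_id = "your-username/qwen3vl_text_sft"  # placeholder: replace with this repo's Hub id

# Load the 4-bit base model and attach the LoRA adapter on top.
processor = AutoProcessor.from_pretrained(base_id)
model = AutoModelForImageTextToText.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(model, adapter_id)

# Text-only chat turn, formatted through the model's chat template.
messages = [{"role": "user", "content": [{"type": "text", "text": "Hello!"}]}]
inputs = processor.apply_chat_template(
    messages, add_generation_prompt=True, tokenize=True,
    return_dict=True, return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(processor.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```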
## Training procedure
This model was trained with supervised fine-tuning (SFT): the base model was loaded in 4-bit and a LoRA adapter was trained on top of it (see the sketch below).
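As a rough illustration of such a run, here is a minimal TRL `SFTTrainer` setup with a LoRA `peft_config`. The dataset, LoRA rank and target modules, and hyperparameters are placeholder assumptions, not the actual training recipe used for this adapter.

```python
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForImageTextToText
from trl import SFTConfig, SFTTrainer

base_id = "unsloth/qwen3-vl-8b-instruct-unsloth-bnb-4bit"
model = AutoModelForImageTextToText.from_pretrained(base_id, device_map="auto")

# Placeholder conversational dataset in the chat format SFTTrainer expects.
dataset = load_dataset("trl-lib/Capybara", split="train")

peft_config = LoraConfig(
    r=16,  # illustrative LoRA rank, not the actual training value
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

trainer = SFTTrainer(
    model=model,
    train_dataset=dataset,
    args=SFTConfig(output_dir="qwen3vl_text_sft", max_length=2048),
    peft_config=peft_config,
)
trainer.train()
```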
### Framework versions
- PEFT: 0.18.0
- TRL: 0.24.0
- Transformers: 4.57.2
- Pytorch: 2.9.0+cu126
- Datasets: 4.3.0
- Tokenizers: 0.22.1
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |