DmitriyYurckML committed
Commit 67bf074 · verified · 1 Parent(s): d2dc09d

Delete README.md with huggingface_hub

Files changed (1)
  1. README.md +0 -63
README.md DELETED
@@ -1,63 +0,0 @@
- ---
- base_model: unsloth/magistral-small-2509-unsloth-bnb-4bit
- library_name: peft
- model_name: gemma-claude
- tags:
- - base_model:adapter:unsloth/magistral-small-2509-unsloth-bnb-4bit
- - lora
- - sft
- - transformers
- - trl
- - unsloth
- licence: license
- pipeline_tag: text-generation
- ---
-
- # Model Card for gemma-claude
-
- This model is a fine-tuned version of [unsloth/magistral-small-2509-unsloth-bnb-4bit](https://huggingface.co/unsloth/magistral-small-2509-unsloth-bnb-4bit).
- It has been trained using [TRL](https://github.com/huggingface/trl).
-
- ## Quick start
-
- ```python
- from transformers import pipeline
-
- question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
- # The card's template left the model id unrendered as "None"; the repo id below is an assumption.
- generator = pipeline("text-generation", model="DmitriyYurckML/gemma-claude", device="cuda")
- output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
- print(output["generated_text"])
- ```
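
Since the front matter lists `library_name: peft` on a 4-bit Unsloth base, loading the adapter explicitly is an alternative to the pipeline call above. A minimal sketch, assuming the adapter lives at `DmitriyYurckML/gemma-claude` (the deleted card never rendered the repo id):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "unsloth/magistral-small-2509-unsloth-bnb-4bit"
adapter_id = "DmitriyYurckML/gemma-claude"  # assumption: the card left this unrendered

# Load the quantized base model, then attach the LoRA adapter on top of it.
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(model, adapter_id)

# Chat-style generation mirroring the pipeline example above.
messages = [{"role": "user", "content": "Hello!"}]
inputs = tokenizer.apply_chat_template(
    messages, return_tensors="pt", add_generation_prompt=True
).to(model.device)
print(tokenizer.decode(model.generate(inputs, max_new_tokens=128)[0], skip_special_tokens=True))
```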
-
- ## Training procedure
-
- This model was trained with SFT.
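
For reference, here is a minimal TRL SFT + LoRA training sketch consistent with the card's tags (`sft`, `lora`, `trl`, `peft`). The dataset and LoRA hyperparameters are placeholders, not the card's actual setup:

```python
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

# Placeholder dataset; the card does not say what data was used.
dataset = load_dataset("trl-lib/Capybara", split="train")

trainer = SFTTrainer(
    model="unsloth/magistral-small-2509-unsloth-bnb-4bit",  # the card's base model
    train_dataset=dataset,
    args=SFTConfig(output_dir="gemma-claude"),
    # Illustrative LoRA settings; the adapter's actual config is not documented.
    peft_config=LoraConfig(r=16, lora_alpha=32, target_modules="all-linear"),
)
trainer.train()
trainer.save_model()
```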
-
- ### Framework versions
-
- - PEFT: 0.18.1
- - TRL: 0.24.0
- - Transformers: 4.57.3
- - PyTorch: 2.9.1
- - Datasets: 4.3.0
- - Tokenizers: 0.22.2
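
If reproducibility matters, the pinned versions above can be checked at runtime. A small sketch:

```python
# Compare the installed packages against the versions listed in the card.
import datasets, peft, tokenizers, torch, transformers, trl

expected = {
    "PEFT": (peft, "0.18.1"),
    "TRL": (trl, "0.24.0"),
    "Transformers": (transformers, "4.57.3"),
    "PyTorch": (torch, "2.9.1"),
    "Datasets": (datasets, "4.3.0"),
    "Tokenizers": (tokenizers, "0.22.2"),
}
for name, (module, version) in expected.items():
    status = "OK" if module.__version__ == version else f"expected {version}"
    print(f"{name}: {module.__version__} ({status})")
```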
-
- ## Citations
-
- Cite TRL as:
-
- ```bibtex
- @misc{vonwerra2022trl,
-     title = {{TRL: Transformer Reinforcement Learning}},
-     author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
-     year = 2020,
-     journal = {GitHub repository},
-     publisher = {GitHub},
-     howpublished = {\url{https://github.com/huggingface/trl}}
- }
- ```