IkIzma committed on
Commit 010cf80 · 1 Parent(s): e76cc46

Upload FineTune.ipynb

Files changed (1)
  1. FineTune.ipynb +877 -0
FineTune.ipynb ADDED
@@ -0,0 +1,877 @@
+ {
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "BO7MEGbb6mtB"
+ },
+ "source": [
+ "# Finetune\n",
+ "Fine-tuning a RuGPTs model with Hugging Face.\n",
+ "\n",
+ "## Install env"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "metadata": {
+ "collapsed": true,
+ "id": "Xyhc5yrzR75j"
+ },
+ "outputs": [
+ {
+ "name": "stderr",
+ "output_type": "stream",
+ "text": [
+ "Cloning into 'transformers'...\n"
+ ]
+ },
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Defaulting to user installation because normal site-packages is not writeable\n",
+ "Processing /home/kamil/Documents/SHAD/ML/Part 2/Seminar 7/transformers\n",
+ " Installing build dependencies: started\n",
+ " Installing build dependencies: finished with status 'done'\n",
+ " Getting requirements to build wheel: started\n",
+ " Getting requirements to build wheel: finished with status 'done'\n",
+ " Preparing metadata (pyproject.toml): started\n",
+ " Preparing metadata (pyproject.toml): finished with status 'done'\n",
+ "Requirement already satisfied: tqdm>=4.27 in /home/kamil/.local/lib/python3.10/site-packages (from transformers==4.29.0.dev0) (4.65.0)\n",
+ "Requirement already satisfied: regex!=2019.12.17 in /home/kamil/.local/lib/python3.10/site-packages (from transformers==4.29.0.dev0) (2023.3.23)\n",
+ "Requirement already satisfied: requests in /home/kamil/.local/lib/python3.10/site-packages (from transformers==4.29.0.dev0) (2.28.2)\n",
+ "Requirement already satisfied: pyyaml>=5.1 in /usr/lib/python3/dist-packages (from transformers==4.29.0.dev0) (5.4.1)\n",
+ "Requirement already satisfied: filelock in /home/kamil/.local/lib/python3.10/site-packages (from transformers==4.29.0.dev0) (3.10.6)\n",
+ "Requirement already satisfied: tokenizers!=0.11.3,<0.14,>=0.11.1 in /home/kamil/.local/lib/python3.10/site-packages (from transformers==4.29.0.dev0) (0.13.3)\n",
+ "Requirement already satisfied: packaging>=20.0 in /usr/lib/python3/dist-packages (from transformers==4.29.0.dev0) (21.3)\n",
+ "Requirement already satisfied: numpy>=1.17 in /usr/lib/python3/dist-packages (from transformers==4.29.0.dev0) (1.21.5)\n",
+ "Requirement already satisfied: huggingface-hub<1.0,>=0.11.0 in /home/kamil/.local/lib/python3.10/site-packages (from transformers==4.29.0.dev0) (0.13.4)\n",
+ "Requirement already satisfied: typing-extensions>=3.7.4.3 in /home/kamil/.local/lib/python3.10/site-packages (from huggingface-hub<1.0,>=0.11.0->transformers==4.29.0.dev0) (4.5.0)\n",
+ "Requirement already satisfied: certifi>=2017.4.17 in /usr/lib/python3/dist-packages (from requests->transformers==4.29.0.dev0) (2020.6.20)\n",
+ "Requirement already satisfied: charset-normalizer<4,>=2 in /home/kamil/.local/lib/python3.10/site-packages (from requests->transformers==4.29.0.dev0) (3.0.1)\n",
+ "Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/lib/python3/dist-packages (from requests->transformers==4.29.0.dev0) (1.26.5)\n",
+ "Requirement already satisfied: idna<4,>=2.5 in /usr/lib/python3/dist-packages (from requests->transformers==4.29.0.dev0) (3.3)\n",
+ "Building wheels for collected packages: transformers\n",
+ " Building wheel for transformers (pyproject.toml): started\n",
+ " Building wheel for transformers (pyproject.toml): finished with status 'done'\n",
+ " Created wheel for transformers: filename=transformers-4.29.0.dev0-py3-none-any.whl size=6929166 sha256=280057264eb46bc68355d5c5a1a4d2caff1da9951d55bacbaa62463cbf73296c\n",
+ " Stored in directory: /tmp/pip-ephem-wheel-cache-xt8a8mve/wheels/a5/d3/d1/e281e4412399bfd2f44bb86274ac4204a7d53b596a501f2ad1\n",
+ "Successfully built transformers\n",
+ "Installing collected packages: transformers\n",
+ " Attempting uninstall: transformers\n",
+ " Found existing installation: transformers 4.27.4\n",
+ " Uninstalling transformers-4.27.4:\n",
+ " Successfully uninstalled transformers-4.27.4\n",
+ "Successfully installed transformers-4.29.0.dev0\n"
+ ]
+ }
+ ],
+ "source": [
+ "%%bash\n",
+ "git clone https://github.com/huggingface/transformers\n",
+ "cd transformers\n",
+ "pip install ."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {
+ "collapsed": true,
+ "id": "Os4vOL5LTOmk"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Defaulting to user installation because normal site-packages is not writeable\n",
+ "Collecting datasets\n",
+ " Downloading datasets-2.11.0-py3-none-any.whl (468 kB)\n",
+ "\u001b[2K \u001b[38;2;114;156;31m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m468.7/468.7 kB\u001b[0m \u001b[31m3.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m[36m0:00:01\u001b[0mm eta \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: numpy>=1.17 in /usr/lib/python3/dist-packages (from datasets) (1.21.5)\n",
+ "Requirement already satisfied: pyarrow>=8.0.0 in /home/kamil/.local/lib/python3.10/site-packages (from datasets) (11.0.0)\n",
+ "Requirement already satisfied: tqdm>=4.62.1 in /home/kamil/.local/lib/python3.10/site-packages (from datasets) (4.65.0)\n",
+ "Requirement already satisfied: packaging in /usr/lib/python3/dist-packages (from datasets) (21.3)\n",
+ "Collecting dill<0.3.7,>=0.3.0\n",
+ " Downloading dill-0.3.6-py3-none-any.whl (110 kB)\n",
+ "\u001b[2K \u001b[38;2;114;156;31m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m110.5/110.5 kB\u001b[0m \u001b[31m2.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m MB/s\u001b[0m eta \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: pandas in /home/kamil/.local/lib/python3.10/site-packages (from datasets) (1.5.3)\n",
+ "Requirement already satisfied: huggingface-hub<1.0.0,>=0.11.0 in /home/kamil/.local/lib/python3.10/site-packages (from datasets) (0.13.4)\n",
+ "Collecting aiohttp\n",
+ " Downloading aiohttp-3.8.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.0 MB)\n",
+ "\u001b[2K \u001b[38;2;114;156;31m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m1.0/1.0 MB\u001b[0m \u001b[31m9.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m0m eta \u001b[36m0:00:01\u001b[0m0:01\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: requests>=2.19.0 in /home/kamil/.local/lib/python3.10/site-packages (from datasets) (2.28.2)\n",
+ "Collecting fsspec[http]>=2021.11.1\n",
+ " Downloading fsspec-2023.4.0-py3-none-any.whl (153 kB)\n",
+ "\u001b[2K \u001b[38;2;114;156;31m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m154.0/154.0 kB\u001b[0m \u001b[31m3.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0mm eta \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: pyyaml>=5.1 in /usr/lib/python3/dist-packages (from datasets) (5.4.1)\n",
+ "Collecting xxhash\n",
+ " Downloading xxhash-3.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (212 kB)\n",
+ "\u001b[2K \u001b[38;2;114;156;31m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m212.5/212.5 kB\u001b[0m \u001b[31m4.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m31m10.9 MB/s\u001b[0m eta \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hCollecting responses<0.19\n",
+ " Downloading responses-0.18.0-py3-none-any.whl (38 kB)\n",
+ "Collecting multiprocess\n",
+ " Downloading multiprocess-0.70.14-py310-none-any.whl (134 kB)\n",
+ "\u001b[2K \u001b[38;2;114;156;31m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m134.3/134.3 kB\u001b[0m \u001b[31m3.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0mm eta \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: charset-normalizer<4.0,>=2.0 in /home/kamil/.local/lib/python3.10/site-packages (from aiohttp->datasets) (3.0.1)\n",
+ "Collecting multidict<7.0,>=4.5\n",
+ " Downloading multidict-6.0.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (114 kB)\n",
+ "\u001b[2K \u001b[38;2;114;156;31m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m114.5/114.5 kB\u001b[0m \u001b[31m2.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: attrs>=17.3.0 in /usr/lib/python3/dist-packages (from aiohttp->datasets) (21.2.0)\n",
+ "Collecting async-timeout<5.0,>=4.0.0a3\n",
+ " Downloading async_timeout-4.0.2-py3-none-any.whl (5.8 kB)\n",
+ "Collecting frozenlist>=1.1.1\n",
+ " Downloading frozenlist-1.3.3-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (149 kB)\n",
+ "\u001b[2K \u001b[38;2;114;156;31m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m149.6/149.6 kB\u001b[0m \u001b[31m3.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m MB/s\u001b[0m eta \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hCollecting yarl<2.0,>=1.0\n",
+ " Downloading yarl-1.8.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (264 kB)\n",
+ "\u001b[2K \u001b[38;2;114;156;31m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m264.0/264.0 kB\u001b[0m \u001b[31m5.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m31m15.4 MB/s\u001b[0m eta \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hCollecting aiosignal>=1.1.2\n",
+ " Downloading aiosignal-1.3.1-py3-none-any.whl (7.6 kB)\n",
+ "Requirement already satisfied: typing-extensions>=3.7.4.3 in /home/kamil/.local/lib/python3.10/site-packages (from huggingface-hub<1.0.0,>=0.11.0->datasets) (4.5.0)\n",
+ "Requirement already satisfied: filelock in /home/kamil/.local/lib/python3.10/site-packages (from huggingface-hub<1.0.0,>=0.11.0->datasets) (3.10.6)\n",
+ "Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/lib/python3/dist-packages (from requests>=2.19.0->datasets) (1.26.5)\n",
+ "Requirement already satisfied: idna<4,>=2.5 in /usr/lib/python3/dist-packages (from requests>=2.19.0->datasets) (3.3)\n",
+ "Requirement already satisfied: certifi>=2017.4.17 in /usr/lib/python3/dist-packages (from requests>=2.19.0->datasets) (2020.6.20)\n",
+ "Requirement already satisfied: pytz>=2020.1 in /usr/lib/python3/dist-packages (from pandas->datasets) (2022.1)\n",
+ "Requirement already satisfied: python-dateutil>=2.8.1 in /home/kamil/.local/lib/python3.10/site-packages (from pandas->datasets) (2.8.2)\n",
+ "Requirement already satisfied: six>=1.5 in /usr/lib/python3/dist-packages (from python-dateutil>=2.8.1->pandas->datasets) (1.16.0)\n",
+ "Installing collected packages: xxhash, multidict, fsspec, frozenlist, dill, async-timeout, yarl, responses, multiprocess, aiosignal, aiohttp, datasets\n",
+ "Successfully installed aiohttp-3.8.4 aiosignal-1.3.1 async-timeout-4.0.2 datasets-2.11.0 dill-0.3.6 frozenlist-1.3.3 fsspec-2023.4.0 multidict-6.0.4 multiprocess-0.70.14 responses-0.18.0 xxhash-3.2.0 yarl-1.8.2\n"
+ ]
+ }
+ ],
+ "source": [
+ "!pip install datasets"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "metadata": {
+ "collapsed": true,
+ "id": "m1P6WSIeTdV5"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Defaulting to user installation because normal site-packages is not writeable\n",
+ "Collecting evaluate\n",
+ " Downloading evaluate-0.4.0-py3-none-any.whl (81 kB)\n",
+ "\u001b[2K \u001b[38;2;114;156;31m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m81.4/81.4 kB\u001b[0m \u001b[31m1.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m MB/s\u001b[0m eta \u001b[36m0:00:01\u001b[0m\n",
+ "\u001b[?25hRequirement already satisfied: xxhash in /home/kamil/.local/lib/python3.10/site-packages (from evaluate) (3.2.0)\n",
+ "Requirement already satisfied: fsspec[http]>=2021.05.0 in /home/kamil/.local/lib/python3.10/site-packages (from evaluate) (2023.4.0)\n",
+ "Requirement already satisfied: responses<0.19 in /home/kamil/.local/lib/python3.10/site-packages (from evaluate) (0.18.0)\n",
+ "Requirement already satisfied: packaging in /usr/lib/python3/dist-packages (from evaluate) (21.3)\n",
+ "Requirement already satisfied: tqdm>=4.62.1 in /home/kamil/.local/lib/python3.10/site-packages (from evaluate) (4.65.0)\n",
+ "Requirement already satisfied: huggingface-hub>=0.7.0 in /home/kamil/.local/lib/python3.10/site-packages (from evaluate) (0.13.4)\n",
+ "Requirement already satisfied: numpy>=1.17 in /usr/lib/python3/dist-packages (from evaluate) (1.21.5)\n",
+ "Requirement already satisfied: dill in /home/kamil/.local/lib/python3.10/site-packages (from evaluate) (0.3.6)\n",
+ "Requirement already satisfied: requests>=2.19.0 in /home/kamil/.local/lib/python3.10/site-packages (from evaluate) (2.28.2)\n",
+ "Requirement already satisfied: datasets>=2.0.0 in /home/kamil/.local/lib/python3.10/site-packages (from evaluate) (2.11.0)\n",
+ "Requirement already satisfied: multiprocess in /home/kamil/.local/lib/python3.10/site-packages (from evaluate) (0.70.14)\n",
+ "Requirement already satisfied: pandas in /home/kamil/.local/lib/python3.10/site-packages (from evaluate) (1.5.3)\n",
+ "Requirement already satisfied: aiohttp in /home/kamil/.local/lib/python3.10/site-packages (from datasets>=2.0.0->evaluate) (3.8.4)\n",
+ "Requirement already satisfied: pyyaml>=5.1 in /usr/lib/python3/dist-packages (from datasets>=2.0.0->evaluate) (5.4.1)\n",
+ "Requirement already satisfied: pyarrow>=8.0.0 in /home/kamil/.local/lib/python3.10/site-packages (from datasets>=2.0.0->evaluate) (11.0.0)\n",
+ "Requirement already satisfied: filelock in /home/kamil/.local/lib/python3.10/site-packages (from huggingface-hub>=0.7.0->evaluate) (3.10.6)\n",
+ "Requirement already satisfied: typing-extensions>=3.7.4.3 in /home/kamil/.local/lib/python3.10/site-packages (from huggingface-hub>=0.7.0->evaluate) (4.5.0)\n",
+ "Requirement already satisfied: charset-normalizer<4,>=2 in /home/kamil/.local/lib/python3.10/site-packages (from requests>=2.19.0->evaluate) (3.0.1)\n",
+ "Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/lib/python3/dist-packages (from requests>=2.19.0->evaluate) (1.26.5)\n",
+ "Requirement already satisfied: certifi>=2017.4.17 in /usr/lib/python3/dist-packages (from requests>=2.19.0->evaluate) (2020.6.20)\n",
+ "Requirement already satisfied: idna<4,>=2.5 in /usr/lib/python3/dist-packages (from requests>=2.19.0->evaluate) (3.3)\n",
+ "Requirement already satisfied: python-dateutil>=2.8.1 in /home/kamil/.local/lib/python3.10/site-packages (from pandas->evaluate) (2.8.2)\n",
+ "Requirement already satisfied: pytz>=2020.1 in /usr/lib/python3/dist-packages (from pandas->evaluate) (2022.1)\n",
+ "Requirement already satisfied: aiosignal>=1.1.2 in /home/kamil/.local/lib/python3.10/site-packages (from aiohttp->datasets>=2.0.0->evaluate) (1.3.1)\n",
+ "Requirement already satisfied: frozenlist>=1.1.1 in /home/kamil/.local/lib/python3.10/site-packages (from aiohttp->datasets>=2.0.0->evaluate) (1.3.3)\n",
+ "Requirement already satisfied: multidict<7.0,>=4.5 in /home/kamil/.local/lib/python3.10/site-packages (from aiohttp->datasets>=2.0.0->evaluate) (6.0.4)\n",
+ "Requirement already satisfied: attrs>=17.3.0 in /usr/lib/python3/dist-packages (from aiohttp->datasets>=2.0.0->evaluate) (21.2.0)\n",
+ "Requirement already satisfied: yarl<2.0,>=1.0 in /home/kamil/.local/lib/python3.10/site-packages (from aiohttp->datasets>=2.0.0->evaluate) (1.8.2)\n",
+ "Requirement already satisfied: async-timeout<5.0,>=4.0.0a3 in /home/kamil/.local/lib/python3.10/site-packages (from aiohttp->datasets>=2.0.0->evaluate) (4.0.2)\n",
+ "Requirement already satisfied: six>=1.5 in /usr/lib/python3/dist-packages (from python-dateutil>=2.8.1->pandas->evaluate) (1.16.0)\n",
+ "Installing collected packages: evaluate\n",
+ "Successfully installed evaluate-0.4.0\n"
+ ]
+ }
+ ],
+ "source": [
+ "!pip install evaluate"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "metadata": {
+ "id": "WJZtWu8u6nwL"
+ },
+ "outputs": [],
+ "source": [
+ "!mkdir models/"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "WqwZiumW8WbZ"
+ },
+ "source": [
+ "## Download files"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 20,
+ "metadata": {
+ "collapsed": true,
+ "id": "j51bKtQW6nyY"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "--2023-04-16 19:47:12-- https://www.dropbox.com/s/oa3v9c7g9bp40xw/train.txt?dl=0\n",
+ "Resolving www.dropbox.com (www.dropbox.com)... 162.125.70.18, 2620:100:6027:18::a27d:4812\n",
+ "Connecting to www.dropbox.com (www.dropbox.com)|162.125.70.18|:443... connected.\n",
+ "HTTP request sent, awaiting response... 302 Found\n",
+ "Location: /s/raw/oa3v9c7g9bp40xw/train.txt [following]\n",
+ "--2023-04-16 19:47:13-- https://www.dropbox.com/s/raw/oa3v9c7g9bp40xw/train.txt\n",
+ "Reusing existing connection to www.dropbox.com:443.\n",
+ "HTTP request sent, awaiting response... 302 Found\n",
+ "Location: https://uc5788429f15c026c306ed6aa7c0.dl.dropboxusercontent.com/cd/0/inline/B6QRy9JQtzcR-y7uMF3TBS26D_9WsPQhmzXoWmGuHLgFMVq5YeUy4XIvymTf-coW8njd463mquV6DZB7LKdlznygflsCZHNIJ0A8Hf_yyRl2y5rb63wSIyvyBbANSc5DBKvhD4HSmZ-G8GDlRmEf3CXz-PP4jpoQFXwvDZCbIGlStw/file# [following]\n",
+ "--2023-04-16 19:47:13-- https://uc5788429f15c026c306ed6aa7c0.dl.dropboxusercontent.com/cd/0/inline/B6QRy9JQtzcR-y7uMF3TBS26D_9WsPQhmzXoWmGuHLgFMVq5YeUy4XIvymTf-coW8njd463mquV6DZB7LKdlznygflsCZHNIJ0A8Hf_yyRl2y5rb63wSIyvyBbANSc5DBKvhD4HSmZ-G8GDlRmEf3CXz-PP4jpoQFXwvDZCbIGlStw/file\n",
+ "Resolving uc5788429f15c026c306ed6aa7c0.dl.dropboxusercontent.com (uc5788429f15c026c306ed6aa7c0.dl.dropboxusercontent.com)... 162.125.70.15, 2620:100:6028:15::a27d:470f\n",
+ "Connecting to uc5788429f15c026c306ed6aa7c0.dl.dropboxusercontent.com (uc5788429f15c026c306ed6aa7c0.dl.dropboxusercontent.com)|162.125.70.15|:443... connected.\n",
+ "HTTP request sent, awaiting response... 200 OK\n",
+ "Length: 1654900 (1,6M) [text/plain]\n",
+ "Saving to: ‘train.txt’\n",
+ "\n",
+ "train.txt 100%[===================>] 1,58M 8,43MB/s in 0,2s \n",
+ "\n",
+ "2023-04-16 19:47:14 (8,43 MB/s) - ‘train.txt’ saved [1654900/1654900]\n",
+ "\n",
+ "--2023-04-16 19:47:14-- https://www.dropbox.com/s/mworl3ld6r3bg62/valid.txt?dl=0\n",
+ "Resolving www.dropbox.com (www.dropbox.com)... 162.125.70.18, 2620:100:6027:18::a27d:4812\n",
+ "Connecting to www.dropbox.com (www.dropbox.com)|162.125.70.18|:443... connected.\n",
+ "HTTP request sent, awaiting response... 302 Found\n",
+ "Location: /s/raw/mworl3ld6r3bg62/valid.txt [following]\n",
+ "--2023-04-16 19:47:14-- https://www.dropbox.com/s/raw/mworl3ld6r3bg62/valid.txt\n",
+ "Reusing existing connection to www.dropbox.com:443.\n",
+ "HTTP request sent, awaiting response... 302 Found\n",
+ "Location: https://uc5ee48fa1d36195fd1fe094947e.dl.dropboxusercontent.com/cd/0/inline/B6QZm3htPxEoOiKlbNIGQz27I0gnkhm3CfT9DoU9qR3VUmFjo8_GWcsquYc01t4LT6WYRj4t70Sw9Z9DhdBPq4ZFpgiGfN4TyCf4Hav48iIButfo1Aaa31uqnVavn3dRVXKM2CZ5ewiMDDEGDexFnB-ZPHZyomgPCjDRtkdkMvfP7g/file# [following]\n",
+ "--2023-04-16 19:47:15-- https://uc5ee48fa1d36195fd1fe094947e.dl.dropboxusercontent.com/cd/0/inline/B6QZm3htPxEoOiKlbNIGQz27I0gnkhm3CfT9DoU9qR3VUmFjo8_GWcsquYc01t4LT6WYRj4t70Sw9Z9DhdBPq4ZFpgiGfN4TyCf4Hav48iIButfo1Aaa31uqnVavn3dRVXKM2CZ5ewiMDDEGDexFnB-ZPHZyomgPCjDRtkdkMvfP7g/file\n",
+ "Resolving uc5ee48fa1d36195fd1fe094947e.dl.dropboxusercontent.com (uc5ee48fa1d36195fd1fe094947e.dl.dropboxusercontent.com)... 162.125.70.15, 2620:100:6026:15::a27d:460f\n",
+ "Connecting to uc5ee48fa1d36195fd1fe094947e.dl.dropboxusercontent.com (uc5ee48fa1d36195fd1fe094947e.dl.dropboxusercontent.com)|162.125.70.15|:443... connected.\n",
+ "HTTP request sent, awaiting response... 200 OK\n",
+ "Length: 167021 (163K) [text/plain]\n",
+ "Saving to: ‘valid.txt’\n",
+ "\n",
+ "valid.txt 100%[===================>] 163,11K --.-KB/s in 0,08s \n",
+ "\n",
+ "2023-04-16 19:47:15 (2,02 MB/s) - ‘valid.txt’ saved [167021/167021]\n",
+ "\n"
+ ]
+ }
+ ],
+ "source": [
+ "!wget -O train.txt https://www.dropbox.com/s/oa3v9c7g9bp40xw/train.txt?dl=0\n",
+ "!wget -O valid.txt https://www.dropbox.com/s/mworl3ld6r3bg62/valid.txt?dl=0"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "id": "zoyX62qN_38l"
+ },
+ "source": [
+ "## Train \n",
+ "The following code downloads the model and tokenizer from Hugging Face and fine-tunes the model to generate essays."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 24,
+ "metadata": {
+ "collapsed": true,
+ "id": "OCIERP8AS1Dl"
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "04/16/2023 19:47:40 - WARNING - __main__ - Process rank: -1, device: cuda:0, n_gpu: 1distributed training: False, 16-bits training: False\n",
+ "04/16/2023 19:47:40 - INFO - __main__ - Training/evaluation parameters TrainingArguments(\n",
+ "_n_gpu=1,\n",
+ "adafactor=False,\n",
+ "adam_beta1=0.9,\n",
+ "adam_beta2=0.999,\n",
+ "adam_epsilon=1e-08,\n",
+ "auto_find_batch_size=False,\n",
+ "bf16=False,\n",
+ "bf16_full_eval=False,\n",
+ "data_seed=None,\n",
+ "dataloader_drop_last=False,\n",
+ "dataloader_num_workers=0,\n",
+ "dataloader_pin_memory=True,\n",
+ "ddp_bucket_cap_mb=None,\n",
+ "ddp_find_unused_parameters=None,\n",
+ "ddp_timeout=1800,\n",
+ "debug=[],\n",
+ "deepspeed=None,\n",
+ "disable_tqdm=False,\n",
+ "do_eval=True,\n",
+ "do_predict=False,\n",
+ "do_train=True,\n",
+ "eval_accumulation_steps=None,\n",
+ "eval_delay=0,\n",
+ "eval_steps=None,\n",
+ "evaluation_strategy=no,\n",
+ "fp16=False,\n",
+ "fp16_backend=auto,\n",
+ "fp16_full_eval=False,\n",
+ "fp16_opt_level=O1,\n",
+ "fsdp=[],\n",
+ "fsdp_config={'fsdp_min_num_params': 0, 'xla': False, 'xla_fsdp_grad_ckpt': False},\n",
+ "fsdp_min_num_params=0,\n",
+ "fsdp_transformer_layer_cls_to_wrap=None,\n",
+ "full_determinism=False,\n",
+ "gradient_accumulation_steps=1,\n",
+ "gradient_checkpointing=False,\n",
+ "greater_is_better=None,\n",
+ "group_by_length=False,\n",
+ "half_precision_backend=auto,\n",
+ "hub_model_id=None,\n",
+ "hub_private_repo=False,\n",
+ "hub_strategy=every_save,\n",
+ "hub_token=<HUB_TOKEN>,\n",
+ "ignore_data_skip=False,\n",
+ "include_inputs_for_metrics=False,\n",
+ "jit_mode_eval=False,\n",
+ "label_names=None,\n",
+ "label_smoothing_factor=0.0,\n",
+ "learning_rate=5e-05,\n",
+ "length_column_name=length,\n",
+ "load_best_model_at_end=False,\n",
+ "local_rank=-1,\n",
+ "log_level=passive,\n",
+ "log_level_replica=warning,\n",
+ "log_on_each_node=True,\n",
+ "logging_dir=models/essays2/runs/Apr16_19-47-40_kamil-desktop,\n",
+ "logging_first_step=False,\n",
+ "logging_nan_inf_filter=True,\n",
+ "logging_steps=500,\n",
+ "logging_strategy=steps,\n",
+ "lr_scheduler_type=linear,\n",
+ "max_grad_norm=1.0,\n",
+ "max_steps=-1,\n",
+ "metric_for_best_model=None,\n",
+ "mp_parameters=,\n",
+ "no_cuda=False,\n",
+ "num_train_epochs=3.0,\n",
+ "optim=adamw_hf,\n",
+ "optim_args=None,\n",
+ "output_dir=models/essays2,\n",
+ "overwrite_output_dir=False,\n",
+ "past_index=-1,\n",
+ "per_device_eval_batch_size=1,\n",
+ "per_device_train_batch_size=1,\n",
+ "prediction_loss_only=False,\n",
+ "push_to_hub=False,\n",
+ "push_to_hub_model_id=None,\n",
+ "push_to_hub_organization=None,\n",
+ "push_to_hub_token=<PUSH_TO_HUB_TOKEN>,\n",
+ "ray_scope=last,\n",
+ "remove_unused_columns=True,\n",
+ "report_to=[],\n",
+ "resume_from_checkpoint=None,\n",
+ "run_name=models/essays2,\n",
+ "save_on_each_node=False,\n",
+ "save_safetensors=False,\n",
+ "save_steps=500,\n",
+ "save_strategy=steps,\n",
+ "save_total_limit=None,\n",
+ "seed=42,\n",
+ "sharded_ddp=[],\n",
+ "skip_memory_metrics=True,\n",
+ "tf32=None,\n",
+ "torch_compile=False,\n",
+ "torch_compile_backend=None,\n",
+ "torch_compile_mode=None,\n",
+ "torchdynamo=None,\n",
+ "tpu_metrics_debug=False,\n",
+ "tpu_num_cores=None,\n",
+ "use_ipex=False,\n",
+ "use_legacy_prediction_loop=False,\n",
+ "use_mps_device=False,\n",
+ "warmup_ratio=0.0,\n",
+ "warmup_steps=0,\n",
+ "weight_decay=0.0,\n",
+ "xpu_backend=None,\n",
+ ")\n",
+ "04/16/2023 19:47:40 - INFO - datasets.builder - Using custom data configuration default-94a5e2bc6bcfdc2e\n",
+ "04/16/2023 19:47:40 - INFO - datasets.info - Loading Dataset Infos from /home/kamil/.local/lib/python3.10/site-packages/datasets/packaged_modules/text\n",
+ "04/16/2023 19:47:40 - INFO - datasets.builder - Generating dataset text (/home/kamil/.cache/huggingface/datasets/text/default-94a5e2bc6bcfdc2e/0.0.0/cb1e9bd71a82ad27976be3b12b407850fe2837d80c22c5e03a28949843a8ace2)\n",
+ "Downloading and preparing dataset text/default to /home/kamil/.cache/huggingface/datasets/text/default-94a5e2bc6bcfdc2e/0.0.0/cb1e9bd71a82ad27976be3b12b407850fe2837d80c22c5e03a28949843a8ace2...\n",
+ "Downloading data files: 100%|██████████████████| 2/2 [00:00<00:00, 18517.90it/s]\n",
+ "04/16/2023 19:47:40 - INFO - datasets.download.download_manager - Downloading took 0.0 min\n",
+ "04/16/2023 19:47:40 - INFO - datasets.download.download_manager - Checksum Computation took 0.0 min\n",
+ "Extracting data files: 100%|█████████████████████| 2/2 [00:00<00:00, 228.71it/s]\n",
+ "04/16/2023 19:47:40 - INFO - datasets.builder - Generating train split\n",
+ "04/16/2023 19:47:40 - INFO - datasets.builder - Generating validation split\n",
+ "04/16/2023 19:47:40 - INFO - datasets.utils.info_utils - Unable to verify splits sizes.\n",
+ "Dataset text downloaded and prepared to /home/kamil/.cache/huggingface/datasets/text/default-94a5e2bc6bcfdc2e/0.0.0/cb1e9bd71a82ad27976be3b12b407850fe2837d80c22c5e03a28949843a8ace2. Subsequent calls will reuse this data.\n",
+ "100%|███████████████████████████████████████████| 2/2 [00:00<00:00, 1228.20it/s]\n",
+ "Downloading (…)lve/main/config.json: 100%|██████| 608/608 [00:00<00:00, 832kB/s]\n",
+ "[INFO|configuration_utils.py:668] 2023-04-16 19:47:41,750 >> loading configuration file config.json from cache at /home/kamil/.cache/huggingface/hub/models--sberbank-ai--rugpt3small_based_on_gpt2/snapshots/d64244b316057f71e745cc92be1dcfe7853d9d18/config.json\n",
+ "[INFO|configuration_utils.py:720] 2023-04-16 19:47:41,751 >> Model config GPT2Config {\n",
+ " \"_name_or_path\": \"sberbank-ai/rugpt3small_based_on_gpt2\",\n",
+ " \"activation_function\": \"gelu_new\",\n",
+ " \"architectures\": [\n",
+ " \"GPT2LMHeadModel\"\n",
+ " ],\n",
+ " \"attn_pdrop\": 0.1,\n",
+ " \"bos_token_id\": 50256,\n",
+ " \"embd_pdrop\": 0.1,\n",
+ " \"eos_token_id\": 50256,\n",
+ " \"gradient_checkpointing\": false,\n",
+ " \"initializer_range\": 0.02,\n",
+ " \"layer_norm_epsilon\": 1e-05,\n",
+ " \"model_type\": \"gpt2\",\n",
+ " \"n_ctx\": 2048,\n",
+ " \"n_embd\": 768,\n",
+ " \"n_head\": 12,\n",
+ " \"n_inner\": null,\n",
+ " \"n_layer\": 12,\n",
+ " \"n_positions\": 2048,\n",
+ " \"reorder_and_upcast_attn\": false,\n",
+ " \"resid_pdrop\": 0.1,\n",
+ " \"scale_attn_by_inverse_layer_idx\": false,\n",
+ " \"scale_attn_weights\": true,\n",
+ " \"summary_activation\": null,\n",
+ " \"summary_first_dropout\": 0.1,\n",
+ " \"summary_proj_to_labels\": true,\n",
+ " \"summary_type\": \"cls_index\",\n",
+ " \"summary_use_proj\": true,\n",
+ " \"transformers_version\": \"4.29.0.dev0\",\n",
+ " \"use_cache\": true,\n",
+ " \"vocab_size\": 50264\n",
+ "}\n",
+ "\n",
+ "[INFO|tokenization_auto.py:502] 2023-04-16 19:47:42,302 >> Could not locate the tokenizer configuration file, will try to use the model config instead.\n",
+ "[INFO|configuration_utils.py:668] 2023-04-16 19:47:42,851 >> loading configuration file config.json from cache at /home/kamil/.cache/huggingface/hub/models--sberbank-ai--rugpt3small_based_on_gpt2/snapshots/d64244b316057f71e745cc92be1dcfe7853d9d18/config.json\n",
+ "[INFO|configuration_utils.py:720] 2023-04-16 19:47:42,852 >> Model config GPT2Config {\n",
+ " \"_name_or_path\": \"sberbank-ai/rugpt3small_based_on_gpt2\",\n",
+ " \"activation_function\": \"gelu_new\",\n",
+ " \"architectures\": [\n",
+ " \"GPT2LMHeadModel\"\n",
+ " ],\n",
+ " \"attn_pdrop\": 0.1,\n",
+ " \"bos_token_id\": 50256,\n",
+ " \"embd_pdrop\": 0.1,\n",
+ " \"eos_token_id\": 50256,\n",
+ " \"gradient_checkpointing\": false,\n",
+ " \"initializer_range\": 0.02,\n",
+ " \"layer_norm_epsilon\": 1e-05,\n",
+ " \"model_type\": \"gpt2\",\n",
+ " \"n_ctx\": 2048,\n",
+ " \"n_embd\": 768,\n",
+ " \"n_head\": 12,\n",
+ " \"n_inner\": null,\n",
+ " \"n_layer\": 12,\n",
+ " \"n_positions\": 2048,\n",
+ " \"reorder_and_upcast_attn\": false,\n",
+ " \"resid_pdrop\": 0.1,\n",
+ " \"scale_attn_by_inverse_layer_idx\": false,\n",
+ " \"scale_attn_weights\": true,\n",
+ " \"summary_activation\": null,\n",
+ " \"summary_first_dropout\": 0.1,\n",
+ " \"summary_proj_to_labels\": true,\n",
+ " \"summary_type\": \"cls_index\",\n",
+ " \"summary_use_proj\": true,\n",
+ " \"transformers_version\": \"4.29.0.dev0\",\n",
+ " \"use_cache\": true,\n",
+ " \"vocab_size\": 50264\n",
+ "}\n",
+ "\n",
+ "Downloading (…)olve/main/vocab.json: 100%|█| 1.71M/1.71M [00:00<00:00, 3.73MB/s]\n",
+ "Downloading (…)olve/main/merges.txt: 100%|█| 1.27M/1.27M [00:00<00:00, 5.74MB/s]\n",
+ "[INFO|tokenization_utils_base.py:1809] 2023-04-16 19:47:47,652 >> loading file vocab.json from cache at /home/kamil/.cache/huggingface/hub/models--sberbank-ai--rugpt3small_based_on_gpt2/snapshots/d64244b316057f71e745cc92be1dcfe7853d9d18/vocab.json\n",
+ "[INFO|tokenization_utils_base.py:1809] 2023-04-16 19:47:47,652 >> loading file merges.txt from cache at /home/kamil/.cache/huggingface/hub/models--sberbank-ai--rugpt3small_based_on_gpt2/snapshots/d64244b316057f71e745cc92be1dcfe7853d9d18/merges.txt\n",
+ "[INFO|tokenization_utils_base.py:1809] 2023-04-16 19:47:47,652 >> loading file tokenizer.json from cache at None\n",
+ "[INFO|tokenization_utils_base.py:1809] 2023-04-16 19:47:47,652 >> loading file added_tokens.json from cache at None\n",
+ "[INFO|tokenization_utils_base.py:1809] 2023-04-16 19:47:47,652 >> loading file special_tokens_map.json from cache at None\n",
+ "[INFO|tokenization_utils_base.py:1809] 2023-04-16 19:47:47,652 >> loading file tokenizer_config.json from cache at None\n",
+ "[INFO|configuration_utils.py:668] 2023-04-16 19:47:47,652 >> loading configuration file config.json from cache at /home/kamil/.cache/huggingface/hub/models--sberbank-ai--rugpt3small_based_on_gpt2/snapshots/d64244b316057f71e745cc92be1dcfe7853d9d18/config.json\n",
+ "[INFO|configuration_utils.py:720] 2023-04-16 19:47:47,653 >> Model config GPT2Config {\n",
+ " \"_name_or_path\": \"sberbank-ai/rugpt3small_based_on_gpt2\",\n",
+ " \"activation_function\": \"gelu_new\",\n",
514
+ " \"architectures\": [\n",
515
+ " \"GPT2LMHeadModel\"\n",
516
+ " ],\n",
517
+ " \"attn_pdrop\": 0.1,\n",
518
+ " \"bos_token_id\": 50256,\n",
519
+ " \"embd_pdrop\": 0.1,\n",
520
+ " \"eos_token_id\": 50256,\n",
521
+ " \"gradient_checkpointing\": false,\n",
522
+ " \"initializer_range\": 0.02,\n",
523
+ " \"layer_norm_epsilon\": 1e-05,\n",
524
+ " \"model_type\": \"gpt2\",\n",
525
+ " \"n_ctx\": 2048,\n",
526
+ " \"n_embd\": 768,\n",
527
+ " \"n_head\": 12,\n",
528
+ " \"n_inner\": null,\n",
529
+ " \"n_layer\": 12,\n",
530
+ " \"n_positions\": 2048,\n",
531
+ " \"reorder_and_upcast_attn\": false,\n",
532
+ " \"resid_pdrop\": 0.1,\n",
533
+ " \"scale_attn_by_inverse_layer_idx\": false,\n",
534
+ " \"scale_attn_weights\": true,\n",
535
+ " \"summary_activation\": null,\n",
536
+ " \"summary_first_dropout\": 0.1,\n",
537
+ " \"summary_proj_to_labels\": true,\n",
538
+ " \"summary_type\": \"cls_index\",\n",
539
+ " \"summary_use_proj\": true,\n",
540
+ " \"transformers_version\": \"4.29.0.dev0\",\n",
541
+ " \"use_cache\": true,\n",
542
+ " \"vocab_size\": 50264\n",
543
+ "}\n",
544
+ "\n"
545
+ ]
546
+ },
547
+ {
548
+ "name": "stdout",
549
+ "output_type": "stream",
550
+ "text": [
551
+ "[INFO|configuration_utils.py:668] 2023-04-16 19:47:47,725 >> loading configuration file config.json from cache at /home/kamil/.cache/huggingface/hub/models--sberbank-ai--rugpt3small_based_on_gpt2/snapshots/d64244b316057f71e745cc92be1dcfe7853d9d18/config.json\n",
552
+ "[INFO|configuration_utils.py:720] 2023-04-16 19:47:47,725 >> Model config GPT2Config {\n",
553
+ " \"_name_or_path\": \"sberbank-ai/rugpt3small_based_on_gpt2\",\n",
554
+ " \"activation_function\": \"gelu_new\",\n",
555
+ " \"architectures\": [\n",
556
+ " \"GPT2LMHeadModel\"\n",
557
+ " ],\n",
558
+ " \"attn_pdrop\": 0.1,\n",
559
+ " \"bos_token_id\": 50256,\n",
560
+ " \"embd_pdrop\": 0.1,\n",
561
+ " \"eos_token_id\": 50256,\n",
562
+ " \"gradient_checkpointing\": false,\n",
563
+ " \"initializer_range\": 0.02,\n",
564
+ " \"layer_norm_epsilon\": 1e-05,\n",
565
+ " \"model_type\": \"gpt2\",\n",
566
+ " \"n_ctx\": 2048,\n",
567
+ " \"n_embd\": 768,\n",
568
+ " \"n_head\": 12,\n",
569
+ " \"n_inner\": null,\n",
570
+ " \"n_layer\": 12,\n",
571
+ " \"n_positions\": 2048,\n",
572
+ " \"reorder_and_upcast_attn\": false,\n",
573
+ " \"resid_pdrop\": 0.1,\n",
574
+ " \"scale_attn_by_inverse_layer_idx\": false,\n",
575
+ " \"scale_attn_weights\": true,\n",
576
+ " \"summary_activation\": null,\n",
577
+ " \"summary_first_dropout\": 0.1,\n",
578
+ " \"summary_proj_to_labels\": true,\n",
579
+ " \"summary_type\": \"cls_index\",\n",
580
+ " \"summary_use_proj\": true,\n",
581
+ " \"transformers_version\": \"4.29.0.dev0\",\n",
582
+ " \"use_cache\": true,\n",
583
+ " \"vocab_size\": 50264\n",
584
+ "}\n",
585
+ "\n",
586
+ "[WARNING|logging.py:280] 2023-04-16 19:47:47,765 >> Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.\n",
587
+ "Downloading pytorch_model.bin: 100%|█████████| 551M/551M [00:36<00:00, 15.2MB/s]\n",
588
+ "[INFO|modeling_utils.py:2534] 2023-04-16 19:48:24,907 >> loading weights file pytorch_model.bin from cache at /home/kamil/.cache/huggingface/hub/models--sberbank-ai--rugpt3small_based_on_gpt2/snapshots/d64244b316057f71e745cc92be1dcfe7853d9d18/pytorch_model.bin\n",
589
+ "[INFO|configuration_utils.py:575] 2023-04-16 19:48:25,102 >> Generate config GenerationConfig {\n",
590
+ " \"_from_model_config\": true,\n",
591
+ " \"bos_token_id\": 50256,\n",
592
+ " \"eos_token_id\": 50256,\n",
593
+ " \"transformers_version\": \"4.29.0.dev0\"\n",
594
+ "}\n",
595
+ "\n",
596
+ "[INFO|modeling_utils.py:3190] 2023-04-16 19:48:26,046 >> All model checkpoint weights were used when initializing GPT2LMHeadModel.\n",
597
+ "\n",
598
+ "[INFO|modeling_utils.py:3198] 2023-04-16 19:48:26,046 >> All the weights of GPT2LMHeadModel were initialized from the model checkpoint at sberbank-ai/rugpt3small_based_on_gpt2.\n",
599
+ "If your task is similar to the task the model of the checkpoint was trained on, you can already use GPT2LMHeadModel for predictions without further training.\n",
600
+ "[INFO|modeling_utils.py:2839] 2023-04-16 19:48:26,570 >> Generation config file not found, using a generation config created from the model config.\n",
601
+ "Running tokenizer on dataset: 0%| | 0/720 [00:00<?, ? examples/s]04/16/2023 19:48:26 - INFO - datasets.arrow_dataset - Caching processed dataset at /home/kamil/.cache/huggingface/datasets/text/default-94a5e2bc6bcfdc2e/0.0.0/cb1e9bd71a82ad27976be3b12b407850fe2837d80c22c5e03a28949843a8ace2/cache-02f845f719a14b5e.arrow\n",
602
+ "Running tokenizer on dataset: 0%| | 0/80 [00:00<?, ? examples/s]04/16/2023 19:48:26 - INFO - datasets.arrow_dataset - Caching processed dataset at /home/kamil/.cache/huggingface/datasets/text/default-94a5e2bc6bcfdc2e/0.0.0/cb1e9bd71a82ad27976be3b12b407850fe2837d80c22c5e03a28949843a8ace2/cache-b826039bbf723e44.arrow\n",
603
+ "Grouping texts in chunks of 2048: 0%| | 0/720 [00:00<?, ? examples/s]04/16/2023 19:48:26 - INFO - datasets.arrow_dataset - Caching processed dataset at /home/kamil/.cache/huggingface/datasets/text/default-94a5e2bc6bcfdc2e/0.0.0/cb1e9bd71a82ad27976be3b12b407850fe2837d80c22c5e03a28949843a8ace2/cache-218f03f685e5499a.arrow\n",
604
+ "Grouping texts in chunks of 2048: 0%| | 0/80 [00:00<?, ? examples/s]04/16/2023 19:48:27 - INFO - datasets.arrow_dataset - Caching processed dataset at /home/kamil/.cache/huggingface/datasets/text/default-94a5e2bc6bcfdc2e/0.0.0/cb1e9bd71a82ad27976be3b12b407850fe2837d80c22c5e03a28949843a8ace2/cache-ad1660cb69988af0.arrow\n",
605
+ "Downloading builder script: 100%|██████████| 4.20k/4.20k [00:00<00:00, 4.48MB/s]\n",
606
+ "/home/kamil/.local/lib/python3.10/site-packages/transformers/optimization.py:391: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\n",
607
+ " warnings.warn(\n",
608
+ "[INFO|trainer.py:1769] 2023-04-16 19:48:28,570 >> ***** Running training *****\n",
609
+ "[INFO|trainer.py:1770] 2023-04-16 19:48:28,570 >> Num examples = 92\n",
610
+ "[INFO|trainer.py:1771] 2023-04-16 19:48:28,570 >> Num Epochs = 3\n",
611
+ "[INFO|trainer.py:1772] 2023-04-16 19:48:28,570 >> Instantaneous batch size per device = 1\n",
612
+ "[INFO|trainer.py:1773] 2023-04-16 19:48:28,570 >> Total train batch size (w. parallel, distributed & accumulation) = 1\n",
613
+ "[INFO|trainer.py:1774] 2023-04-16 19:48:28,570 >> Gradient Accumulation steps = 1\n",
614
+ "[INFO|trainer.py:1775] 2023-04-16 19:48:28,570 >> Total optimization steps = 276\n",
615
+ "[INFO|trainer.py:1776] 2023-04-16 19:48:28,570 >> Number of trainable parameters = 125,231,616\n",
616
+ " 0%| | 0/276 [00:00<?, ?it/s]Traceback (most recent call last):\n",
617
+ " File \"/home/kamil/Documents/SHAD/ML/Part 2/Seminar 7/run_clm.py\", line 635, in <module>\n",
618
+ " main()\n",
619
+ " File \"/home/kamil/Documents/SHAD/ML/Part 2/Seminar 7/run_clm.py\", line 583, in main\n",
620
+ " train_result = trainer.train(resume_from_checkpoint=checkpoint)\n",
621
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/transformers/trainer.py\", line 1662, in train\n",
622
+ " return inner_training_loop(\n",
623
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/transformers/trainer.py\", line 1929, in _inner_training_loop\n",
624
+ " tr_loss_step = self.training_step(model, inputs)\n",
625
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/transformers/trainer.py\", line 2699, in training_step\n",
626
+ " loss = self.compute_loss(model, inputs)\n",
627
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/transformers/trainer.py\", line 2731, in compute_loss\n",
628
+ " outputs = model(**inputs)\n",
629
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/torch/nn/modules/module.py\", line 1501, in _call_impl\n",
630
+ " return forward_call(*args, **kwargs)\n",
631
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/transformers/models/gpt2/modeling_gpt2.py\", line 1075, in forward\n",
632
+ " transformer_outputs = self.transformer(\n",
633
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/torch/nn/modules/module.py\", line 1501, in _call_impl\n",
634
+ " return forward_call(*args, **kwargs)\n",
635
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/transformers/models/gpt2/modeling_gpt2.py\", line 899, in forward\n",
636
+ " outputs = block(\n",
637
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/torch/nn/modules/module.py\", line 1501, in _call_impl\n",
638
+ " return forward_call(*args, **kwargs)\n",
639
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/transformers/models/gpt2/modeling_gpt2.py\", line 389, in forward\n",
640
+ " attn_outputs = self.attn(\n",
641
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/torch/nn/modules/module.py\", line 1501, in _call_impl\n",
642
+ " return forward_call(*args, **kwargs)\n",
643
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/transformers/models/gpt2/modeling_gpt2.py\", line 330, in forward\n",
644
+ " attn_output, attn_weights = self._attn(query, key, value, attention_mask, head_mask)\n",
645
+ " File \"/home/kamil/.local/lib/python3.10/site-packages/transformers/models/gpt2/modeling_gpt2.py\", line 185, in _attn\n",
646
+ " attn_weights = attn_weights / torch.full(\n",
647
+ "torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 192.00 MiB (GPU 0; 7.79 GiB total capacity; 6.07 GiB already allocated; 171.81 MiB free; 6.09 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF\n",
648
+ " 0%| | 0/276 [00:01<?, ?it/s]\n"
649
+ ]
650
+ }
651
+ ],
652
+ "source": [
653
+ "!python3 run_clm.py \\\n",
654
+ " --model_name_or_path sberbank-ai/rugpt3small_based_on_gpt2 \\\n",
655
+ " --train_file train.txt \\\n",
656
+ " --validation_file valid.txt \\\n",
657
+ " --per_device_train_batch_size 1 \\\n",
658
+ " --per_device_eval_batch_size 1 \\\n",
659
+ " --block_size 2048 \\\n",
660
+ " --dataset_config_name plain_text \\\n",
661
+ " --do_train \\\n",
662
+ " --do_eval \\\n",
663
+ " --output_dir models/essays2"
664
+ ]
665
+ },
666
+ {
667
+ "cell_type": "markdown",
668
+ "metadata": {
669
+ "id": "QvgntLymArg3"
670
+ },
671
+ "source": [
672
+ "## Evaluate model"
673
+ ]
674
+ },
675
+ {
676
+ "cell_type": "code",
677
+ "execution_count": 4,
678
+ "metadata": {
679
+ "id": "csHcDJXFDdaW"
680
+ },
681
+ "outputs": [],
682
+ "source": [
683
+ "import numpy as np\n",
684
+ "import torch"
685
+ ]
686
+ },
687
+ {
688
+ "cell_type": "code",
689
+ "execution_count": 5,
690
+ "metadata": {
691
+ "id": "TJxPg-cJDhAB"
692
+ },
693
+ "outputs": [
694
+ {
695
+ "data": {
696
+ "text/plain": [
697
+ "<torch._C.Generator at 0x7fe5314a4c50>"
698
+ ]
699
+ },
700
+ "execution_count": 5,
701
+ "metadata": {},
702
+ "output_type": "execute_result"
703
+ }
704
+ ],
705
+ "source": [
706
+ "np.random.seed(42)\n",
707
+ "torch.manual_seed(42)"
708
+ ]
709
+ },
710
+ {
711
+ "cell_type": "code",
712
+ "execution_count": 6,
713
+ "metadata": {
714
+ "id": "AkUrzKsy_16F"
715
+ },
716
+ "outputs": [],
717
+ "source": [
718
+ "from transformers import GPT2LMHeadModel, GPT2Tokenizer"
719
+ ]
720
+ },
721
+ {
722
+ "cell_type": "code",
723
+ "execution_count": 25,
724
+ "metadata": {
725
+ "id": "x_EMbgO0BTvb"
726
+ },
727
+ "outputs": [],
728
+ "source": [
729
+ "tok = GPT2Tokenizer.from_pretrained(\"models/essays\")"
730
+ ]
731
+ },
732
+ {
733
+ "cell_type": "code",
734
+ "execution_count": 26,
735
+ "metadata": {
736
+ "id": "Fjy0GAuQBYpA"
737
+ },
738
+ "outputs": [],
739
+ "source": [
740
+ "model = GPT2LMHeadModel.from_pretrained(\"models/essays\")"
741
+ ]
742
+ },
743
+ {
744
+ "cell_type": "code",
745
+ "execution_count": 27,
746
+ "metadata": {
747
+ "collapsed": true,
748
+ "id": "irh4H-HDBb6V"
749
+ },
750
+ "outputs": [
751
+ {
752
+ "data": {
753
+ "text/plain": [
754
+ "GPT2LMHeadModel(\n",
755
+ " (transformer): GPT2Model(\n",
756
+ " (wte): Embedding(50264, 768)\n",
757
+ " (wpe): Embedding(2048, 768)\n",
758
+ " (drop): Dropout(p=0.1, inplace=False)\n",
759
+ " (h): ModuleList(\n",
760
+ " (0-11): 12 x GPT2Block(\n",
761
+ " (ln_1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)\n",
762
+ " (attn): GPT2Attention(\n",
763
+ " (c_attn): Conv1D()\n",
764
+ " (c_proj): Conv1D()\n",
765
+ " (attn_dropout): Dropout(p=0.1, inplace=False)\n",
766
+ " (resid_dropout): Dropout(p=0.1, inplace=False)\n",
767
+ " )\n",
768
+ " (ln_2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)\n",
769
+ " (mlp): GPT2MLP(\n",
770
+ " (c_fc): Conv1D()\n",
771
+ " (c_proj): Conv1D()\n",
772
+ " (act): NewGELUActivation()\n",
773
+ " (dropout): Dropout(p=0.1, inplace=False)\n",
774
+ " )\n",
775
+ " )\n",
776
+ " )\n",
777
+ " (ln_f): LayerNorm((768,), eps=1e-05, elementwise_affine=True)\n",
778
+ " )\n",
779
+ " (lm_head): Linear(in_features=768, out_features=50264, bias=False)\n",
780
+ ")"
781
+ ]
782
+ },
783
+ "execution_count": 27,
784
+ "metadata": {},
785
+ "output_type": "execute_result"
786
+ }
787
+ ],
788
+ "source": [
789
+ "model.cuda()"
790
+ ]
791
+ },
792
+ {
793
+ "cell_type": "code",
794
+ "execution_count": 31,
795
+ "metadata": {
796
+ "id": "hQY6A5q7Bd4O"
797
+ },
798
+ "outputs": [],
799
+ "source": [
800
+ "text = \"<s>Тема: «В чем смысл жизни?»\\nСочинение: \"\n",
801
+ "inpt = tok.encode(text, return_tensors=\"pt\")"
802
+ ]
803
+ },
804
+ {
805
+ "cell_type": "code",
806
+ "execution_count": 32,
807
+ "metadata": {
808
+ "id": "1gfJFmeOBj_t"
809
+ },
810
+ "outputs": [
811
+ {
812
+ "name": "stderr",
813
+ "output_type": "stream",
814
+ "text": [
815
+ "The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results.\n",
816
+ "Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.\n"
817
+ ]
818
+ }
819
+ ],
820
+ "source": [
821
+ "out = model.generate(inpt.cuda(), max_length=200, repetition_penalty=5.0, do_sample=True, top_k=5, top_p=0.95, temperature=1)"
822
+ ]
823
+ },
824
+ {
825
+ "cell_type": "code",
826
+ "execution_count": 33,
827
+ "metadata": {
828
+ "colab": {
829
+ "base_uri": "https://localhost:8080/",
830
+ "height": 123
831
+ },
832
+ "id": "gWZ9SUCxB2Ki",
833
+ "outputId": "31d8e1a3-376f-4f27-bd11-ba59a44983eb"
834
+ },
835
+ "outputs": [
836
+ {
837
+ "name": "stdout",
838
+ "output_type": "stream",
839
+ "text": [
840
+ "<s>Тема: «В чем смысл жизни?»\n",
841
+ "Сочинение: 📹Как часто в наше время мы слышим фразу \"жить надо так, чтобы было хорошо всем\". Однако не все могут себе позволить жить по-другому. В современном мире многие люди хотят изменить свою жизнь к лучшему и сделать ее комфортной для всех без исключения граждан нашей страны.</span] (по пьесе Мольера) \n",
842
+ " Существование системы образования является одним из важнейших условий становления цивилизованного общества на земле – формирования личности гражданина как носителя социально значимых ценностей.Начнем с определения понятия образование - наука о человеке или его способности организовывать свои действия во внешней среде посредством усвоения знаний об окружающем нас обществе; рассмотрим основные функции обучения, которые выполняет школьное учреждение : формирование у детей культуры общения со взрослыми людьми через обучение умению общаться при помощи словаря иностранных слов — это важнейший фактор социализации человека от рождения до смерти || • воспитание нравственности между детьми дошкольного возраста /под ред Н\n"
843
+ ]
844
+ }
845
+ ],
846
+ "source": [
847
+ "print(tok.decode(out[0]))"
848
+ ]
849
+ }
850
+ ],
851
+ "metadata": {
852
+ "accelerator": "GPU",
853
+ "colab": {
854
+ "name": "RuGPT3FinetuneHF.ipynb",
855
+ "provenance": []
856
+ },
857
+ "kernelspec": {
858
+ "display_name": "Python 3 (ipykernel)",
859
+ "language": "python",
860
+ "name": "python3"
861
+ },
862
+ "language_info": {
863
+ "codemirror_mode": {
864
+ "name": "ipython",
865
+ "version": 3
866
+ },
867
+ "file_extension": ".py",
868
+ "mimetype": "text/x-python",
869
+ "name": "python",
870
+ "nbconvert_exporter": "python",
871
+ "pygments_lexer": "ipython3",
872
+ "version": "3.10.6"
873
+ }
874
+ },
875
+ "nbformat": 4,
876
+ "nbformat_minor": 1
877
+ }
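The training run in this notebook fails with `torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 192.00 MiB` on an ~8 GiB GPU. That figure is not arbitrary: it is exactly the size of one layer's fp32 attention-score tensor at the settings used above (`n_head = 12`, `block_size = 2048`, batch size 1). A back-of-envelope check (a sketch outside the notebook, using only numbers taken from the run above):

```python
# GPT-2 materializes a per-layer attention-score tensor of shape
# (batch, n_head, seq_len, seq_len); at float32 each element is 4 bytes.
# Values below come from the failing run: batch 1, n_head 12, block_size 2048.
batch, n_head, seq_len, bytes_per_elem = 1, 12, 2048, 4
attn_bytes = batch * n_head * seq_len * seq_len * bytes_per_elem
print(attn_bytes / 2**20, "MiB")  # 192.0 MiB -- the exact allocation in the traceback
```

Since the score tensor grows quadratically with `block_size`, lowering it (e.g. to 512 or 1024) is the most direct fix; `--gradient_checkpointing` and `--fp16` are standard `TrainingArguments` flags that also reduce peak memory, though none of these were verified against this exact setup.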