DANGDOCAO committed on
Commit
270c44a
·
verified ·
1 Parent(s): e8943c2

Delete HVU_QA

HVU_QA/30ktrain.json DELETED
@@ -1,3 +0,0 @@
1
- version https://git-lfs.github.com/spec/v1
2
- oid sha256:a1989e727e0bb58732b6cd569c87fe2e474816d69064d0769f34cd307544d2fa
3
- size 11786416
 
HVU_QA/README.md DELETED
@@ -1,285 +0,0 @@
1
- # HVU_QA
2
-
3
- **HVU_QA** is a project dedicated to sharing datasets and tools for **Question Generation (QG)** in Natural Language Processing, developed and maintained by the research team at **Hung Vuong University (HVU), Phu Tho, Vietnam**.
4
- The project is supported by Hung Vuong University and aims to advance research and applications in low-resource language processing, particularly for Vietnamese.
5
-
6
- ---
7
-
8
- ## 📚 Overview
9
-
10
- This repository enables you to:
11
-
12
- 1. Fine-tune the [VietAI/vit5-base](https://huggingface.co/VietAI/vit5-base) model on your own QG dataset.
13
- 2. Generate multiple, diverse questions given a user-provided text passage (context).
14
-
15
- ---
16
-
17
- ## 📁 Datasets
18
-
19
- * Built following the **SQuAD v2.0 standard**, ensuring compatibility with NLP pipelines.
20
- * Includes tens of thousands of high-quality **Question–Context–Answer triples (QCA)**.
21
- * Suitable for both **training** and **evaluation**.
22
-
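The SQuAD v2.0 layout can be made concrete with a minimal, hypothetical record; the field names follow the standard, but the text below is illustrative and not taken from `30ktrain.json`:

```python
import json

# A minimal, hypothetical record in the SQuAD v2.0 layout.
raw = """
{
  "data": [
    {
      "title": "Ca phe sua da",
      "paragraphs": [
        {
          "context": "Iced milk coffee is a famous drink in Vietnam.",
          "qas": [
            {
              "question": "What drink is famous in Vietnam?",
              "is_impossible": false,
              "answers": [{"text": "Iced milk coffee", "answer_start": 0}]
            }
          ]
        }
      ]
    }
  ]
}
"""
squad = json.loads(raw)

# Flatten into Question-Context-Answer (QCA) triples, skipping
# unanswerable entries as SQuAD v2.0 allows.
triples = [
    (qa["question"], para["context"], qa["answers"][0]["text"])
    for article in squad["data"]
    for para in article["paragraphs"]
    for qa in para["qas"]
    if not qa.get("is_impossible", False) and qa.get("answers")
]
```

Any SQuAD-v2.0-compatible tooling can consume a file shaped this way, which is what makes the dataset drop-in for existing NLP pipelines.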
23
- ---
24
-
25
- ## 📁 Vietnamese Question Generation Tool
26
-
27
- A **command-line tool** for:
28
-
29
- * **Fine-tuning** a question generation model.
30
- * **Automatically generating questions** from Vietnamese text.
31
-
32
- Built on **Hugging Face Transformers (VietAI/vit5-base)** and **PyTorch**.
33
-
34
- ---
35
-
36
- ## Features
37
-
38
- * Fine-tune a question generation model with SQuAD v2.0 format data.
39
- * Generate diverse and creative questions from text passages.
40
- * Flexible generation parameters (`top-k`, `top-p`, `temperature`, etc.).
41
- * Simple command-line usage.
42
- * GPU support if available.
43
-
44
- ---
45
-
46
- ## 📊 Evaluation Results
47
-
48
- We conducted both **manual evaluation** (500 samples) and **automatic evaluation** (1,000 samples).
49
-
50
- | Evaluation Type | Precision | Recall | F1-Score |
51
- |------------------|-----------|--------|----------|
52
- | Automatic (1000) | 0.85 | 0.83 | 0.84 |
53
- | Manual (500) | 0.88 | 0.86 | 0.87 |
54
-
55
- ➡️ The model generates diverse, grammatically correct, and contextually appropriate questions.
56
-
57
- ---
58
-
59
- ## Creation Process
60
-
61
- The dataset was built using a **4-stage automated pipeline**:
62
-
63
- 1. Select relevant QA websites from trusted sources.
64
- 2. Automatic crawling to collect raw QA pages.
65
- 3. Semantic tag extraction to obtain clean Question–Context–Answer triples.
66
- 4. AI-assisted filtering to remove noisy or inconsistent samples.
67
-
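As a hedged sketch of stage 4, a filter might apply rules like the following; these heuristics are illustrative assumptions, not the project's actual AI-assisted criteria:

```python
# Hypothetical stage-4 filter. The real pipeline is AI-assisted; the rules
# below are simple illustrative heuristics only.
def is_clean(question: str, context: str, answer: str) -> bool:
    if len(question) < 5 or not answer:
        return False                          # drop empty or tiny samples
    if answer.lower() not in context.lower():
        return False                          # answer must occur in the context
    return question.strip().endswith("?")     # require a well-formed question

samples = [
    ("What drink is famous in Vietnam?",
     "Iced milk coffee is a famous drink in Vietnam.", "Iced milk coffee"),
    ("??", "Some context.", ""),              # noisy sample: filtered out
]
clean = [s for s in samples if is_clean(*s)]
```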
68
- ---
69
-
70
- ## 📝 Quality Evaluation
71
-
72
- A **VietAI/vit5-base** model fine-tuned on **HVU_QA** achieved:
73
-
74
- * **BLEU Score**: 90.61
75
- * **Semantic similarity**: 97.0% (cosine ≥ 0.8)
76
- * **Human evaluation**:
77
- * Grammar: **4.58 / 5**
78
- * Usefulness: **4.29 / 5**
79
-
80
- ➡️ These results confirm that **HVU_QA is a high-quality resource** for developing robust FAQ-style question generation models.
81
-
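To make the cosine criterion concrete, here is a minimal sketch over bag-of-words vectors; the reported 97.0% figure presumably comes from proper sentence embeddings, so this only illustrates how the "cosine ≥ 0.8" threshold is applied:

```python
import math
from collections import Counter

# Bag-of-words cosine similarity (illustrative stand-in for an
# embedding-based similarity; the >= 0.8 decision works the same way).
def cosine(a: str, b: str) -> float:
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

reference = "what drink is famous in vietnam"
generated = "what drink is famous in vietnam today"
is_similar = cosine(reference, generated) >= 0.8   # counted as semantically similar
```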
82
- ---
83
-
84
- ## 📂 Project Structure
85
-
86
- ```
87
- HVU_QA/
88
- ├── t5-viet-qg-finetuned/
89
- ├── fine_tune_qg.py
90
- ├── generate_question.py
91
- ├── 30ktrain.json
92
- └── README.md
93
- ```
94
- > All data files are UTF-8 encoded and ready for use in NLP pipelines.
95
-
96
- ---
97
-
98
- ## 🛠️ Requirements
99
-
100
- * Python 3.8+
101
- * PyTorch >= 1.9
102
- * Transformers >= 4.30
103
- * scikit-learn
104
- * Fine-tuned model (download at: [link](https://huggingface.co/datasets/DANGDOCAO/GeneratingQuestions/tree/main))
105
-
106
- ---
107
-
108
- ## ⚙️ Setup
109
-
110
- ### 🛠️ Step 1: Download and Extract
111
-
112
- 1. Download `HVU_QA.zip`
113
- 2. Extract into a folder, e.g.:
114
-
115
- ```
116
- D:\your\HVU_QA
117
- ```
118
-
119
- ### 🛠️ Step 2: Add to Environment Path (if needed)
120
-
121
- 1. Open **System Properties → Environment Variables**
122
- 2. Select `Path` → **Edit** → **New**
123
- 3. Add the path, e.g.:
124
-
125
- ```
126
- D:\your\HVU_QA
127
- ```
128
-
129
- ### 🛠️ Step 3: Open in Visual Studio Code
130
-
131
- ```
132
- File > Open Folder > D:\your\HVU_QA
133
- ```
134
-
135
- ### 🛠️ Step 4: Install Required Libraries
136
-
137
- Open **Terminal** and run:
138
-
139
- #### Windows (PowerShell)
140
-
141
- **Required only**
142
-
143
- ```powershell
144
- python -m pip install --upgrade pip
145
- pip install torch transformers datasets scikit-learn sentencepiece safetensors
146
- ```
147
-
148
- **Required + Optional**
149
-
150
- ```powershell
151
- python -m pip install --upgrade pip
152
- pip install torch transformers datasets scikit-learn sentencepiece safetensors accelerate tensorboard evaluate sacrebleu rouge-score nltk
153
- ```
154
-
155
- #### Linux / macOS (bash/zsh)
156
-
157
- **Required only**
158
-
159
- ```bash
160
- python3 -m pip install --upgrade pip
161
- pip install torch transformers datasets scikit-learn sentencepiece safetensors
162
- ```
163
-
164
- **Required + Optional**
165
-
166
- ```bash
167
- python3 -m pip install --upgrade pip
168
- pip install torch transformers datasets scikit-learn sentencepiece safetensors accelerate tensorboard evaluate sacrebleu rouge-score nltk
169
- ```
170
-
171
- ✅ Verify installation:
172
-
173
- * Windows (PowerShell)
174
-
175
- ```powershell
176
- python -c "import torch, transformers, datasets, sklearn, sentencepiece, safetensors, accelerate, tensorboard, evaluate, sacrebleu, rouge_score, nltk; print('✅ All dependencies installed correctly!')"
177
- ```
178
-
179
- * Linux/macOS
180
-
181
- ```bash
182
- python3 -c "import torch, transformers, datasets, sklearn, sentencepiece, safetensors, accelerate, tensorboard, evaluate, sacrebleu, rouge_score, nltk; print('✅ All dependencies installed correctly!')"
183
- ```
184
-
185
- ---
186
-
187
- ## Usage
188
-
189
- * Train and evaluate a question generation model.
190
- * Develop Vietnamese NLP tools.
191
- * Conduct linguistic research.
192
-
193
- ### Training (Fine-tuning)
194
-
195
- When you run `fine_tune_qg.py`, the script will:
196
-
197
- 1. Load the dataset from **`30ktrain.json`**
198
- 2. Fine-tune the `VietAI/vit5-base` model
199
- 3. Save the trained model into a new folder named **`t5-viet-qg-finetuned/`**
200
-
201
- Run:
202
-
203
- ```bash
204
- python fine_tune_qg.py
205
- ```
206
-
207
- ### Generating Questions
208
-
209
- ```bash
210
- python generate_question.py
211
- ```
212
-
213
- **Example:**
214
-
215
- ```
216
- Input passage:
217
- Iced milk coffee (Cà phê sữa đá) is a famous drink in Vietnam.
218
-
219
- Number of questions: 5
220
- ```
221
-
222
- ✅ Output:
223
-
224
- 1. What type of coffee is famous in Vietnam?
225
- 2. Why is iced milk coffee popular?
226
- 3. What ingredients are included in iced milk coffee?
227
- 4. Where does iced milk coffee originate from?
228
- 5. How is Vietnamese iced milk coffee prepared?
229
-
230
- ---
231
-
232
- ## ⚙️ Generation Settings
233
-
234
- In `generate_question.py`, you can adjust:
235
-
236
- * `top_k`, `top_p`, `temperature`, `no_repeat_ngram_size`, `repetition_penalty`
237
-
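A toy sketch of what `top_k` and `top_p` do to the sampling pool; during real generation, `model.generate()` applies the same filtering over the full vocabulary logits (the distribution below is made up):

```python
# Toy demonstration of top-k / top-p (nucleus) filtering over
# next-token probabilities before sampling.
def filter_top_k_top_p(probs, top_k=60, top_p=0.95):
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    ranked = ranked[:top_k]            # top-k: keep the k most likely tokens
    kept, cumulative = [], 0.0
    for token, p in ranked:            # top-p: cut off once the nucleus
        kept.append(token)             # reaches probability mass top_p
        cumulative += p
        if cumulative >= top_p:
            break
    return kept

probs = {"what": 0.5, "why": 0.3, "how": 0.15, "who": 0.05}
pool = filter_top_k_top_p(probs, top_k=3, top_p=0.9)   # ["what", "why", "how"]
```

Lower `top_p` or `top_k` shrinks the pool (more conservative questions); higher `temperature` flattens the distribution before sampling from the pool (more diverse questions).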
238
- ---
239
-
240
- ## 🤝 Contribution
241
-
242
- We welcome contributions:
243
-
244
- * Open issues
245
- * Submit pull requests
246
- * Suggest improvements or add datasets
247
-
248
- ---
249
-
250
- ## 📄 Citation
251
-
252
- If you use this repository or datasets in research, please cite:
253
-
254
- **Ha Nguyen-Tien, Phuc Le-Hong, Dang Do-Cao, Cuong Nguyen-Hung, Chung Mai-Van. 2025. A Method to Build QA Corpora for Low-Resource Languages. Proceedings of KSE 2025. ACM TALLIP.**
255
-
256
- ### 📚 BibTeX
257
-
258
- ```bibtex
259
- @inproceedings{nguyen2025hvuqa,
260
- title={A Method to Build QA Corpora for Low-Resource Languages},
261
- author={Ha Nguyen-Tien and Phuc Le-Hong and Dang Do-Cao and Cuong Nguyen-Hung and Chung Mai-Van},
262
- booktitle={Proceedings of KSE 2025},
263
- year={2025}
264
- }
265
- ```
266
-
267
- ---
268
-
269
- ## 📬 Contact
270
-
271
- * **Ha Nguyen-Tien** (Corresponding author)
272
- 📧 [nguyentienha@hvu.edu.vn](mailto:nguyentienha@hvu.edu.vn)
273
-
274
- * **Phuc Le-Hong**
275
- 📧 [Lehongphuc20021408@gmail.com](mailto:Lehongphuc20021408@gmail.com)
276
-
277
- * **Dang Do-Cao**
278
- 📧 [docaodang532001@gmail.com](mailto:docaodang532001@gmail.com)
279
-
280
- 📍 Faculty of Engineering and Technology, Hung Vuong University, Phu Tho, Vietnam
281
- 🌐 [https://hvu.edu.vn](https://hvu.edu.vn)
282
-
283
- ---
284
-
285
- *This repository is part of our ongoing effort to support Vietnamese NLP and make language technology more accessible for low-resource and underrepresented languages.*
 
HVU_QA/fine_tune_qg.py DELETED
@@ -1,102 +0,0 @@
1
- import json
2
- from datasets import Dataset
3
- from sklearn.model_selection import train_test_split
4
- from transformers import (
5
- T5Tokenizer,
6
- T5ForConditionalGeneration,
7
- TrainingArguments,
8
- Trainer
9
- )
10
-
11
- def load_squad_data(file_path):
12
- with open(file_path, "r", encoding="utf-8") as f:
13
- squad_data = json.load(f)
14
-
15
- data = []
16
- for article in squad_data["data"]:
17
- context = article.get("title", "")  # the article title serves as the context in this dataset
18
- for paragraph in article["paragraphs"]:
19
- for qa in paragraph["qas"]:
20
- if not qa.get("is_impossible", False) and qa.get("answers"):
21
- answer = qa["answers"][0]["text"]
22
- question = qa["question"]
23
- input_text = f"answer: {answer} context: {context}"
24
- data.append({"input": input_text, "target": question})
25
- return data
26
-
27
- def preprocess_function(example, tokenizer, max_input_length=512, max_target_length=64):
28
- model_inputs = tokenizer(
29
- example["input"],
30
- max_length=max_input_length,
31
- padding="max_length",
32
- truncation=True,
33
- )
34
- labels = tokenizer(
35
- text_target=example["target"],
36
- max_length=max_target_length,
37
- padding="max_length",
38
- truncation=True,
39
- )
40
- model_inputs["labels"] = labels["input_ids"]
41
- return model_inputs
42
-
43
- def main():
44
- data_path = "30ktrain.json"
45
- output_dir = "t5-viet-qg-finetuned"
46
- logs_dir = "logs"
47
- model_name = "VietAI/vit5-base"
48
-
49
- print("Loading model and tokenizer...")
50
- tokenizer = T5Tokenizer.from_pretrained(model_name)
51
- model = T5ForConditionalGeneration.from_pretrained(model_name)
52
-
53
- print("Reading and splitting the data...")
54
- raw_data = load_squad_data(data_path)
55
- train_data, val_data = train_test_split(raw_data, test_size=0.2, random_state=42)
56
-
57
- train_dataset = Dataset.from_list(train_data)
58
- val_dataset = Dataset.from_list(val_data)
59
-
60
- tokenized_train = train_dataset.map(
61
- lambda x: preprocess_function(x, tokenizer),
62
- batched=True,
63
- remove_columns=["input", "target"]
64
- )
65
- tokenized_val = val_dataset.map(
66
- lambda x: preprocess_function(x, tokenizer),
67
- batched=True,
68
- remove_columns=["input", "target"]
69
- )
70
-
71
- print("Configuring training...")
72
- training_args = TrainingArguments(
73
- output_dir=output_dir,
74
- overwrite_output_dir=True,
75
- per_device_train_batch_size=1,
76
- gradient_accumulation_steps=1,
77
- num_train_epochs=3,
78
- learning_rate=2e-4,
79
- weight_decay=0.01,
80
- warmup_steps=0,
81
- logging_dir=logs_dir,
82
- logging_steps=10,
83
- fp16=False
84
- )
85
-
86
- print("Training the model...")
87
- trainer = Trainer(
88
- model=model,
89
- args=training_args,
90
- train_dataset=tokenized_train,
91
- eval_dataset=tokenized_val,
92
- tokenizer=tokenizer,
93
- )
94
- trainer.train()
95
-
96
- print("Saving the model...")
97
- model.save_pretrained(output_dir)
98
- tokenizer.save_pretrained(output_dir)
99
- print("Training complete!")
100
-
101
- if __name__ == "__main__":
102
- main()
 
HVU_QA/generate_question.py DELETED
@@ -1,134 +0,0 @@
1
- import json
2
- from difflib import SequenceMatcher
3
- from transformers import T5Tokenizer, T5ForConditionalGeneration
4
- from transformers.utils import logging as hf_logging
5
-
6
- hf_logging.set_verbosity_error()
7
-
8
- MODEL_DIR = "t5-viet-qg-finetuned"
9
- DATA_PATH = "30ktrain.json"
10
-
11
- tokenizer = T5Tokenizer.from_pretrained(MODEL_DIR)
12
- model = T5ForConditionalGeneration.from_pretrained(MODEL_DIR)
13
-
14
- def find_best_match_from_context(user_context, squad_data):
15
- best_score, best_entry = 0.0, None
16
- ui = user_context.lower()
17
-
18
- for article in squad_data.get("data", []):
19
- context_title = article.get("title", "")
20
- score_title = SequenceMatcher(None, ui, context_title.lower()).ratio()
21
-
22
- for paragraph in article.get("paragraphs", []):
23
- for qa in paragraph.get("qas", []):
24
- answers = qa.get("answers", [])
25
- if not answers:
26
- continue
27
- answer_text = answers[0].get("text", "").strip()
28
- question_text = qa.get("question", "").strip()
29
-
30
- score = score_title  # matching is scored on the article title only
31
- if score > best_score:
32
- best_score = score
33
- best_entry = (context_title, answer_text, question_text)
34
-
35
- return best_entry
36
-
37
- def _near_duplicate(q, seen, thr=0.90):
38
- for s in seen:
39
- if SequenceMatcher(None, q, s).ratio() >= thr:
40
- return True
41
- return False
42
-
43
- def generate_questions(user_context,
44
- total_questions=20,
45
- batch_size=10,
46
- top_k=60,
47
- top_p=0.95,
48
- temperature=0.9,
49
- max_input_len=512,
50
- max_new_tokens=64):
51
- with open(DATA_PATH, "r", encoding="utf-8") as f:
52
- squad_data = json.load(f)
53
-
54
- best_entry = find_best_match_from_context(user_context, squad_data)
55
- if best_entry is None:
56
- print("No suitable data found in the JSON file.")
57
- return
58
-
59
- _, answer, _ = best_entry
60
-
61
- input_text = f"answer: {answer} context: {user_context}"
62
- inputs = tokenizer(
63
- input_text,
64
- return_tensors="pt",
65
- truncation=True,
66
- max_length=max_input_len
67
- )
68
-
69
- unique_questions = []
70
- remaining = total_questions
71
-
72
- while remaining > 0:
73
- n = min(batch_size, remaining)
74
- outputs = model.generate(
75
- **inputs,
76
- do_sample=True,
77
- top_k=top_k,
78
- top_p=top_p,
79
- temperature=temperature,
80
- max_new_tokens=max_new_tokens,
81
- num_return_sequences=n,
82
- no_repeat_ngram_size=3,
83
- repetition_penalty=1.12
84
- )
85
-
86
- for out in outputs:
87
- q = tokenizer.decode(out, skip_special_tokens=True).strip()
88
- if len(q) < 5:
89
- continue
90
- if not _near_duplicate(q, unique_questions, thr=0.90):
91
- unique_questions.append(q)
92
-
93
- remaining = total_questions - len(unique_questions)
94
- if remaining <= 0:
95
- break
96
-
97
- unique_questions = unique_questions[:total_questions]
98
-
99
- print("Newly generated questions:")
100
- for i, q in enumerate(unique_questions, 1):
101
- print(f"{i}. {q}")
102
-
103
- if __name__ == "__main__":
104
- user_context = input("\nEnter a text passage:\n ").strip()
105
-
106
- raw_n = input("\nEnter the number of questions you need: ").strip()
107
- if raw_n == "":
108
- total_questions = 20
109
- else:
110
- try:
111
- total_questions = int(raw_n)
112
- except ValueError:
113
- print("Invalid value. Using the default of 20.")
114
- total_questions = 20
115
-
116
- if total_questions < 1:
117
- total_questions = 1
118
- if total_questions > 200:
119
- total_questions = 200
120
-
121
- batch_size = 20 if total_questions >= 30 else min(20, total_questions)
122
-
123
- print("\nAnalyzing the data...\n")
124
-
125
- generate_questions(
126
- user_context=user_context,
127
- total_questions=total_questions,
128
- batch_size=batch_size,
129
- top_k=60,
130
- top_p=0.95,
131
- temperature=0.9,
132
- max_input_len=512,
133
- max_new_tokens=64
134
- )
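The near-duplicate check used by `generate_question.py` can be exercised on its own: a candidate question is skipped when its `SequenceMatcher` ratio against any already accepted question reaches the threshold (0.90 in the script).

```python
from difflib import SequenceMatcher

# Standalone version of the near-duplicate filter from generate_question.py.
def near_duplicate(question, seen, thr=0.90):
    return any(SequenceMatcher(None, question, s).ratio() >= thr for s in seen)

accepted = ["What type of coffee is famous in Vietnam?"]
is_dup = near_duplicate("What type of coffee is famous in Vietnam ?", accepted)
is_new = near_duplicate("How is Vietnamese iced milk coffee prepared?", accepted)
```

Raising the threshold keeps more near-paraphrases; lowering it enforces stronger diversity at the cost of discarding more candidates.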
 
HVU_QA/t5-viet-qg-finetuned/added_tokens.json DELETED
@@ -1,98 +0,0 @@
1
- {
2
- "<extra_id_0>": 36095,
3
- "<extra_id_10>": 36085,
4
- "<extra_id_11>": 36084,
5
- "<extra_id_12>": 36083,
6
- "<extra_id_13>": 36082,
7
- "<extra_id_14>": 36081,
8
- "<extra_id_15>": 36080,
9
- "<extra_id_16>": 36079,
10
- "<extra_id_17>": 36078,
11
- "<extra_id_18>": 36077,
12
- "<extra_id_19>": 36076,
13
- "<extra_id_1>": 36094,
14
- "<extra_id_20>": 36075,
15
- "<extra_id_21>": 36074,
16
- "<extra_id_22>": 36073,
17
- "<extra_id_23>": 36072,
18
- "<extra_id_24>": 36071,
19
- "<extra_id_25>": 36070,
20
- "<extra_id_26>": 36069,
21
- "<extra_id_27>": 36068,
22
- "<extra_id_28>": 36067,
23
- "<extra_id_29>": 36066,
24
- "<extra_id_2>": 36093,
25
- "<extra_id_30>": 36065,
26
- "<extra_id_31>": 36064,
27
- "<extra_id_32>": 36063,
28
- "<extra_id_33>": 36062,
29
- "<extra_id_34>": 36061,
30
- "<extra_id_35>": 36060,
31
- "<extra_id_36>": 36059,
32
- "<extra_id_37>": 36058,
33
- "<extra_id_38>": 36057,
34
- "<extra_id_39>": 36056,
35
- "<extra_id_3>": 36092,
36
- "<extra_id_40>": 36055,
37
- "<extra_id_41>": 36054,
38
- "<extra_id_42>": 36053,
39
- "<extra_id_43>": 36052,
40
- "<extra_id_44>": 36051,
41
- "<extra_id_45>": 36050,
42
- "<extra_id_46>": 36049,
43
- "<extra_id_47>": 36048,
44
- "<extra_id_48>": 36047,
45
- "<extra_id_49>": 36046,
46
- "<extra_id_4>": 36091,
47
- "<extra_id_50>": 36045,
48
- "<extra_id_51>": 36044,
49
- "<extra_id_52>": 36043,
50
- "<extra_id_53>": 36042,
51
- "<extra_id_54>": 36041,
52
- "<extra_id_55>": 36040,
53
- "<extra_id_56>": 36039,
54
- "<extra_id_57>": 36038,
55
- "<extra_id_58>": 36037,
56
- "<extra_id_59>": 36036,
57
- "<extra_id_5>": 36090,
58
- "<extra_id_60>": 36035,
59
- "<extra_id_61>": 36034,
60
- "<extra_id_62>": 36033,
61
- "<extra_id_63>": 36032,
62
- "<extra_id_64>": 36031,
63
- "<extra_id_65>": 36030,
64
- "<extra_id_66>": 36029,
65
- "<extra_id_67>": 36028,
66
- "<extra_id_68>": 36027,
67
- "<extra_id_69>": 36026,
68
- "<extra_id_6>": 36089,
69
- "<extra_id_70>": 36025,
70
- "<extra_id_71>": 36024,
71
- "<extra_id_72>": 36023,
72
- "<extra_id_73>": 36022,
73
- "<extra_id_74>": 36021,
74
- "<extra_id_75>": 36020,
75
- "<extra_id_76>": 36019,
76
- "<extra_id_77>": 36018,
77
- "<extra_id_78>": 36017,
78
- "<extra_id_79>": 36016,
79
- "<extra_id_7>": 36088,
80
- "<extra_id_80>": 36015,
81
- "<extra_id_81>": 36014,
82
- "<extra_id_82>": 36013,
83
- "<extra_id_83>": 36012,
84
- "<extra_id_84>": 36011,
85
- "<extra_id_85>": 36010,
86
- "<extra_id_86>": 36009,
87
- "<extra_id_87>": 36008,
88
- "<extra_id_88>": 36007,
89
- "<extra_id_89>": 36006,
90
- "<extra_id_8>": 36087,
91
- "<extra_id_90>": 36005,
92
- "<extra_id_91>": 36004,
93
- "<extra_id_92>": 36003,
94
- "<extra_id_93>": 36002,
95
- "<extra_id_94>": 36001,
96
- "<extra_id_95>": 36000,
97
- "<extra_id_9>": 36086
98
- }
 
HVU_QA/t5-viet-qg-finetuned/checkpoint-72/added_tokens.json DELETED
@@ -1,98 +0,0 @@
1
- {
2
- "<extra_id_0>": 36095,
3
- "<extra_id_10>": 36085,
4
- "<extra_id_11>": 36084,
5
- "<extra_id_12>": 36083,
6
- "<extra_id_13>": 36082,
7
- "<extra_id_14>": 36081,
8
- "<extra_id_15>": 36080,
9
- "<extra_id_16>": 36079,
10
- "<extra_id_17>": 36078,
11
- "<extra_id_18>": 36077,
12
- "<extra_id_19>": 36076,
13
- "<extra_id_1>": 36094,
14
- "<extra_id_20>": 36075,
15
- "<extra_id_21>": 36074,
16
- "<extra_id_22>": 36073,
17
- "<extra_id_23>": 36072,
18
- "<extra_id_24>": 36071,
19
- "<extra_id_25>": 36070,
20
- "<extra_id_26>": 36069,
21
- "<extra_id_27>": 36068,
22
- "<extra_id_28>": 36067,
23
- "<extra_id_29>": 36066,
24
- "<extra_id_2>": 36093,
25
- "<extra_id_30>": 36065,
26
- "<extra_id_31>": 36064,
27
- "<extra_id_32>": 36063,
28
- "<extra_id_33>": 36062,
29
- "<extra_id_34>": 36061,
30
- "<extra_id_35>": 36060,
31
- "<extra_id_36>": 36059,
32
- "<extra_id_37>": 36058,
33
- "<extra_id_38>": 36057,
34
- "<extra_id_39>": 36056,
35
- "<extra_id_3>": 36092,
36
- "<extra_id_40>": 36055,
37
- "<extra_id_41>": 36054,
38
- "<extra_id_42>": 36053,
39
- "<extra_id_43>": 36052,
40
- "<extra_id_44>": 36051,
41
- "<extra_id_45>": 36050,
42
- "<extra_id_46>": 36049,
43
- "<extra_id_47>": 36048,
44
- "<extra_id_48>": 36047,
45
- "<extra_id_49>": 36046,
46
- "<extra_id_4>": 36091,
47
- "<extra_id_50>": 36045,
48
- "<extra_id_51>": 36044,
49
- "<extra_id_52>": 36043,
50
- "<extra_id_53>": 36042,
51
- "<extra_id_54>": 36041,
52
- "<extra_id_55>": 36040,
53
- "<extra_id_56>": 36039,
54
- "<extra_id_57>": 36038,
55
- "<extra_id_58>": 36037,
56
- "<extra_id_59>": 36036,
57
- "<extra_id_5>": 36090,
58
- "<extra_id_60>": 36035,
59
- "<extra_id_61>": 36034,
60
- "<extra_id_62>": 36033,
61
- "<extra_id_63>": 36032,
62
- "<extra_id_64>": 36031,
63
- "<extra_id_65>": 36030,
64
- "<extra_id_66>": 36029,
65
- "<extra_id_67>": 36028,
66
- "<extra_id_68>": 36027,
67
- "<extra_id_69>": 36026,
68
- "<extra_id_6>": 36089,
69
- "<extra_id_70>": 36025,
70
- "<extra_id_71>": 36024,
71
- "<extra_id_72>": 36023,
72
- "<extra_id_73>": 36022,
73
- "<extra_id_74>": 36021,
74
- "<extra_id_75>": 36020,
75
- "<extra_id_76>": 36019,
76
- "<extra_id_77>": 36018,
77
- "<extra_id_78>": 36017,
78
- "<extra_id_79>": 36016,
79
- "<extra_id_7>": 36088,
80
- "<extra_id_80>": 36015,
81
- "<extra_id_81>": 36014,
82
- "<extra_id_82>": 36013,
83
- "<extra_id_83>": 36012,
84
- "<extra_id_84>": 36011,
85
- "<extra_id_85>": 36010,
86
- "<extra_id_86>": 36009,
87
- "<extra_id_87>": 36008,
88
- "<extra_id_88>": 36007,
89
- "<extra_id_89>": 36006,
90
- "<extra_id_8>": 36087,
91
- "<extra_id_90>": 36005,
92
- "<extra_id_91>": 36004,
93
- "<extra_id_92>": 36003,
94
- "<extra_id_93>": 36002,
95
- "<extra_id_94>": 36001,
96
- "<extra_id_95>": 36000,
97
- "<extra_id_9>": 36086
98
- }
 
HVU_QA/t5-viet-qg-finetuned/checkpoint-72/config.json DELETED
@@ -1,31 +0,0 @@
1
- {
2
- "architectures": [
3
- "T5ForConditionalGeneration"
4
- ],
5
- "classifier_dropout": 0.0,
6
- "d_ff": 3072,
7
- "d_kv": 64,
8
- "d_model": 768,
9
- "decoder_start_token_id": 0,
10
- "dense_act_fn": "relu",
11
- "dropout_rate": 0.1,
12
- "eos_token_id": 1,
13
- "feed_forward_proj": "relu",
14
- "initializer_factor": 1.0,
15
- "is_encoder_decoder": true,
16
- "is_gated_act": false,
17
- "layer_norm_epsilon": 1e-06,
18
- "model_type": "t5",
19
- "n_positions": 512,
20
- "num_decoder_layers": 12,
21
- "num_heads": 12,
22
- "num_layers": 12,
23
- "output_past": true,
24
- "pad_token_id": 0,
25
- "relative_attention_max_distance": 128,
26
- "relative_attention_num_buckets": 32,
27
- "torch_dtype": "float32",
28
- "transformers_version": "4.55.2",
29
- "use_cache": true,
30
- "vocab_size": 36096
31
- }
 
HVU_QA/t5-viet-qg-finetuned/checkpoint-72/generation_config.json DELETED
@@ -1,7 +0,0 @@
1
- {
2
- "_from_model_config": true,
3
- "decoder_start_token_id": 0,
4
- "eos_token_id": 1,
5
- "pad_token_id": 0,
6
- "transformers_version": "4.55.2"
7
- }
 
HVU_QA/t5-viet-qg-finetuned/checkpoint-72/rng_state.pth DELETED
Binary file (14.6 kB)
 
HVU_QA/t5-viet-qg-finetuned/checkpoint-72/scheduler.pt DELETED
Binary file (1.47 kB)
 
HVU_QA/t5-viet-qg-finetuned/checkpoint-72/special_tokens_map.json DELETED
@@ -1,121 +0,0 @@
1
- {
2
- "additional_special_tokens": [
3
- "<extra_id_0>",
4
- "<extra_id_1>",
5
- "<extra_id_2>",
6
- "<extra_id_3>",
7
- "<extra_id_4>",
8
- "<extra_id_5>",
9
- "<extra_id_6>",
10
- "<extra_id_7>",
11
- "<extra_id_8>",
12
- "<extra_id_9>",
13
- "<extra_id_10>",
14
- "<extra_id_11>",
15
- "<extra_id_12>",
16
- "<extra_id_13>",
17
- "<extra_id_14>",
18
- "<extra_id_15>",
19
- "<extra_id_16>",
20
- "<extra_id_17>",
21
- "<extra_id_18>",
22
- "<extra_id_19>",
23
- "<extra_id_20>",
24
- "<extra_id_21>",
25
- "<extra_id_22>",
26
- "<extra_id_23>",
27
- "<extra_id_24>",
28
- "<extra_id_25>",
29
- "<extra_id_26>",
30
- "<extra_id_27>",
31
- "<extra_id_28>",
32
- "<extra_id_29>",
33
- "<extra_id_30>",
34
- "<extra_id_31>",
35
- "<extra_id_32>",
36
- "<extra_id_33>",
37
- "<extra_id_34>",
38
- "<extra_id_35>",
39
- "<extra_id_36>",
40
- "<extra_id_37>",
41
- "<extra_id_38>",
42
- "<extra_id_39>",
43
- "<extra_id_40>",
44
- "<extra_id_41>",
45
- "<extra_id_42>",
46
- "<extra_id_43>",
47
- "<extra_id_44>",
48
- "<extra_id_45>",
49
- "<extra_id_46>",
50
- "<extra_id_47>",
51
- "<extra_id_48>",
52
- "<extra_id_49>",
53
- "<extra_id_50>",
54
- "<extra_id_51>",
55
- "<extra_id_52>",
56
- "<extra_id_53>",
57
- "<extra_id_54>",
58
- "<extra_id_55>",
59
- "<extra_id_56>",
60
- "<extra_id_57>",
61
- "<extra_id_58>",
62
- "<extra_id_59>",
63
- "<extra_id_60>",
64
- "<extra_id_61>",
65
- "<extra_id_62>",
66
- "<extra_id_63>",
67
- "<extra_id_64>",
68
- "<extra_id_65>",
69
- "<extra_id_66>",
70
- "<extra_id_67>",
71
- "<extra_id_68>",
72
- "<extra_id_69>",
73
- "<extra_id_70>",
74
- "<extra_id_71>",
75
- "<extra_id_72>",
76
- "<extra_id_73>",
77
- "<extra_id_74>",
78
- "<extra_id_75>",
79
- "<extra_id_76>",
80
- "<extra_id_77>",
81
- "<extra_id_78>",
82
- "<extra_id_79>",
83
- "<extra_id_80>",
84
- "<extra_id_81>",
85
- "<extra_id_82>",
86
- "<extra_id_83>",
87
- "<extra_id_84>",
88
- "<extra_id_85>",
89
- "<extra_id_86>",
90
- "<extra_id_87>",
91
- "<extra_id_88>",
92
- "<extra_id_89>",
93
- "<extra_id_90>",
94
- "<extra_id_91>",
95
- "<extra_id_92>",
96
- "<extra_id_93>",
97
- "<extra_id_94>",
98
- "<extra_id_95>"
99
- ],
100
- "eos_token": {
101
- "content": "</s>",
102
- "lstrip": false,
103
- "normalized": false,
104
- "rstrip": false,
105
- "single_word": false
106
- },
107
- "pad_token": {
108
- "content": "<pad>",
109
- "lstrip": false,
110
- "normalized": false,
111
- "rstrip": false,
112
- "single_word": false
113
- },
114
- "unk_token": {
115
- "content": "<unk>",
116
- "lstrip": false,
117
- "normalized": false,
118
- "rstrip": false,
119
- "single_word": false
120
- }
121
- }
 
HVU_QA/t5-viet-qg-finetuned/checkpoint-72/spiece.model DELETED
@@ -1,3 +0,0 @@
1
- version https://git-lfs.github.com/spec/v1
2
- oid sha256:59986b62f9f0b90edafb9b073ea7b93d21114a5841219a1ea2399ade73f729c6
3
- size 820370
 
HVU_QA/t5-viet-qg-finetuned/checkpoint-72/tokenizer_config.json DELETED
@@ -1,905 +0,0 @@
1
- {
2
- "add_prefix_space": true,
3
- "added_tokens_decoder": {
4
- "0": {
5
- "content": "<pad>",
6
- "lstrip": false,
7
- "normalized": false,
8
- "rstrip": false,
9
- "single_word": false,
10
- "special": true
11
- },
12
- "1": {
13
- "content": "</s>",
14
- "lstrip": false,
15
- "normalized": false,
16
- "rstrip": false,
17
- "single_word": false,
18
- "special": true
19
- },
20
- "2": {
21
- "content": "<unk>",
22
- "lstrip": false,
23
- "normalized": false,
24
- "rstrip": false,
25
- "single_word": false,
26
- "special": true
27
- },
28
- "36000": {
29
- "content": "<extra_id_95>",
30
- "lstrip": true,
31
- "normalized": false,
32
- "rstrip": true,
33
- "single_word": false,
34
- "special": true
35
- },
36
- "36001": {
37
- "content": "<extra_id_94>",
38
- "lstrip": true,
39
- "normalized": false,
40
- "rstrip": true,
41
- "single_word": false,
42
- "special": true
43
- },
44
- "36002": {
45
- "content": "<extra_id_93>",
46
- "lstrip": true,
47
- "normalized": false,
48
- "rstrip": true,
49
- "single_word": false,
50
- "special": true
51
- },
52
- "36003": {
53
- "content": "<extra_id_92>",
54
- "lstrip": true,
55
- "normalized": false,
56
- "rstrip": true,
57
- "single_word": false,
58
- "special": true
59
- },
60
- "36004": {
61
- "content": "<extra_id_91>",
62
- "lstrip": true,
63
- "normalized": false,
64
- "rstrip": true,
65
- "single_word": false,
66
- "special": true
67
- },
68
- "36005": {
69
- "content": "<extra_id_90>",
70
- "lstrip": true,
71
- "normalized": false,
72
- "rstrip": true,
73
- "single_word": false,
74
- "special": true
75
- },
76
- "36006": {
77
- "content": "<extra_id_89>",
78
- "lstrip": true,
79
- "normalized": false,
80
- "rstrip": true,
81
- "single_word": false,
82
- "special": true
83
- },
84
- "36007": {
85
- "content": "<extra_id_88>",
86
- "lstrip": true,
87
- "normalized": false,
88
- "rstrip": true,
89
- "single_word": false,
90
- "special": true
91
- },
92
- "36008": {
93
- "content": "<extra_id_87>",
94
- "lstrip": true,
95
- "normalized": false,
96
- "rstrip": true,
97
- "single_word": false,
98
- "special": true
99
- },
100
- "36009": {
101
- "content": "<extra_id_86>",
102
- "lstrip": true,
103
- "normalized": false,
104
- "rstrip": true,
105
- "single_word": false,
106
- "special": true
107
- },
108
- "36010": {
- "content": "<extra_id_85>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36011": {
- "content": "<extra_id_84>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36012": {
- "content": "<extra_id_83>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36013": {
- "content": "<extra_id_82>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36014": {
- "content": "<extra_id_81>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36015": {
- "content": "<extra_id_80>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36016": {
- "content": "<extra_id_79>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36017": {
- "content": "<extra_id_78>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36018": {
- "content": "<extra_id_77>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36019": {
- "content": "<extra_id_76>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36020": {
- "content": "<extra_id_75>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36021": {
- "content": "<extra_id_74>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36022": {
- "content": "<extra_id_73>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36023": {
- "content": "<extra_id_72>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36024": {
- "content": "<extra_id_71>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36025": {
- "content": "<extra_id_70>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36026": {
- "content": "<extra_id_69>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36027": {
- "content": "<extra_id_68>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36028": {
- "content": "<extra_id_67>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36029": {
- "content": "<extra_id_66>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36030": {
- "content": "<extra_id_65>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36031": {
- "content": "<extra_id_64>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36032": {
- "content": "<extra_id_63>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36033": {
- "content": "<extra_id_62>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36034": {
- "content": "<extra_id_61>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36035": {
- "content": "<extra_id_60>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36036": {
- "content": "<extra_id_59>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36037": {
- "content": "<extra_id_58>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36038": {
- "content": "<extra_id_57>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36039": {
- "content": "<extra_id_56>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36040": {
- "content": "<extra_id_55>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36041": {
- "content": "<extra_id_54>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36042": {
- "content": "<extra_id_53>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36043": {
- "content": "<extra_id_52>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36044": {
- "content": "<extra_id_51>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36045": {
- "content": "<extra_id_50>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36046": {
- "content": "<extra_id_49>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36047": {
- "content": "<extra_id_48>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36048": {
- "content": "<extra_id_47>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36049": {
- "content": "<extra_id_46>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36050": {
- "content": "<extra_id_45>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36051": {
- "content": "<extra_id_44>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36052": {
- "content": "<extra_id_43>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36053": {
- "content": "<extra_id_42>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36054": {
- "content": "<extra_id_41>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36055": {
- "content": "<extra_id_40>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36056": {
- "content": "<extra_id_39>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36057": {
- "content": "<extra_id_38>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36058": {
- "content": "<extra_id_37>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36059": {
- "content": "<extra_id_36>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36060": {
- "content": "<extra_id_35>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36061": {
- "content": "<extra_id_34>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36062": {
- "content": "<extra_id_33>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36063": {
- "content": "<extra_id_32>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36064": {
- "content": "<extra_id_31>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36065": {
- "content": "<extra_id_30>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36066": {
- "content": "<extra_id_29>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36067": {
- "content": "<extra_id_28>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36068": {
- "content": "<extra_id_27>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36069": {
- "content": "<extra_id_26>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36070": {
- "content": "<extra_id_25>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36071": {
- "content": "<extra_id_24>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36072": {
- "content": "<extra_id_23>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36073": {
- "content": "<extra_id_22>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36074": {
- "content": "<extra_id_21>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36075": {
- "content": "<extra_id_20>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36076": {
- "content": "<extra_id_19>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36077": {
- "content": "<extra_id_18>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36078": {
- "content": "<extra_id_17>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36079": {
- "content": "<extra_id_16>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36080": {
- "content": "<extra_id_15>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36081": {
- "content": "<extra_id_14>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36082": {
- "content": "<extra_id_13>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36083": {
- "content": "<extra_id_12>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36084": {
- "content": "<extra_id_11>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36085": {
- "content": "<extra_id_10>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36086": {
- "content": "<extra_id_9>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36087": {
- "content": "<extra_id_8>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36088": {
- "content": "<extra_id_7>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36089": {
- "content": "<extra_id_6>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36090": {
- "content": "<extra_id_5>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36091": {
- "content": "<extra_id_4>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36092": {
- "content": "<extra_id_3>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36093": {
- "content": "<extra_id_2>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36094": {
- "content": "<extra_id_1>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36095": {
- "content": "<extra_id_0>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- }
- },
- "additional_special_tokens": [
- "<extra_id_0>",
- "<extra_id_1>",
- "<extra_id_2>",
- "<extra_id_3>",
- "<extra_id_4>",
- "<extra_id_5>",
- "<extra_id_6>",
- "<extra_id_7>",
- "<extra_id_8>",
- "<extra_id_9>",
- "<extra_id_10>",
- "<extra_id_11>",
- "<extra_id_12>",
- "<extra_id_13>",
- "<extra_id_14>",
- "<extra_id_15>",
- "<extra_id_16>",
- "<extra_id_17>",
- "<extra_id_18>",
- "<extra_id_19>",
- "<extra_id_20>",
- "<extra_id_21>",
- "<extra_id_22>",
- "<extra_id_23>",
- "<extra_id_24>",
- "<extra_id_25>",
- "<extra_id_26>",
- "<extra_id_27>",
- "<extra_id_28>",
- "<extra_id_29>",
- "<extra_id_30>",
- "<extra_id_31>",
- "<extra_id_32>",
- "<extra_id_33>",
- "<extra_id_34>",
- "<extra_id_35>",
- "<extra_id_36>",
- "<extra_id_37>",
- "<extra_id_38>",
- "<extra_id_39>",
- "<extra_id_40>",
- "<extra_id_41>",
- "<extra_id_42>",
- "<extra_id_43>",
- "<extra_id_44>",
- "<extra_id_45>",
- "<extra_id_46>",
- "<extra_id_47>",
- "<extra_id_48>",
- "<extra_id_49>",
- "<extra_id_50>",
- "<extra_id_51>",
- "<extra_id_52>",
- "<extra_id_53>",
- "<extra_id_54>",
- "<extra_id_55>",
- "<extra_id_56>",
- "<extra_id_57>",
- "<extra_id_58>",
- "<extra_id_59>",
- "<extra_id_60>",
- "<extra_id_61>",
- "<extra_id_62>",
- "<extra_id_63>",
- "<extra_id_64>",
- "<extra_id_65>",
- "<extra_id_66>",
- "<extra_id_67>",
- "<extra_id_68>",
- "<extra_id_69>",
- "<extra_id_70>",
- "<extra_id_71>",
- "<extra_id_72>",
- "<extra_id_73>",
- "<extra_id_74>",
- "<extra_id_75>",
- "<extra_id_76>",
- "<extra_id_77>",
- "<extra_id_78>",
- "<extra_id_79>",
- "<extra_id_80>",
- "<extra_id_81>",
- "<extra_id_82>",
- "<extra_id_83>",
- "<extra_id_84>",
- "<extra_id_85>",
- "<extra_id_86>",
- "<extra_id_87>",
- "<extra_id_88>",
- "<extra_id_89>",
- "<extra_id_90>",
- "<extra_id_91>",
- "<extra_id_92>",
- "<extra_id_93>",
- "<extra_id_94>",
- "<extra_id_95>"
- ],
- "clean_up_tokenization_spaces": false,
- "eos_token": "</s>",
- "extra_ids": 96,
- "extra_special_tokens": {},
- "legacy": true,
- "model_max_length": 1000000000000000019884624838656,
- "pad_token": "<pad>",
- "sp_model_kwargs": {},
- "tokenizer_class": "T5Tokenizer",
- "unk_token": "<unk>"
- }
HVU_QA/t5-viet-qg-finetuned/checkpoint-72/trainer_state.json DELETED
@@ -1,83 +0,0 @@
- {
- "best_global_step": null,
- "best_metric": null,
- "best_model_checkpoint": null,
- "epoch": 3.0,
- "eval_steps": 500,
- "global_step": 72,
- "is_hyper_param_search": false,
- "is_local_process_zero": true,
- "is_world_process_zero": true,
- "log_history": [
- {
- "epoch": 0.4166666666666667,
- "grad_norm": 8.893394470214844,
- "learning_rate": 0.000175,
- "loss": 4.5129,
- "step": 10
- },
- {
- "epoch": 0.8333333333333334,
- "grad_norm": 7.485191822052002,
- "learning_rate": 0.00014722222222222223,
- "loss": 0.7966,
- "step": 20
- },
- {
- "epoch": 1.25,
- "grad_norm": 11.92349910736084,
- "learning_rate": 0.00011944444444444445,
- "loss": 0.6588,
- "step": 30
- },
- {
- "epoch": 1.6666666666666665,
- "grad_norm": 1.6065994501113892,
- "learning_rate": 9.166666666666667e-05,
- "loss": 0.3657,
- "step": 40
- },
- {
- "epoch": 2.0833333333333335,
- "grad_norm": 8.219528198242188,
- "learning_rate": 6.388888888888888e-05,
- "loss": 0.2853,
- "step": 50
- },
- {
- "epoch": 2.5,
- "grad_norm": 1.018286108970642,
- "learning_rate": 3.611111111111111e-05,
- "loss": 0.1668,
- "step": 60
- },
- {
- "epoch": 2.9166666666666665,
- "grad_norm": 0.8563699126243591,
- "learning_rate": 8.333333333333334e-06,
- "loss": 0.1875,
- "step": 70
- }
- ],
- "logging_steps": 10,
- "max_steps": 72,
- "num_input_tokens_seen": 0,
- "num_train_epochs": 3,
- "save_steps": 500,
- "stateful_callbacks": {
- "TrainerControl": {
- "args": {
- "should_epoch_stop": false,
- "should_evaluate": false,
- "should_log": false,
- "should_save": true,
- "should_training_stop": true
- },
- "attributes": {}
- }
- },
- "total_flos": 43844968120320.0,
- "train_batch_size": 1,
- "trial_name": null,
- "trial_params": null
- }
HVU_QA/t5-viet-qg-finetuned/checkpoint-72/training_args.bin DELETED
Binary file (5.71 kB)
 
HVU_QA/t5-viet-qg-finetuned/config.json DELETED
@@ -1,31 +0,0 @@
- {
- "architectures": [
- "T5ForConditionalGeneration"
- ],
- "classifier_dropout": 0.0,
- "d_ff": 3072,
- "d_kv": 64,
- "d_model": 768,
- "decoder_start_token_id": 0,
- "dense_act_fn": "relu",
- "dropout_rate": 0.1,
- "eos_token_id": 1,
- "feed_forward_proj": "relu",
- "initializer_factor": 1.0,
- "is_encoder_decoder": true,
- "is_gated_act": false,
- "layer_norm_epsilon": 1e-06,
- "model_type": "t5",
- "n_positions": 512,
- "num_decoder_layers": 12,
- "num_heads": 12,
- "num_layers": 12,
- "output_past": true,
- "pad_token_id": 0,
- "relative_attention_max_distance": 128,
- "relative_attention_num_buckets": 32,
- "torch_dtype": "float32",
- "transformers_version": "4.55.2",
- "use_cache": true,
- "vocab_size": 36096
- }
HVU_QA/t5-viet-qg-finetuned/generation_config.json DELETED
@@ -1,7 +0,0 @@
- {
- "_from_model_config": true,
- "decoder_start_token_id": 0,
- "eos_token_id": 1,
- "pad_token_id": 0,
- "transformers_version": "4.55.2"
- }
HVU_QA/t5-viet-qg-finetuned/special_tokens_map.json DELETED
@@ -1,121 +0,0 @@
- {
- "additional_special_tokens": [
- "<extra_id_0>",
- "<extra_id_1>",
- "<extra_id_2>",
- "<extra_id_3>",
- "<extra_id_4>",
- "<extra_id_5>",
- "<extra_id_6>",
- "<extra_id_7>",
- "<extra_id_8>",
- "<extra_id_9>",
- "<extra_id_10>",
- "<extra_id_11>",
- "<extra_id_12>",
- "<extra_id_13>",
- "<extra_id_14>",
- "<extra_id_15>",
- "<extra_id_16>",
- "<extra_id_17>",
- "<extra_id_18>",
- "<extra_id_19>",
- "<extra_id_20>",
- "<extra_id_21>",
- "<extra_id_22>",
- "<extra_id_23>",
- "<extra_id_24>",
- "<extra_id_25>",
- "<extra_id_26>",
- "<extra_id_27>",
- "<extra_id_28>",
- "<extra_id_29>",
- "<extra_id_30>",
- "<extra_id_31>",
- "<extra_id_32>",
- "<extra_id_33>",
- "<extra_id_34>",
- "<extra_id_35>",
- "<extra_id_36>",
- "<extra_id_37>",
- "<extra_id_38>",
- "<extra_id_39>",
- "<extra_id_40>",
- "<extra_id_41>",
- "<extra_id_42>",
- "<extra_id_43>",
- "<extra_id_44>",
- "<extra_id_45>",
- "<extra_id_46>",
- "<extra_id_47>",
- "<extra_id_48>",
- "<extra_id_49>",
- "<extra_id_50>",
- "<extra_id_51>",
- "<extra_id_52>",
- "<extra_id_53>",
- "<extra_id_54>",
- "<extra_id_55>",
- "<extra_id_56>",
- "<extra_id_57>",
- "<extra_id_58>",
- "<extra_id_59>",
- "<extra_id_60>",
- "<extra_id_61>",
- "<extra_id_62>",
- "<extra_id_63>",
- "<extra_id_64>",
- "<extra_id_65>",
- "<extra_id_66>",
- "<extra_id_67>",
- "<extra_id_68>",
- "<extra_id_69>",
- "<extra_id_70>",
- "<extra_id_71>",
- "<extra_id_72>",
- "<extra_id_73>",
- "<extra_id_74>",
- "<extra_id_75>",
- "<extra_id_76>",
- "<extra_id_77>",
- "<extra_id_78>",
- "<extra_id_79>",
- "<extra_id_80>",
- "<extra_id_81>",
- "<extra_id_82>",
- "<extra_id_83>",
- "<extra_id_84>",
- "<extra_id_85>",
- "<extra_id_86>",
- "<extra_id_87>",
- "<extra_id_88>",
- "<extra_id_89>",
- "<extra_id_90>",
- "<extra_id_91>",
- "<extra_id_92>",
- "<extra_id_93>",
- "<extra_id_94>",
- "<extra_id_95>"
- ],
- "eos_token": {
- "content": "</s>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false
- },
- "pad_token": {
- "content": "<pad>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false
- },
- "unk_token": {
- "content": "<unk>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false
- }
- }
HVU_QA/t5-viet-qg-finetuned/spiece.model DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:59986b62f9f0b90edafb9b073ea7b93d21114a5841219a1ea2399ade73f729c6
- size 820370
HVU_QA/t5-viet-qg-finetuned/tokenizer_config.json DELETED
@@ -1,905 +0,0 @@
- {
- "add_prefix_space": true,
- "added_tokens_decoder": {
- "0": {
- "content": "<pad>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "1": {
- "content": "</s>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "2": {
- "content": "<unk>",
- "lstrip": false,
- "normalized": false,
- "rstrip": false,
- "single_word": false,
- "special": true
- },
- "36000": {
- "content": "<extra_id_95>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36001": {
- "content": "<extra_id_94>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36002": {
- "content": "<extra_id_93>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36003": {
- "content": "<extra_id_92>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36004": {
- "content": "<extra_id_91>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36005": {
- "content": "<extra_id_90>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36006": {
- "content": "<extra_id_89>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36007": {
- "content": "<extra_id_88>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36008": {
- "content": "<extra_id_87>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36009": {
- "content": "<extra_id_86>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36010": {
- "content": "<extra_id_85>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36011": {
- "content": "<extra_id_84>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36012": {
- "content": "<extra_id_83>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36013": {
- "content": "<extra_id_82>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36014": {
- "content": "<extra_id_81>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36015": {
- "content": "<extra_id_80>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36016": {
- "content": "<extra_id_79>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36017": {
- "content": "<extra_id_78>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36018": {
- "content": "<extra_id_77>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36019": {
- "content": "<extra_id_76>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36020": {
- "content": "<extra_id_75>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36021": {
- "content": "<extra_id_74>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36022": {
- "content": "<extra_id_73>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36023": {
- "content": "<extra_id_72>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36024": {
- "content": "<extra_id_71>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36025": {
- "content": "<extra_id_70>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36026": {
- "content": "<extra_id_69>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36027": {
- "content": "<extra_id_68>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36028": {
- "content": "<extra_id_67>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36029": {
- "content": "<extra_id_66>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36030": {
- "content": "<extra_id_65>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36031": {
- "content": "<extra_id_64>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36032": {
- "content": "<extra_id_63>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36033": {
- "content": "<extra_id_62>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36034": {
- "content": "<extra_id_61>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36035": {
- "content": "<extra_id_60>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36036": {
- "content": "<extra_id_59>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36037": {
- "content": "<extra_id_58>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36038": {
- "content": "<extra_id_57>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36039": {
- "content": "<extra_id_56>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36040": {
- "content": "<extra_id_55>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36041": {
- "content": "<extra_id_54>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36042": {
- "content": "<extra_id_53>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36043": {
- "content": "<extra_id_52>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36044": {
- "content": "<extra_id_51>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36045": {
- "content": "<extra_id_50>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36046": {
- "content": "<extra_id_49>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36047": {
- "content": "<extra_id_48>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
- },
- "36048": {
- "content": "<extra_id_47>",
- "lstrip": true,
- "normalized": false,
- "rstrip": true,
- "single_word": false,
- "special": true
419
- },
420
- "36049": {
421
- "content": "<extra_id_46>",
422
- "lstrip": true,
423
- "normalized": false,
424
- "rstrip": true,
425
- "single_word": false,
426
- "special": true
427
- },
428
- "36050": {
429
- "content": "<extra_id_45>",
430
- "lstrip": true,
431
- "normalized": false,
432
- "rstrip": true,
433
- "single_word": false,
434
- "special": true
435
- },
436
- "36051": {
437
- "content": "<extra_id_44>",
438
- "lstrip": true,
439
- "normalized": false,
440
- "rstrip": true,
441
- "single_word": false,
442
- "special": true
443
- },
444
- "36052": {
445
- "content": "<extra_id_43>",
446
- "lstrip": true,
447
- "normalized": false,
448
- "rstrip": true,
449
- "single_word": false,
450
- "special": true
451
- },
452
- "36053": {
453
- "content": "<extra_id_42>",
454
- "lstrip": true,
455
- "normalized": false,
456
- "rstrip": true,
457
- "single_word": false,
458
- "special": true
459
- },
460
- "36054": {
461
- "content": "<extra_id_41>",
462
- "lstrip": true,
463
- "normalized": false,
464
- "rstrip": true,
465
- "single_word": false,
466
- "special": true
467
- },
468
- "36055": {
469
- "content": "<extra_id_40>",
470
- "lstrip": true,
471
- "normalized": false,
472
- "rstrip": true,
473
- "single_word": false,
474
- "special": true
475
- },
476
- "36056": {
477
- "content": "<extra_id_39>",
478
- "lstrip": true,
479
- "normalized": false,
480
- "rstrip": true,
481
- "single_word": false,
482
- "special": true
483
- },
484
- "36057": {
485
- "content": "<extra_id_38>",
486
- "lstrip": true,
487
- "normalized": false,
488
- "rstrip": true,
489
- "single_word": false,
490
- "special": true
491
- },
492
- "36058": {
493
- "content": "<extra_id_37>",
494
- "lstrip": true,
495
- "normalized": false,
496
- "rstrip": true,
497
- "single_word": false,
498
- "special": true
499
- },
500
- "36059": {
501
- "content": "<extra_id_36>",
502
- "lstrip": true,
503
- "normalized": false,
504
- "rstrip": true,
505
- "single_word": false,
506
- "special": true
507
- },
508
- "36060": {
509
- "content": "<extra_id_35>",
510
- "lstrip": true,
511
- "normalized": false,
512
- "rstrip": true,
513
- "single_word": false,
514
- "special": true
515
- },
516
- "36061": {
517
- "content": "<extra_id_34>",
518
- "lstrip": true,
519
- "normalized": false,
520
- "rstrip": true,
521
- "single_word": false,
522
- "special": true
523
- },
524
- "36062": {
525
- "content": "<extra_id_33>",
526
- "lstrip": true,
527
- "normalized": false,
528
- "rstrip": true,
529
- "single_word": false,
530
- "special": true
531
- },
532
- "36063": {
533
- "content": "<extra_id_32>",
534
- "lstrip": true,
535
- "normalized": false,
536
- "rstrip": true,
537
- "single_word": false,
538
- "special": true
539
- },
540
- "36064": {
541
- "content": "<extra_id_31>",
542
- "lstrip": true,
543
- "normalized": false,
544
- "rstrip": true,
545
- "single_word": false,
546
- "special": true
547
- },
548
- "36065": {
549
- "content": "<extra_id_30>",
550
- "lstrip": true,
551
- "normalized": false,
552
- "rstrip": true,
553
- "single_word": false,
554
- "special": true
555
- },
556
- "36066": {
557
- "content": "<extra_id_29>",
558
- "lstrip": true,
559
- "normalized": false,
560
- "rstrip": true,
561
- "single_word": false,
562
- "special": true
563
- },
564
- "36067": {
565
- "content": "<extra_id_28>",
566
- "lstrip": true,
567
- "normalized": false,
568
- "rstrip": true,
569
- "single_word": false,
570
- "special": true
571
- },
572
- "36068": {
573
- "content": "<extra_id_27>",
574
- "lstrip": true,
575
- "normalized": false,
576
- "rstrip": true,
577
- "single_word": false,
578
- "special": true
579
- },
580
- "36069": {
581
- "content": "<extra_id_26>",
582
- "lstrip": true,
583
- "normalized": false,
584
- "rstrip": true,
585
- "single_word": false,
586
- "special": true
587
- },
588
- "36070": {
589
- "content": "<extra_id_25>",
590
- "lstrip": true,
591
- "normalized": false,
592
- "rstrip": true,
593
- "single_word": false,
594
- "special": true
595
- },
596
- "36071": {
597
- "content": "<extra_id_24>",
598
- "lstrip": true,
599
- "normalized": false,
600
- "rstrip": true,
601
- "single_word": false,
602
- "special": true
603
- },
604
- "36072": {
605
- "content": "<extra_id_23>",
606
- "lstrip": true,
607
- "normalized": false,
608
- "rstrip": true,
609
- "single_word": false,
610
- "special": true
611
- },
612
- "36073": {
613
- "content": "<extra_id_22>",
614
- "lstrip": true,
615
- "normalized": false,
616
- "rstrip": true,
617
- "single_word": false,
618
- "special": true
619
- },
620
- "36074": {
621
- "content": "<extra_id_21>",
622
- "lstrip": true,
623
- "normalized": false,
624
- "rstrip": true,
625
- "single_word": false,
626
- "special": true
627
- },
628
- "36075": {
629
- "content": "<extra_id_20>",
630
- "lstrip": true,
631
- "normalized": false,
632
- "rstrip": true,
633
- "single_word": false,
634
- "special": true
635
- },
636
- "36076": {
637
- "content": "<extra_id_19>",
638
- "lstrip": true,
639
- "normalized": false,
640
- "rstrip": true,
641
- "single_word": false,
642
- "special": true
643
- },
644
- "36077": {
645
- "content": "<extra_id_18>",
646
- "lstrip": true,
647
- "normalized": false,
648
- "rstrip": true,
649
- "single_word": false,
650
- "special": true
651
- },
652
- "36078": {
653
- "content": "<extra_id_17>",
654
- "lstrip": true,
655
- "normalized": false,
656
- "rstrip": true,
657
- "single_word": false,
658
- "special": true
659
- },
660
- "36079": {
661
- "content": "<extra_id_16>",
662
- "lstrip": true,
663
- "normalized": false,
664
- "rstrip": true,
665
- "single_word": false,
666
- "special": true
667
- },
668
- "36080": {
669
- "content": "<extra_id_15>",
670
- "lstrip": true,
671
- "normalized": false,
672
- "rstrip": true,
673
- "single_word": false,
674
- "special": true
675
- },
676
- "36081": {
677
- "content": "<extra_id_14>",
678
- "lstrip": true,
679
- "normalized": false,
680
- "rstrip": true,
681
- "single_word": false,
682
- "special": true
683
- },
684
- "36082": {
685
- "content": "<extra_id_13>",
686
- "lstrip": true,
687
- "normalized": false,
688
- "rstrip": true,
689
- "single_word": false,
690
- "special": true
691
- },
692
- "36083": {
693
- "content": "<extra_id_12>",
694
- "lstrip": true,
695
- "normalized": false,
696
- "rstrip": true,
697
- "single_word": false,
698
- "special": true
699
- },
700
- "36084": {
701
- "content": "<extra_id_11>",
702
- "lstrip": true,
703
- "normalized": false,
704
- "rstrip": true,
705
- "single_word": false,
706
- "special": true
707
- },
708
- "36085": {
709
- "content": "<extra_id_10>",
710
- "lstrip": true,
711
- "normalized": false,
712
- "rstrip": true,
713
- "single_word": false,
714
- "special": true
715
- },
716
- "36086": {
717
- "content": "<extra_id_9>",
718
- "lstrip": true,
719
- "normalized": false,
720
- "rstrip": true,
721
- "single_word": false,
722
- "special": true
723
- },
724
- "36087": {
725
- "content": "<extra_id_8>",
726
- "lstrip": true,
727
- "normalized": false,
728
- "rstrip": true,
729
- "single_word": false,
730
- "special": true
731
- },
732
- "36088": {
733
- "content": "<extra_id_7>",
734
- "lstrip": true,
735
- "normalized": false,
736
- "rstrip": true,
737
- "single_word": false,
738
- "special": true
739
- },
740
- "36089": {
741
- "content": "<extra_id_6>",
742
- "lstrip": true,
743
- "normalized": false,
744
- "rstrip": true,
745
- "single_word": false,
746
- "special": true
747
- },
748
- "36090": {
749
- "content": "<extra_id_5>",
750
- "lstrip": true,
751
- "normalized": false,
752
- "rstrip": true,
753
- "single_word": false,
754
- "special": true
755
- },
756
- "36091": {
757
- "content": "<extra_id_4>",
758
- "lstrip": true,
759
- "normalized": false,
760
- "rstrip": true,
761
- "single_word": false,
762
- "special": true
763
- },
764
- "36092": {
765
- "content": "<extra_id_3>",
766
- "lstrip": true,
767
- "normalized": false,
768
- "rstrip": true,
769
- "single_word": false,
770
- "special": true
771
- },
772
- "36093": {
773
- "content": "<extra_id_2>",
774
- "lstrip": true,
775
- "normalized": false,
776
- "rstrip": true,
777
- "single_word": false,
778
- "special": true
779
- },
780
- "36094": {
781
- "content": "<extra_id_1>",
782
- "lstrip": true,
783
- "normalized": false,
784
- "rstrip": true,
785
- "single_word": false,
786
- "special": true
787
- },
788
- "36095": {
789
- "content": "<extra_id_0>",
790
- "lstrip": true,
791
- "normalized": false,
792
- "rstrip": true,
793
- "single_word": false,
794
- "special": true
795
- }
796
- },
797
- "additional_special_tokens": [
798
- "<extra_id_0>",
799
- "<extra_id_1>",
800
- "<extra_id_2>",
801
- "<extra_id_3>",
802
- "<extra_id_4>",
803
- "<extra_id_5>",
804
- "<extra_id_6>",
805
- "<extra_id_7>",
806
- "<extra_id_8>",
807
- "<extra_id_9>",
808
- "<extra_id_10>",
809
- "<extra_id_11>",
810
- "<extra_id_12>",
811
- "<extra_id_13>",
812
- "<extra_id_14>",
813
- "<extra_id_15>",
814
- "<extra_id_16>",
815
- "<extra_id_17>",
816
- "<extra_id_18>",
817
- "<extra_id_19>",
818
- "<extra_id_20>",
819
- "<extra_id_21>",
820
- "<extra_id_22>",
821
- "<extra_id_23>",
822
- "<extra_id_24>",
823
- "<extra_id_25>",
824
- "<extra_id_26>",
825
- "<extra_id_27>",
826
- "<extra_id_28>",
827
- "<extra_id_29>",
828
- "<extra_id_30>",
829
- "<extra_id_31>",
830
- "<extra_id_32>",
831
- "<extra_id_33>",
832
- "<extra_id_34>",
833
- "<extra_id_35>",
834
- "<extra_id_36>",
835
- "<extra_id_37>",
836
- "<extra_id_38>",
837
- "<extra_id_39>",
838
- "<extra_id_40>",
839
- "<extra_id_41>",
840
- "<extra_id_42>",
841
- "<extra_id_43>",
842
- "<extra_id_44>",
843
- "<extra_id_45>",
844
- "<extra_id_46>",
845
- "<extra_id_47>",
846
- "<extra_id_48>",
847
- "<extra_id_49>",
848
- "<extra_id_50>",
849
- "<extra_id_51>",
850
- "<extra_id_52>",
851
- "<extra_id_53>",
852
- "<extra_id_54>",
853
- "<extra_id_55>",
854
- "<extra_id_56>",
855
- "<extra_id_57>",
856
- "<extra_id_58>",
857
- "<extra_id_59>",
858
- "<extra_id_60>",
859
- "<extra_id_61>",
860
- "<extra_id_62>",
861
- "<extra_id_63>",
862
- "<extra_id_64>",
863
- "<extra_id_65>",
864
- "<extra_id_66>",
865
- "<extra_id_67>",
866
- "<extra_id_68>",
867
- "<extra_id_69>",
868
- "<extra_id_70>",
869
- "<extra_id_71>",
870
- "<extra_id_72>",
871
- "<extra_id_73>",
872
- "<extra_id_74>",
873
- "<extra_id_75>",
874
- "<extra_id_76>",
875
- "<extra_id_77>",
876
- "<extra_id_78>",
877
- "<extra_id_79>",
878
- "<extra_id_80>",
879
- "<extra_id_81>",
880
- "<extra_id_82>",
881
- "<extra_id_83>",
882
- "<extra_id_84>",
883
- "<extra_id_85>",
884
- "<extra_id_86>",
885
- "<extra_id_87>",
886
- "<extra_id_88>",
887
- "<extra_id_89>",
888
- "<extra_id_90>",
889
- "<extra_id_91>",
890
- "<extra_id_92>",
891
- "<extra_id_93>",
892
- "<extra_id_94>",
893
- "<extra_id_95>"
894
- ],
895
- "clean_up_tokenization_spaces": false,
896
- "eos_token": "</s>",
897
- "extra_ids": 96,
898
- "extra_special_tokens": {},
899
- "legacy": true,
900
- "model_max_length": 1000000000000000019884624838656,
901
- "pad_token": "<pad>",
902
- "sp_model_kwargs": {},
903
- "tokenizer_class": "T5Tokenizer",
904
- "unk_token": "<unk>"
905
- }