parquet-converter committed on
Commit 9b58f7b · 1 Parent(s): 991aac7

Update parquet files

README.md DELETED
@@ -1,20 +0,0 @@
- ---
- dataset_info:
-   features:
-   - name: bert_token
-     sequence: int64
-   - name: gpt2_token
-     sequence: int64
-   splits:
-   - name: train
-     num_bytes: 173553456.7202345
-     num_examples: 551455
-   - name: test
-     num_bytes: 261864.0
-     num_examples: 1000
-   download_size: 42652803
-   dataset_size: 173815320.7202345
- ---
- # Dataset Card for "amazon_tokenized"
-
- [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
 
data/test-00000-of-00001-14829a0c93d549bf.parquet → guangyil--amazon_tokenized/parquet-test.parquet RENAMED
File without changes
data/train-00000-of-00001-47e8465fdbcaa2c9.parquet → guangyil--amazon_tokenized/parquet-train.parquet RENAMED
File without changes