Convert dataset sizes from base 2 to base 10 in the dataset card (#2)
by albertvillanova (HF Staff) - opened
README.md CHANGED
@@ -134,9 +134,9 @@ dataset_info:
 - **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
 - **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
 - **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
-- **Size of downloaded dataset files:**
-- **Size of the generated dataset:**
-- **Total amount of disk used:**
+- **Size of downloaded dataset files:** 56.70 MB
+- **Size of the generated dataset:** 49.09 MB
+- **Total amount of disk used:** 105.79 MB
 
 ### Dataset Summary
 
@@ -161,9 +161,9 @@ with neutral label
 
 #### dgem_format
 
-- **Size of downloaded dataset files:**
-- **Size of the generated dataset:** 7.
-- **Total amount of disk used:**
+- **Size of downloaded dataset files:** 14.18 MB
+- **Size of the generated dataset:** 7.83 MB
+- **Total amount of disk used:** 22.01 MB
 
 An example of 'train' looks as follows.
 ```
@@ -172,9 +172,9 @@ An example of 'train' looks as follows.
 
 #### predictor_format
 
-- **Size of downloaded dataset files:**
-- **Size of the generated dataset:**
-- **Total amount of disk used:**
+- **Size of downloaded dataset files:** 14.18 MB
+- **Size of the generated dataset:** 10.19 MB
+- **Total amount of disk used:** 24.37 MB
 
 An example of 'validation' looks as follows.
 ```
@@ -183,9 +183,9 @@ An example of 'validation' looks as follows.
 
 #### snli_format
 
-- **Size of downloaded dataset files:**
-- **Size of the generated dataset:**
-- **Total amount of disk used:**
+- **Size of downloaded dataset files:** 14.18 MB
+- **Size of the generated dataset:** 25.77 MB
+- **Total amount of disk used:** 39.95 MB
 
 An example of 'validation' looks as follows.
 ```
@@ -194,9 +194,9 @@ An example of 'validation' looks as follows.
 
 #### tsv_format
 
-- **Size of downloaded dataset files:**
-- **Size of the generated dataset:** 5.
-- **Total amount of disk used:**
+- **Size of downloaded dataset files:** 14.18 MB
+- **Size of the generated dataset:** 5.30 MB
+- **Total amount of disk used:** 19.46 MB
 
 An example of 'validation' looks as follows.
 ```
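For reference, the base-2 / base-10 relationship behind the converted sizes can be sketched as follows. This is an illustrative sketch, not code from the PR; the function names are hypothetical, and the example value is taken from the new dgem_format figure in the diff above.

```python
MIB = 2**20        # bytes in one mebibyte (base-2 "MiB")
MB = 1_000_000     # bytes in one megabyte (base-10 "MB")

def mib_to_mb(size_mib: float) -> float:
    """Convert a base-2 size in MiB to base-10 megabytes."""
    return size_mib * MIB / MB

def mb_to_mib(size_mb: float) -> float:
    """Convert a base-10 size in MB back to base-2 mebibytes."""
    return size_mb * MB / MIB

# The new dgem_format generated-dataset size (7.83 MB, base 10)
# expressed in the base-2 convention:
print(round(mb_to_mib(7.83), 2))  # 7.47
```

The two units differ by about 4.9% (2^20 / 10^6 ≈ 1.048576), which is why every size in the card changes slightly under the conversion.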