Update README.md
README.md (CHANGED)
@@ -151,7 +151,7 @@ The dataset does not contain any sensitive identifying information (i.e., does n
# Considerations of Using the Data
## Social Impact of Dataset
- This dataset may be useful for researchers in developing and benchmarking forensics methods. Such methods may aid users in better understanding a given image. However, we believe the classifiers, at least the ones we have trained or benchmarked, still show error rates that are far too high for direct use in the wild.
+ This dataset may be useful for researchers in developing and benchmarking forensics methods. Such methods may aid users in better understanding a given image. However, we believe the classifiers, at least the ones we have trained or benchmarked, still show error rates that are far too high for direct use in the wild, and they can lead to unwanted consequences (e.g., falsely accusing an author of creating fake images or allowing generated content to be certified as real).
## Discussion of Biases
The dataset has been primarily sampled from LAION captions. This may introduce biases that could be present in web-scale data (e.g., favoring photos of humans over other categories of photos). In addition, the vast majority of the generators we collect are derivatives of Stable Diffusion, which may introduce a bias toward detecting certain types of generators.