We study how to reduce bias in CLIP-style models by combining:

- **Large-scale WebDataset shards** of synthetic / hybrid image–text data.
- **Eval tools and benchmarks** for analysing bias and fairness in CLIP-like models.
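The WebDataset shards mentioned above are plain `.tar` archives in which consecutive files sharing a basename (e.g. `000001.jpg` + `000001.txt`) form one image–text sample. As a minimal standard-library sketch of that layout (the filenames and caption below are made up for illustration; in practice you would read the shards with the `webdataset` package):

```python
import io
import tarfile
from collections import defaultdict

def read_webdataset_shard(fileobj):
    """Group tar members by basename key: {key: {extension: bytes}}.

    WebDataset shards are ordinary tar archives; files that share a
    basename form one sample (e.g. an image and its caption).
    """
    samples = defaultdict(dict)
    with tarfile.open(fileobj=fileobj, mode="r") as tar:
        for member in tar.getmembers():
            if not member.isfile():
                continue
            key, _, ext = member.name.partition(".")
            samples[key][ext] = tar.extractfile(member).read()
    return dict(samples)

# Build a tiny in-memory shard to demonstrate the pairing convention.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    for name, payload in [("000001.jpg", b"\xff\xd8fake-jpeg-bytes"),
                          ("000001.txt", b"a synthetic caption")]:
        info = tarfile.TarInfo(name=name)
        info.size = len(payload)
        tar.addfile(info, io.BytesIO(payload))
buf.seek(0)

shard = read_webdataset_shard(buf)
print(shard["000001"]["txt"].decode())  # a synthetic caption
```

Because each sample is just adjacent tar entries, shards can be streamed sequentially from disk or object storage without random access, which is what makes the format practical at large scale.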

[](https://github.com/lluisgomez/SynthFairCLIP/tree/main/evals)

If you use our resources, please consider citing the SynthFairCLIP project.

---