---
pretty_name: Plastic Caps Dataset
language: en
license: cc-by-4.0
task_categories:
- image-classification
tags:
- plastic
- caps
- classification
- computer-vision
- multi-view
- controlled-environment
- grouped-splits
- recycling
- sustainability
size_categories:
- 1K<n<10K
---
# Plastic Caps Dataset
## Dataset Summary
This dataset contains images of plastic caps categorized by color. The dataset is divided into training, validation, and test splits using a leakage-safe strategy that keeps all images of the same physical cap in the same split while maintaining balanced color distributions across subsets.
### Supported Tasks
- **Color classification** (target label: `color_category`) — the primary task this dataset was designed for.
## Dataset Structure
### Data Fields
- **image**: Image object containing the image data and associated metadata (loaded by the Hugging Face `datasets` library).
- **cap_id**: A string (UUID) as the unique identifier for each physical cap.
- **cap_type**: Categorical label indicating the physical design of the plastic cap: `disc/press top cap`, `dual flip-top cap`, `flip-top cap`, `hinged cap`, `push-pull/sports cap`, `standard cap`, `tethered yorker cap`, `twist cap` and `yorker spout cap`.
- **cap_rotation**: An integer (`12`, `3`, or `6`) indicating the rotation of the cap to the 12 o'clock, 3 o'clock, or 6 o'clock position, respectively.
- **cap_state**: A string encoding the open/closed state. For single-lid caps, `"0"` (closed) or `"1"` (open); for dual flip-top caps, `"00"`, `"01"`, `"10"`, or `"11"`, indicating the state of each lid (both closed, lid a open, lid b open, both open, respectively).
- **cap_facing_position**: An integer, either `0` or `1`, indicating if the cap is facing downwards or upwards, respectively.
- **camera_angle**: An integer indicating the camera angle at which the image was taken (`90`, `45`, or `0` degrees).
- **color_category**: Categorical label representing grouped color classes used in recycling workflows, following [Tapitas Oportunidades – Colores](https://www.tapitasoportunidades.com/colores/) (one of `red/orange/pink/fuchsia/brown`, `yellow/gold`, `blue`, `green`, `white/transparent`, `silver/gray` and `black`).
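The `cap_state` encoding above can be unpacked with a small helper. This is an illustrative sketch only; `decode_cap_state` is a hypothetical name and is not part of any dataset tooling:

```python
def decode_cap_state(cap_state: str) -> str:
    """Map a `cap_state` string to a human-readable description.

    Single-lid caps use "0"/"1"; dual flip-top caps use two digits,
    one per lid, following the encoding described in the data fields.
    """
    single = {"0": "closed", "1": "open"}
    dual = {
        "00": "both lids closed",
        "01": "lid a open",
        "10": "lid b open",
        "11": "both lids open",
    }
    if cap_state in single:
        return single[cap_state]
    if cap_state in dual:
        return dual[cap_state]
    raise ValueError(f"unknown cap_state: {cap_state!r}")
```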
### Data Instances
#### Images
- **Total images:** 2,662
**Color distribution (images):**
- **black**: `263`
- **blue**: `278`
- **green**: `412`
- **red / orange / pink / fuchsia / brown**: `583`
- **silver / gray**: `94`
- **white / transparent**: `836`
- **yellow / gold**: `196`
#### Caps (Unique Physical Objects)
- **Total unique caps:** 239
**Color distribution (caps):**
- **black**: `25`
- **blue**: `28`
- **green**: `33`
- **red / orange / pink / fuchsia / brown**: `46`
- **silver / gray**: `12`
- **white / transparent**: `77`
- **yellow / gold**: `18`
### Data Splits
The dataset is split into **training**, **validation**, and **test** subsets using a deterministic and stratified splitting strategy designed to prevent data leakage and preserve distributional balance.
- **Grouping key:** All samples are grouped by `cap_id`, ensuring that images of the same physical object do not appear across different splits.
- **Stratification:** The split is stratified by `color_category` to maintain consistent distributions across subsets.
- **Split ratios:**
* **Train:** `80%`
* **Validation:** `10%`
* **Test:** `10%`
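The strategy above can be sketched in plain Python: caps are allocated to splits per color class so the split is both grouped (no `cap_id` crosses splits) and stratified. This is a minimal illustration under assumed names (`grouped_stratified_split`, the fixed seed), not the actual script used to build the dataset:

```python
import random
from collections import defaultdict

def grouped_stratified_split(samples, seed=42, ratios=(0.8, 0.1, 0.1)):
    """Assign each cap_id to one of train/validation/test.

    Grouping: all images of a cap share its split, because assignment
    is done per cap_id, not per image.
    Stratification: caps are allocated within each color class, so each
    split keeps a similar color distribution.
    The fixed seed and sorted iteration make the split deterministic.
    """
    # Collect the unique caps belonging to each color class.
    caps_by_color = defaultdict(set)
    for s in samples:
        caps_by_color[s["color_category"]].add(s["cap_id"])

    rng = random.Random(seed)  # fixed seed -> reproducible shuffle
    assignment = {}
    for color, caps in sorted(caps_by_color.items()):
        caps = sorted(caps)
        rng.shuffle(caps)
        n_train = round(len(caps) * ratios[0])
        n_val = round(len(caps) * ratios[1])
        for cap in caps[:n_train]:
            assignment[cap] = "train"
        for cap in caps[n_train:n_train + n_val]:
            assignment[cap] = "validation"
        for cap in caps[n_train + n_val:]:
            assignment[cap] = "test"
    return assignment
```

Because the split is decided at the cap level, verifying leakage safety reduces to checking that each `cap_id` maps to exactly one split.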
#### Images / Color distribution (%)
| Dataset Split | Number of Images | Black | Blue | Green | Red / orange / pink / fuchsia / brown | Silver / gray | White / transparent | Yellow / gold |
|--------------|-----------------:|------:|------:|------:|------:|------:|------:|------:|
| Train | 2,110 | 11.0% | 10.4% | 14.4% | 24.0% | 2.8% | 30.1% | 7.1% |
| Validation | 254 | 4.7% | 7.1% | 18.1% | 18.1% | 2.4% | 44.9% | 4.7% |
| Test | 298 | 6.0% | 13.4% | 20.8% | 10.1% | 9.4% | 28.9% | 11.4% |
#### Caps / Color distribution (%)
| Dataset Split | Number of Caps | Black | Blue | Green | Red / orange / pink / fuchsia / brown | Silver / gray | White / transparent | Yellow / gold |
|--------------|---------------:|------:|------:|------:|------:|------:|------:|------:|
| Train | 191 | 10.5% | 11.5% | 13.6% | 19.4% | 5.2% | 32.5% | 7.3% |
| Validation | 24 | 8.3% | 12.5% | 16.7% | 16.7% | 4.2% | 33.3% | 8.3% |
| Test | 24 | 12.5% | 12.5% | 12.5% | 20.8% | 4.2% | 29.2% | 8.3% |
## Dataset Creation
This dataset contains images of plastic caps collected in-house by the author. Each cap is labeled with a color category, following [Tapitas Oportunidades – Colores](https://www.tapitasoportunidades.com/colores/), and a cap type, defined as metadata. All images are provided at their original resolution of **224×224** pixels in RGB format.
The cap type determines the image capture strategy, including the number of images per cap, camera angles, rotational variations, facing position, and cap state (open or closed). This ensures that each cap is represented through multiple views appropriate to its physical design, providing diverse visual data for downstream tasks.
These plastic caps are used in non-profit initiatives to produce accessible products, such as retractable walkways and amphibious chairs, as described on the [Donatapa website](https://costaricaturismoaccesible.com/donatapa/productos/). The labels reflect both practical usage categories and the intended applications of the caps.
### Curation Rationale
The plastic caps dataset was designed to support research in computer vision, specifically for classification and analysis of colors and cap types in applications related to social impact and sustainability.
### Source Data
The data was obtained by capturing multi-view images of plastic caps using a fixed camera setup in a controlled environment. No third-party datasets were used.
### Annotations
The dataset does not contain any additional annotations.
## Personal and Sensitive Information
The dataset does not contain any personal or sensitive information.
## Versioning
We use [Semantic Versioning (SemVer)](https://semver.org/) for versioning our dataset, meaning we follow the **MAJOR.MINOR.PATCH** version pattern and the following versioning policy:
**MAJOR version**
Incremented when changes **break backward compatibility**, including:
- Schema changes (renaming or removing fields)
- Data type changes
- Changes in the semantic meaning of existing fields
- Modifications to label definitions or class taxonomies
- Restructuring of dataset splits (train/validation/test)
- Any change that invalidates direct comparison with previous results
**MINOR version**
Incremented for **backward-compatible extensions** such as:
- Addition of new columns or metadata fields
- Addition of new classes without modifying existing ones
- Inclusion of additional samples or images
- Addition of new dataset splits
- Improvements to documentation or dataset metadata
**PATCH version**
Incremented for **non-semantic corrections** like:
- Fixes to metadata errors
- Small-scale correction of mislabeled samples
- Replacement of corrupted files
- Corrections to dataset card
For the versions available, see [the tags](https://huggingface.co/datasets/sebastiangv/plastic-caps/tree/main) in the dataset repository.
## License
This dataset is released under the [Creative Commons Attribution 4.0 International (**CC-BY-4.0**)](https://creativecommons.org/licenses/by/4.0/) license.
## Citation
If you use this dataset in your research, please cite:
```bibtex
@dataset{garcia2026plasticcaps,
  title     = {Plastic Caps Dataset},
  author    = {García, Sebastián},
  year      = {2026},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/datasets/sebastiangv/plastic-caps}
}
```