---
license: mit
task_categories:
- object-detection
- image-segmentation
- image-classification
language:
- en
tags:
- agriculture
- computer-vision
- fruit-detection
- instance-segmentation
- precision-agriculture
- ripeness-assessment
- agricultural-robotics
size_categories:
- 1K<n<10K
---

# SmartHarvest: Multi-Species Fruit Ripeness Dataset

## Annotation Quality

- **Annotation agreement**: > 0.85
- **Expert review**: 10% agricultural specialist validation
- **Polygon precision**: Minimum 8 vertices, detailed boundary delineation

### Species-Specific Criteria

#### Color-Based Ripeness (Apples, Tomatoes, Cherries, Peppers)

- **Ripe**: >75% characteristic color coverage
- **Unripe**: <25% color development
- **Spoiled**: Brown/black discoloration, visible mold

#### Size-Based Ripeness (Cucumbers, Pears)

- **Ripe**: 80-100% of variety-specific size range
- **Unripe**: <80% expected size
- **Spoiled**: Yellowing, soft spots, wrinkled skin

#### Texture-Based Ripeness (Strawberries, Raspberries)

- **Ripe**: Uniform color, firm but yielding texture
- **Unripe**: White/green areas, hard texture
- **Spoiled**: Soft spots, mold, collapsed structure

## Usage Examples

### Loading the Dataset

```python
from datasets import load_dataset

# Load the complete dataset
dataset = load_dataset("TheCoffeeAddict/SmartHarvest")

# Load a specific split
train_data = load_dataset("TheCoffeeAddict/SmartHarvest", split="train")

# Access a sample
sample = dataset['train'][0]
image = sample['image']
annotations = sample['annotations']
```

### PyTorch Integration

```python
import torch
from torch.utils.data import Dataset
from torchvision import transforms
from datasets import load_dataset

class SmartHarvestDataset(Dataset):
    def __init__(self, split="train", transform=None):
        self.dataset = load_dataset("TheCoffeeAddict/SmartHarvest", split=split)
        self.transform = transform

    def __len__(self):
        return len(self.dataset)

    def __getitem__(self, idx):
        sample = self.dataset[idx]
        image = sample['image']
        target = {
            'boxes': torch.tensor(sample['bboxes']),
            'labels': torch.tensor(sample['labels']),
            'masks': torch.tensor(sample['masks']),
        }
        if self.transform:
            image = self.transform(image)
        return image, target

# Usage
transform = transforms.Compose([
    transforms.Resize((800, 800)),
    transforms.ToTensor(),
])
dataset = SmartHarvestDataset(split="train", transform=transform)
```

### Data Visualization

```python
import matplotlib.pyplot as plt

def visualize_sample(sample):
    image = sample['image']
    annotations = sample['annotations']

    fig, ax = plt.subplots(1, 1, figsize=(12, 8))
    ax.imshow(image)

    for ann in annotations:
        # Draw the bounding box
        x, y, w, h = ann['bbox']
        rect = plt.Rectangle((x, y), w, h, fill=False, color='red', linewidth=2)
        ax.add_patch(rect)

        # Add the species/ripeness label
        species = ann['species']
        ripeness = ann['ripeness']
        ax.text(x, y - 5, f"{species}-{ripeness}", color='red', fontsize=10)

    ax.set_title("SmartHarvest Sample Annotation")
    plt.show()

# Visualize the first training sample
sample = dataset['train'][0]
visualize_sample(sample)
```

## Baseline Results

### Model Performance (Apple-Cherry Subset)

Mask R-CNN with a ResNet-50 backbone, trained on the apple-cherry subset:

| Metric | Value | Description |
|--------|-------|-------------|
| **AP@0.5** | **22.49%** | Average precision at IoU=0.5 |
| **AP@0.75** | **7.98%** | Average precision at IoU=0.75 |
| **COCO mAP** | **60.63%** | Mean AP across IoU 0.5-0.95 |

### Per-Class Performance

| Class | AP@0.5 | Notes |
|-------|--------|-------|
| Apple-Ripe | 10.45% | Challenging due to color variation |
| Apple-Unripe | 25.00% | Better-defined characteristics |
| Apple-Spoiled | **32.60%** | Distinctive visual features |
| Cherry-Ripe | 18.20% | Small object size is challenging |
| Cherry-Unripe | 17.10% | Consistent with the apple pattern |
| Cherry-Spoiled | **31.56%** | Best-performing cherry class |

*Code available at: https://github.com/Maksim3l/SmartHarvest*

## Considerations for Use

### Strengths

- **Real-world applicability**: Natural garden conditions with authentic challenges
- **Multi-species coverage**: Broad agricultural applicability
- **Expert validation**: Agricultural specialist involvement in annotation
- **Detailed annotations**: Polygon-level segmentation for precise localization
- **Ripeness granularity**: Practical quality assessment categories

### Limitations

- **Geographic bias**: Limited to specific growing regions
- **Seasonal bias**: Collection timing affects the ripeness distribution
- **Equipment bias**: Single camera system characteristics
- **Scale limitations**: Limited images per species for production deployment
- **Class imbalance**: Varying representation across ripeness states

### Recommended Applications

- **Research benchmarking**: Computer vision method evaluation
- **Algorithm development**: Detection and segmentation model training
- **Educational use**: Agricultural computer vision teaching
- **Prototype development**: Proof-of-concept agricultural systems

### Usage Considerations

- **Data augmentation**: Recommended for training robustness
- **Cross-validation**: Stratified splits to maintain species balance
- **Evaluation metrics**: Use agriculturally relevant metrics beyond standard CV measures
- **Deployment testing**: Validate on the target agricultural environment

## Ethical Considerations

### Data Privacy

- **Image sources**: Publicly available images or consent-obtained private collections
- **Location privacy**: No GPS coordinates or specific farm identifiers included
- **Farmer consent**: Proper permissions obtained for orchard data collection

### Bias and Fairness

- **Geographic diversity**: Active efforts to include multiple growing regions
- **Seasonal representation**: Multiple collection periods to reduce temporal bias
- **Equipment standardization**: Documentation of capture conditions for bias awareness

### Environmental Impact

- **Sustainable agriculture**: Supporting precision farming for reduced resource use
- **Technology access**: Open-source approach for global accessibility
- **Local adaptation**: Encouragement of regional dataset development

## Citation

If you use this dataset in your research, please cite:

```bibtex
@inproceedings{loknar2025comprehensive,
  title={Comprehensive Multi-Species Fruit Ripeness Dataset Construction: From Eight-Species Collection to Focused Apple-Cherry Detection},
  author={Loknar, Maksim and Mlakar, Uroš},
  booktitle={Student Computing Research Symposium},
  year={2025},
  organization={University of Maribor},
  url={https://huggingface.co/datasets/TheCoffeeAddict/SmartHarvest}
}
```

## Dataset Card Contact

**Authors**: Maksim Loknar, Uroš Mlakar

**Institution**: Faculty of Electrical Engineering and Computer Science, University of Maribor, Slovenia

**Email**: maksim.loknar@student.um.si, uros.mlakar@um.si

**Project Page**: https://github.com/Maksim3l/SmartHarvest

For questions about dataset usage, additional species requests, or collaboration opportunities, please open an issue in the GitHub repository or contact the authors directly.
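As an illustration, the color-based ripeness thresholds described above (ripe: >75% characteristic color coverage; unripe: <25%) can be expressed as a simple programmatic check. The sketch below is illustrative only: the `is_red` channel test, the `intermediate` fallback for coverage between the two thresholds, and the synthetic patch are assumptions for demonstration, not the annotation team's actual tooling (spoilage, which depends on discoloration and mold, is not modeled here).

```python
import numpy as np

def color_coverage(rgb, color_test):
    """Fraction of pixels passing the characteristic-color test."""
    return float(color_test(rgb).mean())

def is_red(rgb):
    # Illustrative "characteristic red" test: red channel clearly dominant.
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 120) & (r > g + 40) & (r > b + 40)

def ripeness_from_coverage(coverage):
    # Thresholds taken from the color-based criteria in this card.
    if coverage > 0.75:
        return "ripe"
    if coverage < 0.25:
        return "unripe"
    return "intermediate"  # assumption: criteria leave this band unspecified

# Synthetic 10x10 patch: 80 red pixels over a green background.
patch = np.zeros((10, 10, 3), dtype=np.uint8)
patch[..., 1] = 160              # green background
patch[:8, :, :] = [200, 30, 30]  # red region (80 of 100 pixels)

cov = color_coverage(patch, is_red)
print(round(cov, 2), ripeness_from_coverage(cov))  # → 0.8 ripe
```

Real pipelines would typically measure coverage inside each annotated polygon mask rather than over the whole image, and use a perceptual color space (e.g. HSV) instead of raw RGB channel comparisons.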