---
license: cc-by-nc-4.0
size_categories:
- 1K
---

*Raspberry Example 1 · Raspberry Example 2*

## 🏷️ Annotation Format

The annotations follow the YOLO instance segmentation format. Please refer to [this page](https://docs.ultralytics.com/datasets/segment/) for more info.

## 🧪 How to read and display examples

```python
from datasets import load_dataset
from PIL import Image, ImageDraw
import numpy as np

# --- Configuration ---
DATASET_NAME = "FBK-TeV/RaspGrade"
SAMPLE_INDEX = 0  # Index of the sample to visualize from the 'valid' split
OUTPUT_IMAGE = 'annotated_hub_image_fixed_colors.png'
ALPHA = 128  # Transparency level for masks

# Define a color map for different classes
CLASS_COLORS = {
    0: (255, 0, 0, ALPHA),    # Red
    1: (0, 255, 0, ALPHA),    # Green
    2: (0, 0, 255, ALPHA),    # Blue
    3: (255, 255, 0, ALPHA),  # Yellow
    4: (255, 0, 255, ALPHA),  # Magenta
    # Add more colors for other class IDs if needed
}


def convert_normalized_polygon_to_pixels(polygon_normalized, width, height):
    """Scale normalized [x, y, x, y, ...] coordinates to pixel coordinates."""
    polygon_pixels = (np.array(polygon_normalized).reshape(-1, 2)
                      * np.array([width, height])).astype(int).flatten().tolist()
    return polygon_pixels


if __name__ == "__main__":
    try:
        dataset = load_dataset(DATASET_NAME)
        if 'valid' not in dataset:
            raise ValueError(f"Split 'valid' not found in dataset '{DATASET_NAME}'")

        valid_dataset = dataset['valid']
        if SAMPLE_INDEX >= len(valid_dataset):
            raise ValueError(
                f"Sample index {SAMPLE_INDEX} is out of bounds for the 'valid' split "
                f"(size: {len(valid_dataset)})"
            )

        sample = valid_dataset[SAMPLE_INDEX]
        original_image = sample['image'].convert("RGBA")
        width, height = original_image.size

        # Draw the polygons on a transparent overlay.
        mask = Image.new('RGBA', (width, height), (0, 0, 0, 0))
        mask_draw = ImageDraw.Draw(mask, 'RGBA')

        labels = sample['labels']
        if isinstance(labels, list):
            for annotation in labels:
                if len(annotation) > 4:  # Assuming YOLO format: class ID plus polygon
                    class_id = int(annotation[0])
                    polygon_normalized = np.array(annotation[1:]).astype(float).tolist()
                    polygon_pixels = convert_normalized_polygon_to_pixels(
                        polygon_normalized, width, height)
                    # Default to white if class_id is not in the map
                    color = CLASS_COLORS.get(class_id, (255, 255, 255, ALPHA))
                    mask_draw.polygon(polygon_pixels, fill=color)

        annotated_image = Image.alpha_composite(original_image, mask)
        annotated_image.save(OUTPUT_IMAGE)
        print(f"Annotated image with fixed colors saved as {OUTPUT_IMAGE}")
        annotated_image.show()
    except Exception as e:
        print(f"An error occurred: {e}")
```

## 🙏 Acknowledgement
AGILEHAND logo

This work is supported by the European Union’s Horizon Europe research and innovation programme under grant agreement No 101092043, project AGILEHAND (Smart Grading, Handling and Packaging Solutions for Soft and Deformable Products in Agile and Reconfigurable Lines).

## 🤝 Partners
FBK logo Santorsola logo
## 📖 Citation

```bibtex
@article{mekhalfi2025raspgrade,
  title={The RaspGrade Dataset: Towards Automatic Raspberry Ripeness Grading with Deep Learning},
  author={Mekhalfi, Mohamed Lamine and Chippendale, Paul and Poiesi, Fabio and Bonecher, Samuele and Osler, Gilberto and Zancanella, Nicola},
  journal={arXiv preprint arXiv:2505.08537},
  year={2025}
}
```
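As a side note on the Annotation Format section above: YOLO instance segmentation label files store one object per line as a class ID followed by polygon coordinates normalized to [0, 1]. A minimal sketch of parsing such a line into pixel coordinates (the function name and sample values below are illustrative, not part of the dataset tooling):

```python
# Hypothetical helper: parse one YOLO instance-segmentation label line.
# Each line reads: <class_id> x1 y1 x2 y2 ... (coordinates normalized to [0, 1]).
def parse_yolo_seg_line(line, width, height):
    parts = line.split()
    class_id = int(parts[0])
    coords = [float(v) for v in parts[1:]]
    # Group the flat coordinate list into (x, y) pairs scaled to pixels.
    polygon = [(round(coords[i] * width), round(coords[i + 1] * height))
               for i in range(0, len(coords), 2)]
    return class_id, polygon


# Illustrative label line on a 100x50 image:
cls, poly = parse_yolo_seg_line("2 0.1 0.2 0.5 0.2 0.5 0.8", 100, 50)
print(cls, poly)  # 2 [(10, 10), (50, 10), (50, 40)]
```

The resulting pixel polygon can be passed directly to `ImageDraw.polygon` as in the visualization script above.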