---
title: Geodect
emoji: 🪨
colorFrom: gray
colorTo: green
sdk: gradio
sdk_version: 6.9.0
app_file: app.py
pinned: false
license: mit
short_description: AI-assisted computer vision for 3D geological modeling.
---
# Geodect: Geological 3D & Mapping
Geodect is an AI-assisted computer vision tool designed to convert field video of geological surfaces or rock samples into clean 3D models and high-resolution 2D orthomosaic maps.
## Key Features
- **2D Orthomosaic Mapping:** Automatically extracts, aligns, and stitches video frames into a single high-resolution, undistorted map.
- **3D Sample Reconstruction:** Transforms orbit videos of rock samples into interactive `.obj` meshes using Structure from Motion (SfM).
- **Intelligent Background Removal:** Segments and removes unwanted objects (such as hands or tools) so processing focuses on the geological surface.
- **CPU Optimized:** Built to run efficiently in a Hugging Face Space without requiring high-end GPU resources.
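Frame quality matters for both modes: blurry frames degrade stitching and SfM alike. A common selection criterion is the variance of the Laplacian, sketched below in pure NumPy as a stand-in for OpenCV's `cv2.Laplacian` (the function names here are illustrative, not taken from this repo):

```python
import numpy as np

def sharpness(gray):
    """Variance of a 3x3 Laplacian response over a 2D grayscale
    array; higher values indicate a sharper frame."""
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return lap.var()

def pick_frames(frames, keep=10):
    """Return the indices of the `keep` sharpest frames,
    preserving their temporal order for the downstream pipeline."""
    ranked = sorted(range(len(frames)),
                    key=lambda i: sharpness(frames[i]),
                    reverse=True)
    return sorted(ranked[:keep])
```

A perfectly flat frame scores zero, while any textured frame scores higher, so out-of-focus frames naturally fall to the bottom of the ranking.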
## Technical Workflow
- **Extraction:** OpenCV extracts optimal frames from the uploaded field video.
- **Processing:**
  - **Orthomosaic Mode:** Detects features via SIFT/ORB and estimates a homography for seamless stitching.
  - **3D Mode:** Estimates camera poses and generates a dense point cloud, which is converted into a surface mesh via Trimesh.
- **Visualization:** Interactive 3D rendering and high-resolution image downloads are provided via the Gradio interface.
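Once a homography has been estimated from SIFT/ORB matches (e.g. with `cv2.findHomography`), the mosaic canvas must be sized to contain both the reference frame and the warped frame. That geometry step can be sketched without OpenCV; the function names below are illustrative, not from this repo:

```python
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 homography to Nx2 points (homogeneous divide)."""
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]

def mosaic_canvas(H, w, h):
    """Bounding box of a (w, h) frame warped by H, unioned with the
    reference frame at the origin.  Returns (width, height, x_off,
    y_off), where the offsets shift both images into positive pixels."""
    corners = np.array([[0, 0], [w, 0], [w, h], [0, h]], float)
    warped = np.vstack([corners, warp_points(H, corners)])
    x0, y0 = np.floor(warped.min(axis=0))
    x1, y1 = np.ceil(warped.max(axis=0))
    return int(x1 - x0), int(y1 - y0), int(-x0), int(-y0)
```

For a pure translation of +50 px in x on a 100x80 frame, the union spans 150x80 with no offset; a translation of -50 px instead yields an x offset of 50 so the warped frame still lands inside the canvas.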
## Architecture
This project is implemented as a lightweight four-file Python application:
- `app.py`: The main Gradio interface and pipeline controller.
- `Orthomosaic.py`: The 2D stitching and blending engine.
- `ThreeDimagery.py`: The 3D reconstruction and SfM module.
- `requirements.txt`: Environment dependencies, including OpenCV, SciPy, and Trimesh.
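As pipeline controller, `app.py` presumably routes each uploaded video to one of the two engines. A hypothetical sketch of that dispatch (the actual entry-point names in `Orthomosaic.py` and `ThreeDimagery.py` are not shown in this README, so they are passed in as callables here):

```python
def run_pipeline(video_path, mode, stitch, reconstruct):
    """Route one uploaded video to the 2D or 3D engine.

    `stitch` and `reconstruct` stand in for the entry points of
    Orthomosaic.py and ThreeDimagery.py (names assumed, not confirmed).
    """
    engines = {"orthomosaic": stitch, "3d": reconstruct}
    if mode not in engines:
        raise ValueError(f"unknown mode: {mode!r}")
    return engines[mode](video_path)
```

Keeping the dispatch free of Gradio imports makes it trivial to unit-test the two engines outside the Space.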
## Credits & Acknowledgments
- **Developer:** Adedoyin Ifeoluwa James
- **X:** https://x.com/IAdedoyin64700
- **Portfolio:** https://adedoyinjames-portfolio.vercel.app
- **Blog:** https://theadedoyinjournal.wordpress.com
- **Technical Resources:** This project was developed with the technical support and computational resources of NORA RESEARCH LAB.