Instructions to use cvtechniques/VideoGameHandGestures with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- ultralytics
How to use cvtechniques/VideoGameHandGestures with ultralytics:
```python
from ultralytics import YOLO

model = YOLO.from_pretrained("cvtechniques/VideoGameHandGestures")
source = "http://images.cocodataset.org/val2017/000000039769.jpg"
model.predict(source=source, save=True)
```
- Notebooks
- Google Colab
- Kaggle
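Each detection returned by the model carries a class id and a confidence score, which can be mapped to a game input. A minimal sketch of that mapping, assuming hypothetical gesture labels and action names (the real class names come from the model's training dataset and are not confirmed here):

```python
# Map detected gesture class names to game actions.
# The gesture labels and action bindings below are hypothetical examples;
# the actual class names are defined by the model's training dataset.
GESTURE_TO_ACTION = {
    "open_palm": "jump",
    "fist": "attack",
    "point_left": "move_left",
    "point_right": "move_right",
}

def gestures_to_actions(class_names, conf_scores, min_conf=0.5):
    """Turn per-detection class names and confidences into game actions.

    Detections below min_conf are ignored; unknown gestures map to nothing.
    """
    actions = []
    for name, conf in zip(class_names, conf_scores):
        if conf >= min_conf and name in GESTURE_TO_ACTION:
            actions.append(GESTURE_TO_ACTION[name])
    return actions

# With ultralytics, the inputs would come from a predict() result, e.g.:
#   r = model.predict(source)[0]
#   names = [r.names[int(c)] for c in r.boxes.cls]
#   confs = [float(c) for c in r.boxes.conf]
print(gestures_to_actions(["fist", "open_palm", "wave"], [0.9, 0.4, 0.8]))
# → ['attack']  (open_palm is below min_conf; "wave" is unmapped)
```

Filtering on confidence before acting is the usual guard against spurious inputs when gestures drive live controls.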
Update README.md
README.md

```diff
@@ -4,7 +4,7 @@ This model detects hand gestures for use as input controls for video games. It u
 The goal of the project is to explore whether computer vision–based gesture recognition can provide a low-cost and accessible alternative to traditional game controllers.
 
 ### Training Approach
-The model was trained using the nano version of YOLOv8 through the Ultralytics training framework.
+The model was trained using the nano version of YOLOv8 (YOLOv8n) through the Ultralytics training framework.
 The model was trained from pretrained YOLOv8n weights and fine-tuned on a custom hand gesture dataset.
 
 ### Intended Use Cases
```
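Fine-tuning from pretrained YOLOv8n weights, as the README describes, requires a dataset YAML in the Ultralytics format. A hypothetical sketch for a hand-gesture dataset (the paths and class names are illustrative assumptions, not the actual dataset used for this model):

```yaml
# Hypothetical data.yaml for a custom hand-gesture dataset
# (paths and class names are illustrative only).
path: datasets/hand-gestures   # dataset root directory
train: images/train            # training images, relative to path
val: images/val                # validation images, relative to path
names:
  0: open_palm
  1: fist
  2: point_left
  3: point_right
```

With such a file in place, fine-tuning from the pretrained nano weights could be launched with a command like `yolo detect train data=data.yaml model=yolov8n.pt epochs=100 imgsz=640`, where the epoch count and image size are typical defaults rather than the values used here.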