NeckNet

Model Description

NeckNet is an MLP model designed for the OmniNeck. Given the 6D motion of the ball, it predicts both the 6D force and the 3D shape (mesh node positions).
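The motion-to-force/shape mapping can be pictured as a small feed-forward network with two output heads sharing one trunk. The sketch below is illustrative only: the hidden size, activation, node count, and random weights are assumptions, not the released architecture or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(n_in, n_out):
    # Randomly initialized weights stand in for trained parameters.
    return rng.standard_normal((n_in, n_out)) * 0.1, np.zeros(n_out)

N_NODES = 64  # assumed number of mesh nodes; the real model may differ

W1, b1 = linear(6, 128)            # shared trunk: 6D motion -> hidden
Wf, bf = linear(128, 6)            # force head: hidden -> 6D force
Ws, bs = linear(128, N_NODES * 3)  # shape head: hidden -> 3D offset per node

def necknet_sketch(x):
    h = np.tanh(x @ W1 + b1)                        # shared features
    force = h @ Wf + bf                             # (batch, 6)
    shape = (h @ Ws + bs).reshape(-1, N_NODES, 3)   # (batch, nodes, 3)
    return force, shape

x = np.zeros((1, 6), dtype=np.float32)  # 6D motion, batch of 1
force, shape = necknet_sketch(x)
print(force.shape, shape.shape)  # (1, 6) (1, 64, 3)
```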

Try it out on the Spaces demo.

Intended Use

This model is intended for researchers and developers working in robotics and tactile sensing. It can be used to enhance the capabilities of robotic systems by providing accurate predictions of force and shape based on tactile data.

To load the model:

import torch
from transformers import AutoModel

model = AutoModel.from_pretrained("han-xudong/necknet", trust_remote_code=True)
x = torch.zeros((1, 6))  # Example input: batch size of 1, 6D motion
output = model(x)

Or to load the ONNX version:

# Example: run inference with the ONNX model
import onnxruntime as ort
import numpy as np
from huggingface_hub import hf_hub_download

onnx_model_path = hf_hub_download("han-xudong/necknet", filename="model.onnx")
ort_session = ort.InferenceSession(onnx_model_path)

# Example input
x = np.zeros((1, 6), dtype=np.float32)  # Batch size of 1, 6D motion
output = ort_session.run(None, {"motion": x})  # returns a list of output arrays

Training Data

The model was trained on the ProSoRo-100K dataset, which contains paired motion, force, and shape data generated by finite element simulations.

Citation

If you use this model in your research, please cite:

@article{han2025anchoring,
    title={Anchoring Morphological Representations Unlocks Latent Proprioception in Soft Robots},
    author={Han, Xudong and Guo, Ning and Xu, Ronghan and Wan, Fang and Song, Chaoyang},
    journal={Advanced Intelligent Systems},
    volume={7},
    pages={e202500444},
    year={2025}
}