LibreDAMOYOLOns

DAMO-YOLO-Ns (TinyNAS_mob backbone with depthwise convolutions + GiraffeNeckV2 with depthwise convolutions + ZeroHead with GFL/DFL, reg_max=7, last_kernel_size=1). 1.4M parameters, 416x416 input, repackaged for LibreYOLO.

COCO val2017 mAP: 0.323 (at 416x416 input, per the upstream README).

Source

Derived from tinyvision/DAMO-YOLO at commit 319572e.

The original damoyolo_nano_small.pth checkpoint was hosted on Alibaba's Aliyun bucket, which is now offline. The weights were recovered from the Internet Archive's Wayback Machine, snapshot 20250307203141. Upstream only ever published the release_model/ckpt/before_distill/ flavor of the Nano series; no post-distill Nano weights exist.

Copyright (c) 2021-2022 Alibaba Group Holding Limited. Licensed under the Apache License, Version 2.0.

Modifications

File rename only. damoyolo_nano_small.pth is bit-identical to LibreDAMOYOLOns.pt. LibreYOLO's nn.Module class hierarchy mirrors upstream's exactly, so the checkpoint loads with model.load_state_dict(...) in strict mode (no missing or unexpected keys).
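A minimal sketch of loading the repackaged checkpoint in strict mode. The helper name and the nested "model" key handling are illustrative assumptions, not the actual LibreYOLO API; substitute the real model class from libreyolo/models/damoyolo/.

```python
import torch
import torch.nn as nn


def load_checkpoint(model: nn.Module, path: str) -> nn.Module:
    """Load a checkpoint with strict key matching.

    Because the ported class hierarchy mirrors upstream's 1:1,
    strict=True should succeed without key remapping.
    """
    state_dict = torch.load(path, map_location="cpu")
    # Some upstream checkpoints nest the weights under a "model" key
    # (an assumption here; inspect the file to confirm).
    if isinstance(state_dict, dict) and isinstance(state_dict.get("model"), dict):
        state_dict = state_dict["model"]
    model.load_state_dict(state_dict, strict=True)
    model.eval()
    return model
```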

End-to-end bit-exact parity with upstream's pipeline was verified at the backbone, neck, and head output level (0.0 absolute difference on identical input).
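The parity check above amounts to comparing intermediate outputs from the upstream and ported models on the same input and confirming the maximum absolute difference is exactly 0.0. A minimal, framework-agnostic sketch of that comparison (the function name is ours, not from the LibreYOLO codebase):

```python
import numpy as np


def max_abs_diff(a: np.ndarray, b: np.ndarray) -> float:
    """Maximum elementwise absolute difference between two outputs.

    Bit-exact parity means this returns exactly 0.0 for every compared
    tensor (backbone, neck, and head outputs).
    """
    assert a.shape == b.shape, "outputs must have identical shapes"
    return float(np.max(np.abs(a - b)))
```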

See libreyolo/models/damoyolo/ in the LibreYOLO source repository for the architecture port.

License

Apache License 2.0. See the LICENSE and NOTICE files in this repository.
