DeepTFUS: variant A (soft-argmax mild fine-tune)

A reproduction attempt of DeepTFUS, proposed by Srivastav et al. (arXiv:2505.12998).

A fine-tune of masonwang025/deeptfus-base that adds a soft-argmax focal-position L1 term to the loss at a gentle weight, testing whether a position-aware auxiliary signal closes the focal_position_error_mm gap left open by the base reproduction.

⭐ Conservative recipe: the only fine-tune variant where max_pressure_error improves over the baseline, though focal_position_error_mm plateaus at a modest 10% median gain.

Modification (vs base)

Single new loss term added on top of the paper recipe:

L_focal = || soft_argmax(P̂_norm, τ=0.05) − argmax(P_gt_norm) ||_1
total   = L_paper + 1e-5 · L_focal

The weight ramps linearly from 0 to 1e-5 over the first 3 epochs. The paper's gradient_L1 = 0.1 anchor is kept on. The fine-tune ran 12 epochs from the base checkpoint at lr = 3e-5 (1/30 of the base run's peak); the shipped checkpoint is ckpt_epoch_007.pt (best val_focal_mm within the plateau).
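For concreteness, a minimal sketch of the auxiliary term. The function names, the `(B, D, H, W)` volume layout, and the voxel-index parameterization are assumptions, not the repo's actual implementation:

```python
import torch

def soft_argmax_3d(p, tau=0.05):
    """Differentiable argmax over a 3D pressure volume.

    p: tensor of shape (B, D, H, W); tau is the softmax temperature
    (lower = sharper, i.e. closer to a hard argmax).
    Returns the expected peak position as voxel indices, shape (B, 3).
    """
    b, d, h, w = p.shape
    probs = torch.softmax(p.reshape(b, -1) / tau, dim=-1)  # (B, D*H*W)
    # Voxel-index grid for each axis, flattened to match probs.
    zz, yy, xx = torch.meshgrid(
        torch.arange(d, dtype=p.dtype),
        torch.arange(h, dtype=p.dtype),
        torch.arange(w, dtype=p.dtype),
        indexing="ij",
    )
    coords = torch.stack([zz, yy, xx], dim=-1).reshape(-1, 3)  # (D*H*W, 3)
    return probs @ coords  # probability-weighted mean position

def focal_l1(pred, gt, tau=0.05):
    """L1 distance between soft argmax of prediction and hard argmax of GT."""
    soft = soft_argmax_3d(pred, tau)
    flat = gt.reshape(gt.shape[0], -1).argmax(dim=-1)
    d, h, w = gt.shape[1:]
    hard = torch.stack(
        [flat // (h * w), (flat // w) % h, flat % w], dim=-1
    ).to(pred.dtype)
    return (soft - hard).abs().sum(dim=-1).mean()
```

Under the recipe above, the per-epoch weight during warmup would be `1e-5 * min(1.0, epoch / 3)`, and the total loss `L_paper + weight * focal_l1(pred, gt)`.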

Test results (n = 597)

| metric | paper | base | A (this model) | Δ vs base |
|---|---|---|---|---|
| relative_l2 median | 0.394 | 0.369 | 0.372 | +0.003 (within budget) |
| relative_l2 mean | 0.414 | 0.384 | 0.389 | +0.005 |
| focal_position_error_mm median | 2.45 | 5.15 | 4.64 | −0.51 mm (−10%) |
| focal_position_error_mm mean | 2.89 | 6.49 | 5.60 | −0.89 mm |
| max_pressure_error median | 0.166 | 0.217 | 0.200 | −0.017 |
| max_pressure_error mean | 0.199 | 0.225 | 0.204 | −0.021 |
| focal_pressure_error median | — | 0.528 | 0.487 | −0.041 |
| focal_iou_fwhm median | — | 0.143 | 0.148 | +0.004 |
| inference_latency_s median | — | 0.233 | 0.232 | unchanged |
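For reference, a hedged sketch of how a metric like focal_position_error_mm can be computed: Euclidean distance between the peak-pressure voxels of prediction and ground truth, scaled by voxel spacing. The isotropic `voxel_mm` spacing and the function name are illustrative assumptions; the paper's evaluation code may differ.

```python
import torch

def focal_position_error_mm(pred, gt, voxel_mm=1.0):
    """Distance (mm) between predicted and ground-truth peak-pressure voxels.

    pred, gt: tensors of shape (B, D, H, W).
    voxel_mm: assumed isotropic voxel spacing in millimetres.
    """
    def hard_argmax(v):
        flat = v.reshape(v.shape[0], -1).argmax(dim=-1)
        d, h, w = v.shape[1:]
        return torch.stack(
            [flat // (h * w), (flat // w) % h, flat % w], dim=-1
        ).float()
    # Per-sample Euclidean distance in voxels, converted to mm.
    return (hard_argmax(pred) - hard_argmax(gt)).norm(dim=-1) * voxel_mm
```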

Other variants and discussion

See the Collection for the other 5 variants, and the project page for the full reproduction story, interactive viewer, and discussion of trade-offs.

Usage

from huggingface_hub import hf_hub_download
import torch

ckpt = torch.load(
    hf_hub_download("masonwang025/deeptfus-ft-a-softargmax-mild", "ckpt_best.pt"),
    map_location="cpu", weights_only=False,
)

Model code: github.com/masonwang025/deeptfus.

Citation & License

Paper: Srivastav et al., arXiv:2505.12998, 2025.

License: CC-BY-NC-ND-4.0, matching the TFUScapes dataset.

