Overview

DTF = efficiently learnable, non-parametric CRFs for discrete image labelling tasks
• All factors (unary, pairwise, higher-order) are represented by decision trees
• Decision trees are non-parametric
• Efficient training of millions of parameters using pseudo-likelihood

Formally:
Graphical Model: Factor types
Factor Graph Energy

• The energy is linear in w
• Example: a pairwise factor
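The factor-graph energy can be sketched as follows; the concrete symbols (the factor-type set, the per-type weight vectors w_t, and the tree path notation) are assumptions made for illustration, not taken from the slide:

```latex
E(y, x, w) = \sum_{t \in \mathcal{T}} \; \sum_{F \in \mathcal{F}_t} E_t(y_F, x_F, w_t),
\qquad
E_t(y_F, x_F, w_t) = \sum_{q \in \mathrm{Path}_t(x_F)} w_{t,q}(y_F)
```

Each factor energy sums the weight tables stored at the decision-tree nodes visited while routing the factor's observations x_F from root to leaf. Because the weights enter only additively, the total energy is linear in w, which is what makes the joint weight optimization convex.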
Special Cases

• Unary factors only = decision forest, with learned leaf-node distributions
• Zero-depth trees (pairwise factors) = MRF
• Conditional pairwise factors = CRF
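The first special case can be made concrete with a minimal sketch (not the authors' code): with unary factors only, the energy decouples per pixel, so MAP labelling is an independent argmax over each pixel's leaf-node distribution. The stump and the leaf distributions below are illustrative assumptions.

```python
import numpy as np

def stump_leaf(x, threshold=0.5):
    """Depth-1 decision tree (a stump): split a scalar feature at `threshold`."""
    return 0 if x < threshold else 1

# Learned leaf-node label distributions (values assumed for illustration).
leaf_dist = np.array([[0.9, 0.1],   # leaf 0: prefers label 0
                      [0.2, 0.8]])  # leaf 1: prefers label 1

def unary_only_map(features):
    """Per-pixel MAP labelling: route each pixel to a leaf, take the argmax.

    With unary factors only, no neighbour interactions exist, so each
    pixel can be labelled independently -- exactly a decision forest.
    """
    return [int(np.argmax(leaf_dist[stump_leaf(x)])) for x in features]

labels = unary_only_map([0.1, 0.7, 0.4, 0.9])  # -> [0, 1, 0, 1]
```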
Algorithm Overview

Training
1. Define the connective structure (factor types)
2. Train all decision trees (split functions) separately
3. Jointly optimize all weights

Testing (2 options)
• "Unroll" the factor graph: run BP, TRW, QPBO, etc.
• Don't "unroll" the factor graph: run Gibbs sampling or simulated annealing

Training of the weights w
• Maximum pseudo-likelihood training, a convex optimization problem
• Converges in practice after 150-200 L-BFGS iterations
• Efficient even for large graphs (e.g. 12-connected, 1.47M weights, 22 min)
• Parallel at the variable level
• Variable sub-sampling is possible

Code will be made available next month!
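The pseudo-likelihood objective used in step 3 can be sketched on a toy model. The snippet below is an illustrative assumption, not the authors' implementation: a 1-D chain of binary variables with one shared pairwise weight w, where the negative log pseudo-likelihood sums, over variables, the negative log of each variable's conditional probability given its neighbours. A grid search stands in for the L-BFGS solver mentioned above.

```python
import numpy as np

def neg_log_pseudolikelihood(w, y):
    """Negative log pseudo-likelihood of labelling y (array of 0/1) under
    the toy energy E(y) = -w * sum_i [y_i == y_{i+1}] (smoothness prior)."""
    n = len(y)
    nll = 0.0
    for i in range(n):
        def local_energy(s):
            # Energy terms involving variable i when it takes state s.
            e = 0.0
            if i > 0:
                e -= w * (s == y[i - 1])
            if i < n - 1:
                e -= w * (s == y[i + 1])
            return e
        e0, e1 = local_energy(0), local_energy(1)
        # -log P(y_i | neighbours) = E(y_i) + log(exp(-e0) + exp(-e1))
        nll += local_energy(y[i]) + np.logaddexp(-e0, -e1)
    return nll

# Mostly-smooth training labelling: pseudo-likelihood should pick w > 0.
y = np.array([0, 0, 0, 1, 1, 1, 1, 0, 0])
ws = np.linspace(-2.0, 2.0, 81)
best_w = min(ws, key=lambda w: neg_log_pseudolikelihood(w, y))
```

Because each conditional involves only a variable's Markov blanket, the objective decomposes over variables, which is what makes the training parallel at the variable level and makes variable sub-sampling straightforward.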