Overview
DTF = efficiently learnable non-parametric CRFs for discrete image labelling tasks
• All factors (unary, pairwise, higher-order) are represented by decision trees
• Decision trees are non-parametric
• Efficient training of millions of parameters using pseudo-likelihood

Formally
[Figure: graphical model; factor types; factor graph; energy; energy linear in w; example pairwise factor]

Special Cases
• Unary factors only = decision forest, with learned leaf-node distributions
• Zero-depth trees (pairwise factors) = MRF
• Conditional (pairwise factors) = CRF

Algorithm - Overview
Training
1. Define connectivity structure (factor types)
2. Train all decision trees (split functions) separately
3. Jointly optimize all weights
Testing (2 options)
• "Unroll" the factor graph and run BP, TRW, QPBO, etc.
• Don't "unroll" the factor graph and run Gibbs sampling or simulated annealing

Training of weights "w"
• Maximum pseudo-likelihood training, a convex optimization problem
• Converges in practice after 150-200 L-BFGS iterations
• Efficient even for large graphs (e.g. 12-connected, 1.47M weights, 22 min)
• Parallelizable at the variable level
• Variable sub-sampling is possible

Code will be made available next month!
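The weight-training step above can be sketched numerically. This is an illustrative toy, not the authors' code: a tiny binary chain model with energy linear in w (one unary and one pairwise weight table), trained by minimizing the negative pseudo-likelihood. Each site's conditional depends only on the factors touching it, which is what makes the objective cheap and convex; plain gradient descent stands in here for the L-BFGS solver mentioned above, and the chain data and learning rate are assumptions for the demo.

```python
import numpy as np

def local_energies(y, i, w_u, w_p):
    """Energy terms touching site i of the chain, evaluated for both
    candidate labels y_i in {0, 1}."""
    e = np.array([w_u[0], w_u[1]], dtype=float)
    if i > 0:
        e += w_p[y[i - 1], :]          # edge (i-1, i)
    if i < len(y) - 1:
        e += w_p[:, y[i + 1]]          # edge (i, i+1)
    return e

def neg_pseudo_log_likelihood(y, w_u, w_p):
    """-sum_i log p(y_i | y_rest); each conditional needs only the
    factors adjacent to site i, so no global partition function."""
    npl = 0.0
    for i in range(len(y)):
        e = local_energies(y, i, w_u, w_p)
        npl += e[y[i]] + np.log(np.sum(np.exp(-e)))
    return npl

def grad(y, w_u, w_p):
    """Analytic gradient: observed feature counts minus expected counts
    under each site's conditional (valid because E is linear in w)."""
    g_u, g_p = np.zeros(2), np.zeros((2, 2))
    n = len(y)
    for i in range(n):
        e = local_energies(y, i, w_u, w_p)
        p = np.exp(-e) / np.sum(np.exp(-e))
        for k in range(2):
            d = (1.0 if y[i] == k else 0.0) - p[k]
            g_u[k] += d
            if i > 0:
                g_p[y[i - 1], k] += d
            if i < n - 1:
                g_p[k, y[i + 1]] += d
    return g_u, g_p

# Toy observed labelling; gradient descent on the convex objective.
y = np.array([0, 0, 1, 1, 1, 0, 0])
w_u, w_p = np.zeros(2), np.zeros((2, 2))
for _ in range(200):
    g_u, g_p = grad(y, w_u, w_p)
    w_u -= 0.1 * g_u
    w_p -= 0.1 * g_p
```

Because the objective decomposes over variables, the per-site terms can be computed in parallel, and sub-sampling the sites gives a cheaper stochastic estimate of the same gradient.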