Problem Definition

Given prior knowledge from multiple domains, improve topic modeling in a new domain. Knowledge takes the form of s-sets, each containing words sharing the same semantic meaning, e.g., \{Light, Heavy, Weight\}. We propose a novel technique to transfer this knowledge to improve topic models.

Existing Knowledge-Based Models

DF-LDA [Andrzejewski et al., 2009] and seeded models (e.g., [Mukherjee and Liu, 2012]). Two shortcomings: 1) they cannot handle multiple senses of a word, and 2) knowledge can have adverse effects on the resulting topics.

MDK-LDA: Generative Process

For each topic $t \in \{1, \ldots, T\}$: \\
\hspace*{1em} i. Draw a per-topic distribution over s-sets, $\varphi_t \sim \text{Dir}(\beta)$. \\
\hspace*{1em} ii. For each s-set $s \in \{1, \ldots, S\}$: \\
\hspace*{2em} a) Draw a per-topic, per-s-set distribution over words, $\eta_{t,s} \sim \text{Dir}(\gamma)$. \\
For each document $m \in \{1, \ldots, M\}$: \\
\hspace*{1em} i. Draw $\theta_m \sim \text{Dir}(\alpha)$. \\
\hspace*{1em} ii. For each word $w_{m,n}$, where $n \in \{1, \ldots, N_m\}$: \\
\hspace*{2em} a) Draw a topic $z_{m,n} \sim \text{Mult}(\theta_m)$. \\
\hspace*{2em} b) Draw an s-set $s_{m,n} \sim \text{Mult}(\varphi_{z_{m,n}})$. \\
\hspace*{2em} c) Emit $w_{m,n} \sim \text{Mult}(\eta_{z_{m,n}, s_{m,n}})$.

Plate Notation

Collapsed Gibbs Sampling

Blocked Gibbs sampler: sample the topic $z$ and s-set $s$ for word $w$ jointly:
\begin{equation}
\begin{split}
P(z_i = t, s_i = s \mid \mathbf{z}^{-i}, \mathbf{s}^{-i}, \alpha, \beta, \gamma) \propto {}& \frac{n_{m,t}^{-i}+\alpha}{\sum_{t'=1}^{T}\left(n_{m,t'}^{-i}+\alpha\right)} \times \frac{n_{t,s}^{-i}+\beta}{\sum_{s'=1}^{S}\left(n_{t,s'}^{-i}+\beta\right)} \\
& \times \frac{n_{t,s,w_i}^{-i}+\gamma_s}{\sum_{v'=1}^{V}\left(n_{t,s,v'}^{-i}+\gamma_s\right)}
\end{split}
\end{equation}

Generalized Pólya Urn Model

Generalized Pólya urn model [Mahmoud, 2008]: when a ball is drawn, it is put back along with a certain number of balls of similar colors. This promotes an s-set as a whole: if a ball of color $w$ is drawn, we put back $A_{s,w',w}$ balls of each color $w' \in \{1, \ldots, V\}$, where $w$ and $w'$ share s-set $s$.
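The urn dynamics just described can be illustrated with a small simulation: a drawn word is returned with one extra ball of its own color plus $\sigma$ extra balls for every other word in its s-set. This is a minimal sketch for illustration only; the vocabulary, the s-set, and the value of sigma below are invented, not taken from the experiments.

```python
import random
from collections import Counter

def gpu_draws(num_draws, vocab, ssets, sigma=0.2, seed=0):
    """Simulate a generalized Polya urn: drawing word w returns w with one
    extra ball of its own color, plus sigma extra balls for every other
    word sharing an s-set with w, so whole s-sets are promoted together."""
    rng = random.Random(seed)
    weights = Counter({w: 1.0 for w in vocab})  # start with one ball per color
    draws = []
    for _ in range(num_draws):
        words = list(weights)
        w = rng.choices(words, weights=[weights[v] for v in words])[0]
        draws.append(w)
        weights[w] += 1.0                       # the A[s, w, w] = 1 case
        for s in ssets:                         # promote s-set mates of w
            if w in s:
                for w2 in s:
                    if w2 != w:
                        weights[w2] += sigma    # the A[s, w', w] = sigma case
    return draws, weights

# Invented example: one s-set {light, heavy, weight}, one unrelated word.
draws, weights = gpu_draws(
    100, ["light", "heavy", "weight", "price"],
    [{"light", "heavy", "weight"}])
```

After many draws, frequently drawn words drag the rest of their s-set up with them, which is exactly the "promoting the s-set as a whole" effect the model relies on.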
\begin{equation}
A_{s,w',w}=\left\{
\begin{array}{ll}
1 & w = w'\\
\sigma & w \in s,\ w' \in s,\ w \neq w'\\
0 & \text{otherwise}
\end{array}
\right.
\end{equation}

Collapsed Gibbs Sampling with the Generalized Pólya Urn
\begin{equation}
\begin{split}
P(z_i = t, s_i = s \mid \mathbf{z}^{-i}, \mathbf{s}^{-i}, \alpha, \beta, \gamma, A) \propto {}& \frac{n_{m,t}^{-i}+\alpha}{\sum_{t'=1}^{T}\left(n_{m,t'}^{-i}+\alpha\right)} \times \frac{\sum_{w'=1}^{V}\sum_{v'=1}^{V} A_{s,v',w'}\, n_{t,s,v'}^{-i}+\beta}{\sum_{s'=1}^{S}\left(n_{t,s'}^{-i}+\beta\right)} \\
& \times \frac{n_{t,s,w_i}^{-i}+\gamma_s}{\sum_{v'=1}^{V}\left(n_{t,s,v'}^{-i}+\gamma_s\right)}
\end{split}
\end{equation}

Experiments

Datasets: reviews from six domains from Amazon.com.

Baseline models: LDA [Blei et al., 2003], LDA-GPU [Mimno et al., 2011], and DF-LDA [Andrzejewski et al., 2009].

Topic Discovery Results

Evaluation measure: Precision @ n (p@n). Quantitative results are given in Table 1; qualitative results are tabulated as well.

Objective Evaluation

Topic Coherence [Mimno et al., 2011].
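The two evaluation measures used in the experiments (p@n and topic coherence) can be sketched as follows. This is a minimal sketch, not the authors' evaluation code: it assumes p@n is the fraction of a topic's top-n words judged correct, and implements the document co-occurrence coherence of Mimno et al. (2011), $C = \sum_{m=2}^{M}\sum_{l<m} \log \frac{D(v_m, v_l)+1}{D(v_l)}$, where $D(v)$ is the document frequency of $v$ and $D(v, v')$ the co-document frequency.

```python
import math
from collections import defaultdict

def precision_at_n(topic_words, correct_words, n):
    """p@n: fraction of the top-n words of a topic judged correct."""
    return sum(w in correct_words for w in topic_words[:n]) / n

def topic_coherence(top_words, documents):
    """Topic coherence of Mimno et al. (2011) for one topic, with
    top_words ordered by decreasing P(w | t)."""
    df = defaultdict(int)    # document frequency D(v)
    codf = defaultdict(int)  # co-document frequency D(v_m, v_l)
    top = set(top_words)
    for doc in documents:
        present = top & set(doc)
        for w in present:
            df[w] += 1
        for i, w in enumerate(top_words):
            for w2 in top_words[:i]:
                if w in present and w2 in present:
                    codf[(w, w2)] += 1
    score = 0.0
    for i in range(1, len(top_words)):
        for j in range(i):
            score += math.log((codf[(top_words[i], top_words[j])] + 1)
                              / max(df[top_words[j]], 1))
    return score
```

Higher (less negative) coherence indicates that a topic's top words tend to co-occur in documents, which correlates with human judgments of topic quality.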
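Finally, the blocked Gibbs update with GPU-promoted counts described earlier can be sketched end to end. This is a minimal NumPy sketch under stated assumptions, not the authors' implementation: the count-array layout and function signature are invented, a scalar gamma is used for simplicity (the model allows a per-s-set $\gamma_s$), and the counts passed in are assumed to already exclude the current word's assignment (the $^{-i}$ counts).

```python
import numpy as np

def blocked_gibbs_step(w, m, n_mt, n_ts, n_tsv, A, alpha, beta, gamma, rng):
    """One blocked Gibbs update: jointly sample (topic z, s-set s) for
    word index w in document m.
      n_mt  : (M, T) per-document topic counts
      n_ts  : (T, S) per-topic s-set counts
      n_tsv : (T, S, V) per-topic, per-s-set word counts
      A     : (S, V, V) promotion matrix of the generalized Polya urn
    """
    T, S, V = n_tsv.shape
    # Document-topic factor, shape (T,)
    doc = (n_mt[m] + alpha) / (n_mt[m].sum() + T * alpha)
    # Topic/s-set factor with GPU-promoted counts, shape (T, S):
    # promoted[t, s] = sum_{w'} sum_{v'} A[s, v', w'] * n_tsv[t, s, v']
    promoted = np.einsum('svw,tsv->ts', A, n_tsv)
    sset = (promoted + beta) / (n_ts.sum(axis=1, keepdims=True) + S * beta)
    # S-set/word factor, shape (T, S)
    word = (n_tsv[:, :, w] + gamma) / (n_tsv.sum(axis=2) + V * gamma)
    # Joint unnormalized probability over all (t, s) pairs, then sample one.
    p = (doc[:, None] * sset * word).ravel()
    idx = rng.choice(T * S, p=p / p.sum())
    return idx // S, idx % S  # (topic, s-set)
```

In a full sampler, this step runs once per word token per iteration, with the word's counts decremented before the call and re-incremented under the new assignment (with the urn's promotion applied) afterwards.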