We consider a mixture model with latent Bayesian network (MLBN) for a set of random vectors. Each random vector is associated with a latent state, given which the random vector is conditionally independent of the other variables. The joint distribution of the states is governed by a Bayesian network. Although specific types of MLBN have been used in diverse areas such as biomedical research and image analysis, the exact expectation–maximization (EM) algorithm for estimating these models can require visiting all combinations of states, yielding complexity exponential in the network size. A prominent exception is the Baum–Welch algorithm for the hidden Markov model, where the underlying graph topology is a chain. We develop a new Baum–Welch algorithm on directed acyclic graphs (BW-DAG) for the general MLBN and prove that it is an exact EM algorithm. BW-DAG provides insight into the achievable complexity of EM. For a tree graph, the complexity of BW-DAG is much lower than that of brute-force EM.
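To make the complexity claim concrete, the following minimal sketch (not the paper's code; the function name and counts are illustrative assumptions) shows why brute-force EM is exponential: its E-step sums over every joint configuration of the latent states, so a network of n nodes with k states each contributes k**n terms, whereas chain (Baum–Welch) or tree message passing scales polynomially in n.

```python
from itertools import product

def num_joint_configs(n_nodes: int, n_states: int) -> int:
    """Count the joint latent-state configurations that a brute-force
    E-step would have to visit: one term per element of the product
    space {0, ..., n_states - 1}^n_nodes, i.e. n_states ** n_nodes."""
    count = 0
    for _ in product(range(n_states), repeat=n_nodes):
        count += 1
    return count

# Even a small network of 10 binary-state nodes already yields
# 2**10 = 1024 configurations, and the count doubles with each node.
print(num_joint_configs(10, 2))  # 1024
```

The enumeration itself is what BW-DAG avoids: rather than iterating the product space, it propagates local messages along the DAG, which for a tree reduces the per-iteration cost from exponential to polynomial in the number of nodes.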