The maximum likelihood method is the gold standard in molecular phylogenetics and accounts for nearly half of all published phylogenetic trees, but it harbors a strange phylogenetic bias that has so far gone unexplored. If the aligned sequences are equidistant from one another, so that the true tree is a star tree, then the likelihood method is incapable of recovering that tree unless the sequences are either identical or extremely diverged. Here I analytically demonstrate this "starless" bias and identify its source. In contrast, distance-based methods (with the least-squares method for branch-length estimation and either the minimum evolution or the least-squares criterion for choosing the best tree) do not exhibit this bias. The finding sheds light on the star-tree paradox in Bayesian phylogenetic inference.
maximum likelihood, molecular phylogenetics, distance-based phylogenetic method, starless, star-tree paradox
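The behavior of the distance-based approach mentioned above can be illustrated with a minimal least-squares sketch (my own illustration, not the paper's derivation): for four equidistant taxa, fitting branch lengths of the unrooted tree ((1,2),(3,4)) to the pairwise distances by ordinary least squares drives the internal branch to zero, so the fitted tree collapses to the star tree.

```python
import numpy as np

# Illustrative example (hypothetical numbers, not from the paper):
# ordinary least-squares branch-length fit for the unrooted four-taxon
# tree ((1,2),(3,4)). Columns: external branches b1..b4, internal b5.
# Rows: the six pairwise path lengths d12, d13, d14, d23, d24, d34.
A = np.array([
    [1, 1, 0, 0, 0],  # d12 = b1 + b2
    [1, 0, 1, 0, 1],  # d13 = b1 + b3 + b5
    [1, 0, 0, 1, 1],  # d14 = b1 + b4 + b5
    [0, 1, 1, 0, 1],  # d23 = b2 + b3 + b5
    [0, 1, 0, 1, 1],  # d24 = b2 + b4 + b5
    [0, 0, 1, 1, 0],  # d34 = b3 + b4
], dtype=float)

c = 0.3            # common pairwise distance for equidistant sequences
d = np.full(6, c)  # observed distance vector

b, *_ = np.linalg.lstsq(A, d, rcond=None)
print(b)  # external branches -> c/2 each; internal branch -> 0 (star tree)
```

With every pairwise distance equal, the unique least-squares solution sets each external branch to c/2 and the internal branch to exactly zero, which is why the least-squares criterion has no difficulty returning the star tree in this setting.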