We discuss the deceptively rich parametric testing problem of distinguishing between one-component and two-component mixtures. We start with the simple case of normal location mixtures, tracing the history of work on this problem from the discovery of a slowly diverging log-likelihood ratio statistic in the 1980s, through to its limiting extreme-value behaviour in the 1990s and early 2000s. We then discuss some interesting applications related to signal detection, multiple testing and variable selection in high-dimensional regression and classification problems, including the introduction of the higher criticism procedure. Finally, we discuss recent work extending some limiting distribution and local power results beyond normal location mixtures to mixtures of general one-parameter exponential families, including a special role played by variance-stabilising transformations. We also indicate how certain results may be extended to a more general class of models that includes change-point and hidden Markov models.
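As a concrete illustration of the higher criticism procedure mentioned above, the following is a minimal sketch of one standard form of the Donoho–Jin higher criticism statistic applied to a collection of p-values; the exact normalisation and the truncation fraction `alpha0` vary across papers, and the clipping constant here is an implementation choice, not part of the original definition.

```python
import numpy as np

def higher_criticism(pvals, alpha0=0.5):
    """One standard form of the higher criticism statistic.

    Sorts the p-values and measures the maximal standardised
    discrepancy between the empirical distribution of the p-values
    and the uniform distribution expected under the global null.
    """
    p = np.sort(np.asarray(pvals, dtype=float))
    # Guard against p-values of exactly 0 or 1 (implementation choice).
    p = np.clip(p, 1e-12, 1 - 1e-12)
    n = len(p)
    i = np.arange(1, n + 1)
    # Standardised gap between i/n and the i-th sorted p-value.
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
    # Maximise over the smallest alpha0 fraction of the p-values,
    # as is common in the sparse-signal-detection literature.
    k = max(1, int(alpha0 * n))
    return float(hc[:k].max())
```

Large values of the statistic suggest that a small fraction of the p-values are stochastically smaller than uniform, i.e. that a sparse signal is present.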