Expectation–maximization algorithm
An expectation-maximization (EM) algorithm is a method for finding maximum likelihood estimates of parameters in probabilistic models that depend on unobserved latent variables. It alternates between an expectation (E) step, which computes the expected log-likelihood using the current parameter estimates, and a maximization (M) step, which maximizes that expected log-likelihood with respect to the parameters.
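As an illustration of this alternation, the following is a minimal Python sketch of EM for a two-component one-dimensional Gaussian mixture; the data, initialization, and variable names are assumptions chosen for the example, not a prescribed implementation.

import numpy as np

def em_gaussian_mixture(x, n_iter=50):
    """Minimal EM sketch for a two-component 1-D Gaussian mixture.

    x: 1-D array of observations.
    Returns the mixing weight, means, and variances after n_iter iterations.
    """
    # Crude initial guesses (an assumption of this sketch).
    pi = 0.5
    mu = np.array([x.min(), x.max()], dtype=float)
    var = np.array([x.var(), x.var()], dtype=float)

    for _ in range(n_iter):
        # E step: posterior responsibility of component 1 for each point,
        # i.e. the expected value of the latent component indicator.
        p0 = (1 - pi) * np.exp(-(x - mu[0]) ** 2 / (2 * var[0])) / np.sqrt(2 * np.pi * var[0])
        p1 = pi * np.exp(-(x - mu[1]) ** 2 / (2 * var[1])) / np.sqrt(2 * np.pi * var[1])
        r = p1 / (p0 + p1)

        # M step: maximize the expected complete-data log-likelihood,
        # which for a Gaussian mixture yields responsibility-weighted updates.
        pi = r.mean()
        mu[0] = ((1 - r) * x).sum() / (1 - r).sum()
        mu[1] = (r * x).sum() / r.sum()
        var[0] = ((1 - r) * (x - mu[0]) ** 2).sum() / (1 - r).sum()
        var[1] = (r * (x - mu[1]) ** 2).sum() / r.sum()

    return pi, mu, var

# Example usage with synthetic data (for illustration only).
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
print(em_gaussian_mixture(data))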
"Expectation-maximization" (EM) is a description of a class of related algorithms, not a particular algorithm; EM is a recipe or meta-algorithm which is used to devise particular algorithms. The Baum-Welch algorithm is an example of an EM algorithm.
There are other methods for finding maximum likelihood estimates, such as gradient descent or variants of the Gauss-Newton method. EM algorithms are often easier to formulate than these alternatives, which partly accounts for their popularity.