2.2. Baum-Welch algorithm

To use this algorithm, one must first instantiate a BaumWelchLearner object.

Unlike in the k-Means case, there is no need to use CentroidCompatible observations.

Once this is done, one can call the iterate(hmm, seqs) method to obtain an HMM that fits the observation sequences seqs better than its hmm argument.

The learn(hmm, seqs) method applies iterate(hmm, seqs) a fixed number of times (see the BaumWelchLearner source code), taking an initial approximate HMM as an argument. This initial guess is very important, since the algorithm only converges to a local optimum of its fitting function.
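To make the improvement guarantee of an iteration concrete, here is a minimal, self-contained sketch of one Baum-Welch re-estimation step for a two-state, two-symbol discrete HMM on a single observation sequence. This is an illustrative re-implementation of the underlying algorithm, not the jahmm code; the class and method names are invented for this example.

```java
// One Baum-Welch re-estimation step for a discrete HMM (illustrative sketch,
// not the jahmm implementation). Parameters: pi (initial state probabilities),
// a (transition matrix), b (emission matrix); obs is a symbol sequence.
public class BaumWelchSketch {

    // Probability of obs under the model, via the forward algorithm.
    public static double likelihood(double[] pi, double[][] a, double[][] b, int[] obs) {
        int n = pi.length, T = obs.length;
        double[] alpha = new double[n];
        for (int i = 0; i < n; i++) alpha[i] = pi[i] * b[i][obs[0]];
        for (int t = 1; t < T; t++) {
            double[] next = new double[n];
            for (int j = 0; j < n; j++) {
                double s = 0;
                for (int i = 0; i < n; i++) s += alpha[i] * a[i][j];
                next[j] = s * b[j][obs[t]];
            }
            alpha = next;
        }
        double p = 0;
        for (double v : alpha) p += v;
        return p;
    }

    // One re-estimation step; updates pi, a and b in place.
    public static void iterate(double[] pi, double[][] a, double[][] b, int[] obs) {
        int n = pi.length, m = b[0].length, T = obs.length;
        double[][] al = new double[T][n], be = new double[T][n];
        // Forward pass.
        for (int i = 0; i < n; i++) al[0][i] = pi[i] * b[i][obs[0]];
        for (int t = 1; t < T; t++)
            for (int j = 0; j < n; j++) {
                double s = 0;
                for (int i = 0; i < n; i++) s += al[t - 1][i] * a[i][j];
                al[t][j] = s * b[j][obs[t]];
            }
        // Backward pass.
        for (int i = 0; i < n; i++) be[T - 1][i] = 1;
        for (int t = T - 2; t >= 0; t--)
            for (int i = 0; i < n; i++) {
                double s = 0;
                for (int j = 0; j < n; j++) s += a[i][j] * b[j][obs[t + 1]] * be[t + 1][j];
                be[t][i] = s;
            }
        double pObs = 0;
        for (int i = 0; i < n; i++) pObs += al[T - 1][i];
        // gamma[t][i]: probability of being in state i at time t.
        double[][] g = new double[T][n];
        for (int t = 0; t < T; t++)
            for (int i = 0; i < n; i++) g[t][i] = al[t][i] * be[t][i] / pObs;
        // Expected transition counts, summed over t.
        double[][] xiSum = new double[n][n];
        for (int t = 0; t < T - 1; t++)
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    xiSum[i][j] += al[t][i] * a[i][j] * b[j][obs[t + 1]] * be[t + 1][j] / pObs;
        // Re-estimate pi, a, b.
        for (int i = 0; i < n; i++) {
            pi[i] = g[0][i];
            double gSum = 0;
            for (int t = 0; t < T - 1; t++) gSum += g[t][i];
            for (int j = 0; j < n; j++) a[i][j] = xiSum[i][j] / gSum;
            double gAll = gSum + g[T - 1][i];
            for (int k = 0; k < m; k++) {
                double num = 0;
                for (int t = 0; t < T; t++) if (obs[t] == k) num += g[t][i];
                b[i][k] = num / gAll;
            }
        }
    }
}
```

Running iterate once can only improve (or preserve) the likelihood of the training sequence, which is exactly the property the iterate(hmm, seqs) method relies on.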

A scaled version of this class, BaumWelchScaledLearner, implements a scaling algorithm so as to avoid underflow when learning from long observation sequences.
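The underflow problem arises because the forward variables are products of many probabilities smaller than one, so they shrink exponentially with the sequence length. The following self-contained sketch (the class and its methods are invented for this illustration, not part of jahmm) shows the naive product underflowing to zero, while a rescaled or log-domain computation stays finite:

```java
// Why scaling is needed: a product of T per-step probabilities underflows
// long before T reaches the length of a realistic observation sequence.
public class UnderflowDemo {

    // Naive product of t copies of p, as an unscaled forward pass would compute.
    public static double naiveProduct(double p, int t) {
        double r = 1.0;
        for (int k = 0; k < t; k++) r *= p;
        return r;
    }

    // Log-domain equivalent: log(p^t) = t * log(p), which remains a finite,
    // perfectly representable double.
    public static double logProduct(double p, int t) {
        return t * Math.log(p);
    }
}
```

For p = 0.1 and t = 5000, naiveProduct returns exactly 0.0 (the true value, 1e-5000, is far below the smallest positive double), whereas logProduct returns a finite negative number. BaumWelchScaledLearner addresses the same problem by renormalizing the intermediate quantities during the computation.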