Hidden Markov Models in Python: a tutorial on the hmmlearn library
An HMM is completely determined by \(\boldsymbol{\pi}\), \(\mathbf{A}\) and \(\boldsymbol{\theta}\).
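Concretely, for a model with discrete (categorical) emissions, these three quantities can be written as plain arrays. The two-state, three-symbol numbers below are illustrative only, not taken from any dataset:

```python
import numpy as np

# Illustrative two-state HMM; all numbers are arbitrary, for demonstration.
pi = np.array([0.6, 0.4])            # start probability vector (pi)
A = np.array([[0.7, 0.3],            # transition probability matrix (A):
              [0.4, 0.6]])           # A[i, j] = P(next state j | current state i)
# For categorical emissions, theta is an emission probability matrix:
# theta[i, k] = P(observe symbol k | hidden state i).
theta = np.array([[0.1, 0.4, 0.5],
                  [0.6, 0.3, 0.1]])

# pi, and each row of A and theta, must be valid probability distributions.
assert np.isclose(pi.sum(), 1.0)
assert np.allclose(A.sum(axis=1), 1.0)
assert np.allclose(theta.sum(axis=1), 1.0)
```

In hmmlearn these parameters correspond to model attributes such as `startprob_` and `transmat_`; the emission parameters depend on the chosen model class.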
There are three fundamental problems for HMMs:
1. Given the model parameters and observed data, estimate the optimal sequence of hidden states.
2. Given the model parameters and observed data, calculate the likelihood of the data.
3. Given just the observed data, estimate the model parameters.
The first and the second problem can be solved by the dynamic programming algorithms
known as the Viterbi algorithm and the Forward-Backward algorithm, respectively. The last
one can be solved by an iterative Expectation-Maximization (EM) algorithm, known as the
Baum-Welch algorithm.
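The first two problems can be sketched in a few lines of NumPy for a small discrete-emission HMM. The parameters below are made up for illustration; `forward` computes the likelihood of an observation sequence, and `viterbi` recovers the most likely hidden-state path (Baum-Welch is omitted here for brevity):

```python
import numpy as np

pi = np.array([0.6, 0.4])            # start probabilities
A = np.array([[0.7, 0.3],            # transition matrix
              [0.4, 0.6]])
B = np.array([[0.1, 0.4, 0.5],       # emission probabilities B[state, symbol]
              [0.6, 0.3, 0.1]])

obs = [0, 1, 2]                      # an observed symbol sequence


def forward(pi, A, B, obs):
    """Return P(obs | model) via the forward recursion."""
    alpha = pi * B[:, obs[0]]        # alpha[i] = P(obs so far, state i)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()


def viterbi(pi, A, B, obs):
    """Return the most likely hidden-state sequence for obs (log domain)."""
    delta = np.log(pi) + np.log(B[:, obs[0]])
    back = []                        # backpointers for path recovery
    for o in obs[1:]:
        trans = delta[:, None] + np.log(A)   # score of each transition i -> j
        back.append(trans.argmax(axis=0))    # best predecessor for each state j
        delta = trans.max(axis=0) + np.log(B[:, o])
    states = [int(delta.argmax())]
    for ptr in reversed(back):               # trace the best path backwards
        states.append(int(ptr[states[-1]]))
    return states[::-1]
```

In practice hmmlearn performs these computations for you (for example via a fitted model's `score`, `predict`, and `decode` methods); the sketch above only shows the underlying recursions.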