Python implementation of classic probabilistic graphical models
HMM (Hidden Markov Model)
I implement a tensor version of HMM (using numpy) for the three classic HMM problems. Everywhere that is not inherently sequential/recursive, I bypass the for-loop with tensor operations, so this numpy implementation can be easily adapted to tensorflow or pytorch.
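For example (a hypothetical illustration, not the notebook's code), one step of the forward recursion over N hidden states naively needs a double loop over states, but collapses into a single matrix-vector product:

```python
import numpy as np

def forward_step_loop(alpha, A, B, o_t):
    """alpha_next[j] = sum_i alpha[i] * A[i, j] * B[j, o_t], with explicit loops."""
    N = len(alpha)
    alpha_next = np.zeros(N)
    for j in range(N):
        for i in range(N):
            alpha_next[j] += alpha[i] * A[i, j] * B[j, o_t]
    return alpha_next

def forward_step_vectorized(alpha, A, B, o_t):
    """The same update as one tensor operation, easy to port to tensorflow/pytorch."""
    return (alpha @ A) * B[:, o_t]
```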
- The Evaluation problem (forward/backward messages): given λ = (π, A, B) and an observation sequence O = (O_1, O_2, …, O_T), estimate P(O|λ). A sketch follows this list.
- The Learning problem (Baum-Welch algorithm, a.k.a. the EM algorithm): given O = (O_1, O_2, …, O_T), estimate the best-fit parameters λ = (π, A, B). Sketched below.
- The Decoding problem (Viterbi algorithm): given λ = (π, A, B) and an observation sequence O = (O_1, O_2, …, O_T), estimate the hidden states. Sketched below.
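The sketches below are minimal numpy illustrations of the three problems; the function and variable names (`forward`, `baum_welch_step`, `viterbi`, `pi`, `A`, `B`, `obs`) are my own shorthand, not necessarily those used in HMM.ipynb. First, the evaluation problem: run the forward recursion and sum the final message.

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward messages alpha[t, i] = P(O_1..O_t, q_t = i | lambda).

    pi  : (N,)   initial state distribution
    A   : (N, N) transition matrix, A[i, j] = P(q_{t+1}=j | q_t=i)
    B   : (N, M) emission matrix,   B[i, k] = P(O_t=k | q_t=i)
    obs : (T,)   integer-encoded observation sequence
    """
    obs = np.asarray(obs)
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):   # the time recursion is inherently sequential,
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]  # but each step is one tensor op
    return alpha

def likelihood(pi, A, B, obs):
    """P(O | lambda) is the sum of the final forward message."""
    return forward(pi, A, B, obs)[-1].sum()
```

The backward pass is symmetric (messages propagated from T down to 1) and is reused by Baum-Welch in the next sketch.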
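Next, a sketch of one Baum-Welch (EM) update for a single discrete observation sequence, with broadcasting in place of the loops over state pairs:

```python
import numpy as np

def baum_welch_step(pi, A, B, obs):
    """One EM re-estimation of (pi, A, B) from a single observation sequence."""
    obs = np.asarray(obs)
    T, N = len(obs), len(pi)
    # Forward messages alpha[t, i] = P(O_1..O_t, q_t=i | lambda)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    # Backward messages beta[t, i] = P(O_{t+1}..O_T | q_t=i, lambda)
    beta = np.ones((T, N))
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    # E-step: state posteriors gamma[t, i] and transition posteriors xi[t, i, j]
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    xi = (alpha[:-1, :, None] * A[None]
          * (B[:, obs[1:]].T * beta[1:])[:, None, :])
    xi /= xi.sum(axis=(1, 2), keepdims=True)
    # M-step: re-estimate parameters from the expected counts
    pi_new = gamma[0]
    A_new = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    B_new = np.stack([gamma[obs == k].sum(axis=0) for k in range(B.shape[1])],
                     axis=1)
    B_new /= gamma.sum(axis=0)[:, None]
    return pi_new, A_new, B_new
```

Iterating this update converges to a local optimum of P(O|λ).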
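Finally, a sketch of the decoding problem via the Viterbi algorithm in log space (to avoid numerical underflow on long sequences); the notebook's Problem 3 code may differ:

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Most likely hidden-state path argmax_q P(q, O | lambda), in log space."""
    obs = np.asarray(obs)
    T, N = len(obs), len(pi)
    log_A, log_B = np.log(A), np.log(B)
    delta = np.zeros((T, N))           # best log-probability of a path ending in state i
    psi = np.zeros((T, N), dtype=int)  # backpointers to the best predecessor
    delta[0] = np.log(pi) + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A   # (N, N): from-state x to-state
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    # Backtrack from the best final state
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1, path[t + 1]]
    return path
```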
Please see the Jupyter notebook HMM.ipynb directly for more details.
Project ongoing (checklist):
- tensor version of basic HMM ... done.
- discrete observation space ... done.
- D-trajectories version of HMM ... done.
- Problems 1 and 2 ... done.
- Problem 3 ... partially done.
- continuous version (with Gaussian kernel) ... partially done.
- D-trajectories, continuous version.
- Monitor the likelihood in Problem 2 (see the sketch after this list).
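For that last item: since Baum-Welch is an EM algorithm, log P(O|λ) is guaranteed not to decrease between iterations, so logging it is a cheap correctness check. A hedged sketch, reusing the hypothetical `forward` and `baum_welch_step` helpers from the sketches above:

```python
import numpy as np

def fit(pi, A, B, obs, n_iter=50):
    """Run Baum-Welch and record log P(O | lambda) after each iteration."""
    log_likelihoods = []
    for _ in range(n_iter):
        pi, A, B = baum_welch_step(pi, A, B, obs)  # sketched above
        log_likelihoods.append(np.log(forward(pi, A, B, obs)[-1].sum()))
    # Under EM, log_likelihoods should be (numerically) non-decreasing.
    return (pi, A, B), log_likelihoods
```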