Probabilistic_Graphical_Models

Python implementations of classic probabilistic graphical models.

HMM (Hidden Markov Model)

I implement a tensor version (using NumPy) of the HMM for its three classic problems. Wherever a computation is not inherently sequential/recursive, I bypass explicit for-loops with tensor operations, so the NumPy implementation can easily be ported to TensorFlow or PyTorch. A minimal sketch of the vectorization idea follows.
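The snippet below illustrates this on the emission-probability lookup; the dimensions, variable names, and random data are assumptions for illustration, not the notebook's exact code.

```python
import numpy as np

# Assumed dimensions: N hidden states, M observation symbols, T time steps.
N, M, T = 3, 4, 10
rng = np.random.default_rng(0)
B = rng.dirichlet(np.ones(M), size=N)   # emission matrix, shape (N, M)
O = rng.integers(0, M, size=T)          # discrete observation sequence, shape (T,)

# Loop version: look up emission probabilities one time step at a time.
emis_loop = np.empty((T, N))
for t in range(T):
    emis_loop[t] = B[:, O[t]]

# Tensor version: a single fancy-indexing operation replaces the loop,
# which translates directly to gather operations in TensorFlow/PyTorch.
emis_vec = B[:, O].T                    # shape (T, N)

assert np.allclose(emis_loop, emis_vec)
```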

  1. The Evaluation problem (forward/backward messages): given λ = (π, A, B) and an observation sequence O = (o_1, o_2, ..., o_T), compute P(O | λ). A minimal forward-pass sketch follows this list.

  2. The Learning problem (Baum-Welch algorithm, a.k.a. the EM algorithm): given an observation sequence O = (o_1, o_2, ..., o_T), estimate the best-fit parameters λ = (π, A, B).

  3. The Decoding problem (Viterbi algorithm): given λ = (π, A, B) and an observation sequence O = (o_1, o_2, ..., o_T), find the most likely sequence of hidden states. A sketch appears further below.
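For Problem 1, here is a minimal forward-pass sketch. The function name, argument shapes, and conventions are assumptions for this example, not the exact API of HMM.ipynb; only the time recursion remains sequential, and each step is a single matrix-vector product.

```python
import numpy as np

def forward_likelihood(pi, A, B, O):
    """Forward pass for Problem 1 (evaluation): returns P(O | lambda).

    pi : (N,)   initial state distribution
    A  : (N, N) transition matrix, A[i, j] = P(state_{t+1} = j | state_t = i)
    B  : (N, M) emission matrix,   B[i, k] = P(observation k | state i)
    O  : (T,)   discrete observation sequence (integer symbols)
    """
    alpha = pi * B[:, O[0]]                # alpha_1(i) = pi_i * b_i(o_1)
    for t in range(1, len(O)):             # the time recursion is inherently sequential,
        alpha = (alpha @ A) * B[:, O[t]]   # but each step is one matrix-vector product
    return alpha.sum()                     # P(O | lambda) = sum_i alpha_T(i)
```

The backward messages are computed analogously by recursing from t = T down to t = 1.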

Please see the Jupyter notebook HMM.ipynb for more details.
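For Problem 3, a minimal Viterbi sketch is given below, using the same assumed conventions as the forward pass above and working in log space for numerical stability; it is an illustration, not the notebook's implementation.

```python
import numpy as np

def viterbi(pi, A, B, O):
    """Most likely hidden-state sequence for Problem 3 (decoding)."""
    T, N = len(O), len(pi)
    log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)

    delta = np.empty((T, N))               # best log-probability of a path ending in each state
    psi = np.empty((T, N), dtype=int)      # back-pointers
    delta[0] = log_pi + log_B[:, O[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A   # shape (N, N): from-state x to-state
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, O[t]]

    # Backtrack from the best final state.
    states = np.empty(T, dtype=int)
    states[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        states[t] = psi[t + 1, states[t + 1]]
    return states
```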

Project ongoing (checklist):

  • Tensor version of basic HMM ... done.
  • Discrete observation space ... done.
  • D-trajectories version of HMM (multiple observation sequences) ... done.
  • Problems 1 and 2 ... done.
  • Problem 3 ... partially done.
  • Continuous-observation version (with Gaussian kernel) ... partially done.
  • D-trajectories, continuous version.
  • Monitor the likelihood in Problem 2 (a sketch of one monitored EM step follows this list).
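For Problem 2, the sketch below shows one Baum-Welch (EM) iteration on a single discrete sequence and returns the log-likelihood of the current parameters, which is the quantity one would monitor across iterations. Shapes and names are assumptions; the messages are left unscaled for brevity, so long sequences would need the usual scaling.

```python
import numpy as np

def baum_welch_step(pi, A, B, O):
    """One Baum-Welch (EM) iteration for Problem 2 on a single discrete sequence.

    Returns updated (pi, A, B) and the log-likelihood log P(O | lambda)
    of the *current* parameters, which can be monitored across iterations.
    """
    T, (N, M) = len(O), B.shape

    # E-step: forward and backward messages (unscaled, for brevity).
    alpha = np.empty((T, N))
    beta = np.empty((T, N))
    alpha[0] = pi * B[:, O[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, O[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, O[t + 1]] * beta[t + 1])

    likelihood = alpha[-1].sum()                                    # P(O | lambda)

    gamma = alpha * beta / likelihood                               # (T, N) state posteriors
    xi = (alpha[:-1, :, None] * A[None] *
          (B[:, O[1:]].T * beta[1:])[:, None, :]) / likelihood      # (T-1, N, N) pair posteriors

    # M-step: re-estimate parameters from expected counts.
    pi_new = gamma[0]
    A_new = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    obs_onehot = np.eye(M)[O]                                       # (T, M) one-hot observations
    B_new = (gamma.T @ obs_onehot) / gamma.sum(axis=0)[:, None]
    return pi_new, A_new, B_new, np.log(likelihood)
```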
