Hacker News

You've probably heard of Hidden Markov Models. They're widely used in many machine learning applications. An HMM is just a simple, easy-to-compute Bayesian inference network.
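To make "easy-to-compute" concrete, here is a minimal sketch of HMM inference, the forward algorithm, with made-up transition and emission tables (the numbers are illustrative, not from any real dataset):

```python
# Forward algorithm for a toy 2-state HMM (illustrative numbers only).
# Hidden states: Rainy, Sunny; observations: walk=0, shop=1, clean=2.
start = [0.6, 0.4]                           # P(state_0)
trans = [[0.7, 0.3], [0.4, 0.6]]             # P(state_t | state_{t-1})
emit  = [[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]]   # P(obs | state)

def forward(obs):
    """Return P(observation sequence), summing over all hidden-state paths."""
    alpha = [start[s] * emit[s][obs[0]] for s in range(2)]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in range(2)) * emit[s][o]
                 for s in range(2)]
    return sum(alpha)

print(forward([0, 1, 2]))  # likelihood of observing walk, shop, clean
```

The whole computation is a handful of multiplications and additions per time step, which is why HMMs scaled so well once the data was there.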

The idea of Markov chains predates Pearl. His work was a demonstration of the accuracy and power of Bayesian inference networks. He revitalized that idea at a time when programmers were just beginning to have the data and processing power to apply machine learning.

The rest is history.



While you are correct that HMMs are an example of a Bayesian network, the essence of Pearl's foundational contributions lies in: http://en.wikipedia.org/wiki/Belief_propagation and http://en.wikipedia.org/wiki/Markov_blanket.

Bayesian Networks tend to be used not as classifiers but as a tool to explore joint probability distributions.

Interestingly related to your topic, HMMs and Naive Bayes are related in that HMMs are kinda like the sequence-sensitive version. HMMs and Naive Bayes are generative models: they both model/estimate a joint probability on the data with very strong conditional independence assumptions. Whereas logistic regression and Conditional Random Fields estimate the conditional probability of the output/labels directly.

HMMs are to Naive Bayes as linear-chain CRFs are to Logistic Regression. CRFs are state of the art for sequence and time-series prediction; I have not yet gotten my head around them, though. The relationship between logistic regression and naive Bayes is not commonly known (although the comparison of log reg to a simple neural network is common). Knowing when logistic regression outperforms Naive Bayes is useful (simple rule of thumb: logistic regression is less sensitive to the independence assumption, so with more data use log reg, with less data use naive Bayes). I've implemented a multi-class sparse regularized logistic regression: SMLR. It's up there with linear SVMs but simpler, and it also gives a probability.
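One way to see the Naive Bayes / logistic regression relationship: the log-odds of a Bernoulli Naive Bayes classifier are linear in the features, which is exactly the functional form logistic regression fits directly. A minimal sketch, with invented per-feature probabilities:

```python
import math

# Toy Bernoulli Naive Bayes: models the JOINT P(x, y) with independent
# binary features. All probabilities below are made up for illustration.
prior = {0: 0.5, 1: 0.5}
p_feat = {0: [0.1, 0.5, 0.2],   # P(feature_j = 1 | class 0)
          1: [0.8, 0.5, 0.6]}   # P(feature_j = 1 | class 1)

def nb_log_odds(x):
    """log P(y=1|x) - log P(y=0|x). Note the result is LINEAR in x:
    a weighted sum of the features plus a bias, same shape as log reg."""
    s = math.log(prior[1] / prior[0])
    for j, xj in enumerate(x):
        p1, p0 = p_feat[1][j], p_feat[0][j]
        s += xj * math.log(p1 / p0) + (1 - xj) * math.log((1 - p1) / (1 - p0))
    return s

def nb_prob(x):
    """Posterior P(y=1|x): the sigmoid of the log-odds, as in log reg."""
    return 1.0 / (1.0 + math.exp(-nb_log_odds(x)))

print(nb_prob([1, 0, 1]))  # > 0.5, so classify as class 1
```

The difference is how the weights are obtained: Naive Bayes derives them from class-conditional counts (generative), while logistic regression optimizes them against the conditional likelihood directly (discriminative), which is why it tolerates violated independence assumptions better given enough data.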


PG's old spam filter (http://www.paulgraham.com/spam.html) is another example of applied Bayesian inference. Not directly related to Pearl's work, but he basically deserves credit for the wider renaissance in Bayesian techniques.
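The combining step in that essay is essentially a naive-Bayes product over per-token spam probabilities, assuming equal priors. A sketch with invented token probabilities:

```python
# Sketch of the combining rule described in "A Plan for Spam":
# merge per-token spam probabilities under naive independence and
# equal priors. The input probabilities here are invented examples.
def combine(probs):
    """Return P(spam | tokens) from individual token spam probabilities."""
    p = 1.0  # product of P(token | spam) evidence
    q = 1.0  # product of P(token | ham) evidence
    for x in probs:
        p *= x
        q *= 1.0 - x
    return p / (p + q)

print(combine([0.99, 0.9, 0.2]))  # two spammy tokens outweigh one hammy one
```

In the essay the filter only feeds in the fifteen or so most "interesting" tokens (those farthest from 0.5), which keeps the product from being swamped by neutral words.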


> Not directly related to Pearl's work, but he basically deserves credit for the wider renaissance in Bayesian techniques.

Post hoc ergo propter hoc? The success of naive Bayes classifiers for spam filtering must have improved popular perceptions of Bayesian techniques, but let us not go overboard.



