
For clarification, Murphy's first book is just Machine Learning: A Probabilistic Perspective. This is his newest, two-volume work, Probabilistic Machine Learning, which is broken into two parts: an Introduction (published March 1, 2022) and Advanced Topics (expected to be published in 2023, with a draft preview available now).

To answer your question: this book is even more complete than, and a bit improved over, the first book. I don't believe there's anything in Machine Learning that isn't well covered in, or correctly omitted from, Probabilistic Machine Learning. It also has the benefit of a few more years of rethinking these topics. So between the existing Murphy books, Probabilistic Machine Learning: An Introduction is probably the one you should have.

Why this over Bishop? (I'm not sure it's a clear win.) While on the surface they are very similar (mathematical overviews of ML from a very probability-focused perspective), they function as very different books. Murphy is much more of a reference to contemporary ML. If you want to understand how most leading researchers think about and understand ML, and want a reference covering the mathematical underpinnings, this is the book to have.

Bishop is a much more opinionated book, in that Bishop isn't just listing out all possible ways of thinking about a problem, but really building out a specific view of how probability relates to machine learning. If I'm going to sit down and read a book, it's going to be Bishop, because he has a much stronger voice as an author and thinker. However, Bishop's book is now more than 10 years old and misses out on nearly all of the major progress we've seen in deep learning. That's a lot to be missing, and it won't be rectified in Bishop's perpetual WIP book [0].

A better comparison is not Murphy to Murphy or Murphy to Bishop, but Murphy to Hastie et al. For many years The Elements of Statistical Learning was the standard reference for advanced ML, especially during the brief time when GBDTs and Random Forests were the hot thing (which they still are to an extent in some communities). I really enjoy EoSL, but it does have a very "Stanford Statistics" feel to the intuitions (which I find even more aggressively Frequentist than your average Frequentist). Murphy really captures the contemporary computer science/Bayesian understanding of ML that has dominated the top research teams for the last few years. It feels much more modern and should be the replacement reference text for most people.

0. https://www.mbmlbook.com/



I read TESL during my Master's, and I remember being very confused by the way it described decision tree learning. I had been pleased with myself for having a strong grip on decision tree learning before reading TESL, and then was thoroughly confused after reading about it there.

Eyeballing the relevant chapter again (9.2), I think that may have been because it introduces decision tree learning with CART (the algorithm), whereas I was more familiar with ID3 and C4.5. Perhaps it's simpler to describe CART as TESL does, but decision trees are a propositional logic "model" (in truth, a theory), and for me the natural way to describe them is as a propositional logic "model" (theory). I also get the feeling that Quinlan's work is sidelined a little, perhaps because he was coming from a more classical AI background and that's pooh-poohed in statistical learning circles. If so, that's a bit of a shame and a bit of an omission. Machine learning is not just statistics and it's not just AI; it's a little bit of both, and one needs at least some background in both subjects to understand what's really going on. But perhaps it's the data mining / data science angle that I find a bit one-sided.
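As a sketch of that propositional reading (a hypothetical toy tree, not an example from any of these books): each root-to-leaf path is a conjunction of feature literals, and the learned concept is the disjunction of the paths ending in a positive leaf, i.e. a DNF formula.

```python
# Toy illustration: reading a decision tree as a propositional theory.
# A node is (feature, subtree_if_true, subtree_if_false); a leaf is a bool label.
# Hypothetical tree over boolean features "outlook_sunny" and "windy":
tree = ("outlook_sunny",
        ("windy", False, True),   # sunny: play only if not windy
        True)                     # not sunny: play

def paths(node, so_far=()):
    """Yield (conjunction_of_literals, leaf_label) for every root-to-leaf path."""
    if not isinstance(node, tuple):                        # reached a leaf
        yield so_far, node
        return
    feat, if_true, if_false = node
    yield from paths(if_true, so_far + ((feat, True),))    # test succeeded
    yield from paths(if_false, so_far + ((feat, False),))  # test failed

# The positive class as DNF: one conjunct per True-labelled path.
dnf = [conj for conj, label in paths(tree) if label]
for conj in dnf:
    print(" AND ".join(f"{feat}={val}" for feat, val in conj))
# prints:
#   outlook_sunny=True AND windy=False
#   outlook_sunny=False
```

ID3/C4.5-style trees over nominal attributes fit this reading directly; CART's presentation in terms of recursive binary partitioning of the input space is equivalent for boolean features, but the logical structure is less front-and-centre.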

Sorry to digress. I'm so excited when people discuss actual textbooks on HN.


I’m in agreement with much of your post. The Elements of Statistical Learning played its role quite well years ago but a fresher take is needed. Thanks for the response.


Echoing others, thank you for writing this (as someone doing an applied math masters and digging into ML - I have used ESL for a class but not the others you mention)


Comments like these are why I come to HN. Thank you.



