Published in September 2012 by Cambridge University Press.

[ CUP (offering 20% discount on list price) | Google Books ]

- (December 2014) Now more than 5,000 copies sold!
- (December 2013) Full set of lecture slides in PDF and LaTeX beamer

This book is an introductory text on machine learning. It is written so that it can serve as a textbook for an advanced undergraduate or graduate course, while also addressing interested academics and professionals with a background in neighbouring disciplines. The material includes the necessary mathematical detail, but the emphasis is on intuition and practical know-how.

The challenge in writing an introductory machine learning text is to do justice to the incredible richness of the machine learning field without losing sight of its unifying principles. One way this book achieves that is through separate and extensive treatment of tasks and features, both of which are common to all machine learning approaches. Covering a wide range of logical, geometric and statistical models, the book is one of the most comprehensive machine learning texts around.

For excerpts and lecture slides click here; also see the Table of Contents below.

- Excellent introductory text with significant depth for professionals: future classic...
- The author does an *exceptionally* good job using graphics to build intuitions about what algorithms are used for and how they work...
- What an amazing book, I got it about a month ago for a self-study routine and every page of this book has been a joy...
- When it comes to Machine Learning, it is usually difficult for me to really understand the science at a deep level. This book changed that for me...
- One bright spot in the otherwise pitch black landscape of machine learning books...

- Prologue: A machine learning sampler
- 1 The ingredients of machine learning
- 2 Binary classification and related tasks
- 3 Beyond binary classification
- 4 Concept learning
- 5 Tree models
- 6 Rule models
- 7 Linear models
- 8 Distance-based models
- 9 Probabilistic models
- 10 Features
- 11 Model ensembles
- 12 Machine learning experiments
- Epilogue: Where to go from here
- Important points to remember
- Bibliography
- Index

[Word cloud of the book's most frequent terms (classification, clustering, decision trees, features, probabilistic models, ranking, regression, ROC curves, and so on), created at TagCrowd.com]

Please use this link to submit feedback.

Last change: 8 May, 2015