## Customer reviews

*4.7 out of 5 stars*

460 customer ratings

Format: Hardcover

Price: $58.12 + free shipping with Amazon Prime


#### Top positive review

*5.0 out of 5 stars*
This is the easy book from Hastie, et al. on Statistical Learning (Machine Learning)
#### Top critical review

*3.0 out of 5 stars*
Serves its purpose, but please do not learn R through this text


*5.0 out of 5 stars*
This is the easy book from Hastie, et al. on Statistical Learning (Machine Learning)
Reviewed in the United States on December 16, 2017


In 2009, Stanford Statistics professors Hastie/Tibshirani/Friedman wrote 'The Elements of Statistical Learning', a book that demands a Master's or Doctoral level knowledge of Mathematical Statistics. Years ago, as a part of earning my MS Mathematics, I passed a doctoral-level qualifying examination in Mathematical Statistics. But that was years ago and I needed a friendly refresher before reading 'Elements', which is gathering dust on my shelf.

Well, I'm lucky (and probably so are you) because in 2013 Stanford Statistics professors James/Witten/Hastie/Tibshirani wrote this simpler 'An Introduction to Statistical Learning' that requires only a Bachelor's degree in Mathematics or Statistics. If you have that math grounding, then this is a wonderful book to start your Statistical Learning. The book offers a clear application of Mathematical Statistics and the programming language R to Statistical Learning. At the end of each chapter, the authors provide 10-15 questions to test whether you've digested the material.

Only a few times have I needed to review my Hogg/Craig 'Introduction to Mathematical Statistics'. If you want an excellent book on Mathematical Statistics to prepare you for both 'Introduction to Statistical Learning' and 'The Elements of Statistical Learning', buy the 7th edition of 'Introduction to Mathematical Statistics' by Hogg/McKean/Craig, which is typically used for a year-long (2 semesters) class for 1st or 2nd year graduate students in Mathematics or Statistics. In fact, you could simply bone up on Hogg/McKean/Craig, skip 'Introduction to Statistical Learning', and go straight to the more challenging 'Elements of Statistical Learning'. I wanted to digest some Statistical Learning asap and probably so will you. Enjoy.


73 people found this helpful

Reviewed in the United States on December 2, 2018

I think this textbook does well at providing basic intuitions for algorithms to those who do not have a strong math background, but I don't appreciate the quality of the R code. (My criticism has nothing to do with avoiding modern paradigms, such as the tidyverse.)

As one example, I have made it a personal rule never to use the subset argument of lm(), even though it is used throughout this entire text. Why? I was curious one day and decided to compare subsetting the data argument with passing the indices to the subset argument.

It turns out that the two approaches gave different results. (See StackOverflow, with q/46939063/ appended to the link.) After asking around on Cross Validated as well (q/309931 appended to the URL of Cross Validated), I concluded that using the subset argument of lm() was bad advice.

Now, in prediction, this issue doesn't occur. But if you're planning on using lm() to interpret parameter estimates, don't follow this textbook's advice.
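One way such a discrepancy can arise (an assumption on my part, not necessarily the reviewer's exact case) is with data-dependent basis functions such as poly(): R's model.frame() evaluates the formula terms on the full data and only then applies the subset, so the two calls fit against different bases. A minimal R sketch, with illustrative variable names:

```r
# Compare lm()'s subset argument with subsetting the data frame beforehand.
set.seed(1)
df <- data.frame(x = rnorm(100))
df$y <- df$x + df$x^2 + rnorm(100)
idx <- 1:50

# poly() basis built from all 100 rows, then rows are dropped:
fit_arg <- lm(y ~ poly(x, 2), data = df, subset = idx)

# poly() basis built from the 50 retained rows only:
fit_dat <- lm(y ~ poly(x, 2), data = df[idx, ])

coef(fit_arg)  # the two sets of coefficient estimates differ,
coef(fit_dat)  # because they refer to different orthogonal bases
```

Because both bases span the same polynomial space, the fitted values and predictions agree, which is consistent with the observation that prediction is unaffected while the parameter estimates are not comparable.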


7 people found this helpful


Showing 1-10 of 232 reviews


Reviewed in the United States on February 13, 2014

This is a wonderful book written by luminaries in the field. While it is not for casual consumption, it is a relatively approachable review of the state of the art for people who do not have the hardcore math needed for The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition (Springer Series in Statistics). This book is the text for the free Winter 2014 MOOC run out of Stanford called StatLearning (sorry Amazon will not allow me to include the website). Search for the class and you can watch Drs. Hastie and Tibshirani teach the material in this book.

Reviewed in the United States on June 4, 2017

This book will not help you understand the ESL book (Elements of Statistical Learning).

If you are already programming ML a lot and want to step up your ML math, but find ESL too hard because it is not self-contained and uses too much graduate stats terminology, then do not fall for the reviewers who recommend reading ISL (Introduction to Statistical Learning) instead. ISL does not contain the explanations missing from ESL. In fact, it does not explain the math at all; instead, it gives a very broad overview of statistical methods that overlap with ML.

Then who is this book for? This book is for someone who has juuust started learning ML, like having completed the Coursera ML course or having started using Python's scikit-learn.

The book is well-written, though. It is not self-contained, because it does not explain the math but merely gives a minimum of intuition behind it.


Reviewed in the United States on October 24, 2013

The book provides the right amount of theory and practice, unlike the earlier (venerable and, by now, stable) text authored (partly) by the last two authors of this one (Elements of Statistical Learning), which was/is a little heavy on the theoretical side (at least for practitioners without a strong mathematical background). The authors make no pretense about this either. The Preface says "But ESL is intended for individuals with advanced training in the mathematical sciences. An Introduction to Statistical Learning (ISL) arose from the perceived need for a broader and less technical treatment of these topics."

ISL is neither as comprehensive nor as in-depth as ESL. It is, however, an excellent introduction to Learning, due to the authors' ability to strike a perfect balance between theory and practice. Theory is there to help the reader understand the purpose of each method, and the "R Labs" at the end of each chapter are as valuable as (or perhaps even more valuable than) the end-of-chapter exercises.

ISL is an excellent choice for a two-semester advanced undergraduate (or early graduate) course, for practitioners trained in classical statistics who want to enter the Learning space, and for seasoned Machine Learners. It is especially helpful for getting the fundamentals down without being bogged down in heavy mathematical theory, a great way to kick off corporate Learning units, or an aid to help statisticians and learners communicate better.

A needed and welcome addition to the Learning literature, authored by some of the most well-respected names in industry and academia. A classic in the making. Recommended unreservedly.

____________________________________________

UPDATE (12/17/2013): Two of the authors (Hastie & Tibshirani) are offering a 10-week free online course (StatLearning: Statistical Learning) based on this book found at Stanford University's Web site (Starting Jan. 21, 2014). They also say that "As of January 5, 2014, the pdf for this book will be available for free, with the consent of the publisher, on the book website." Amazing opportunity! Enjoy!

____________________________________________

UPDATE (04/03/2014): I took the course above and found it very helpful and insightful. You don't need the course to understand the book. If anything, the course videos are less detailed than the book. It is certainly nice, though, to see the actual authors explain the material. Also, the interviews by Efron and Friedman were a nice touch. The course will be offered again in the future.


Reviewed in the United States on January 24, 2018

This is a wonderful book for an intro to the world of statistical learning. As an engineering student, I found it very approachable and readable. It took me 2 days to finish all the chapters, without the exercises. Reading through the chapters is much more enjoyable than reading other math/stat books, since the ideas behind each model or algorithm are very clear, even intuitive, and a lot of well-made plots make understanding even easier. I would recommend it to anyone who wants to enter the world of statistical learning.

However, as a graduate-level student, I would say this book is more suitable for an undergrad student in stat or a related field, for practitioners, or for an entry-level graduate student who is not majoring in stat or math. The ideas are much more intuitive than rigorous. If you use only this book on a real-world problem, even though it discusses cross-validation and other slightly more involved topics, you may either run into many problems in the statistical analysis or come to a wrong conclusion. I'm not saying the methods in this book are wrong, but without a deep understanding of the theory and the rigorous assumptions behind the methods, blindly trying different algorithms to find the lowest MSE may not be suitable in some cases.

Still, this is a wonderful book for two cases:

1. If you have some background in theoretical or mathematical statistics and want to gain some knowledge of applied methods, this book will be wonderful for finding applications of your theoretical knowledge;

2. If you have little knowledge of rigorous statistics but want to enter the world of statistical/machine learning, this one is very suitable for sparking your interest in reading deeper and more rigorous books, such as ESL.

For me, this book is more like a ticket. I have a ticket to a beautiful state park. I use it to get through the gate, but I stand near the gate and take in an overview of the park's beautiful scenery. The map printed on the ticket contains only the park's main road. If you want to see more beautiful scenery, you need more work, more tickets, and more tools to explore this park for quite a while.



Reviewed in the United States on February 15, 2018

Great book! I don't have a math/statistics/computer programming background. I have been using it to supplement my Statistics course with R. This book is very thorough in explaining the concepts, with outstanding explanations of "how" and "why". For example, it explains in good detail the summary output in R, and what to expect from a good model vs. a bad model and why. I've searched for these explanations on-line and the book beats them all. Explanations in the book are grounded in real life rather than purely theoretical, which is what I usually miss in classes and dry stats books. After reading this book it finally comes together for me. Now I don't punch in R code blindly; I know what to expect from the output. What is also nice is that the book does not overload you with formulas and proofs that mean nothing to a non-math major. Very glad I invested the money to buy it.

Reviewed in the United States on November 18, 2018

I love this book. I read it all, did the labs and thought through the exercises (i.e., I didn't do every one of them); and I will likely end up re-reading guidance provided on some of the topics several times. By "Goldilocks", I mean this book provides a level of explanation, mathematical basis and practical consideration that is "just right" for where I sit on the continuum of data science practitioners. As an expert analyst, I need to understand which concepts, models and algorithms are applicable given the business objective and constraints (e.g. available data, time frame and forum in which to provide insight). I also need to understand what assumptions I am making when I choose an approach, as well as the tradeoffs. We know there are always tradeoffs!

That is the level of understanding this book provides for all approaches covered. Where the authors provide the mathematical basis for an approach, they specify the model; and they stop short of *proving* the model. This, for me, is ideal.

About a year or so back, I began reading Elements of Statistical Learning, and I could not help but think "Do I really need to know 'this' at 'that' level to accomplish my goals?" When I discovered this book, Introduction to Statistical Learning (thanks to Amazon recommendations), I knew I'd found Goldilocks :-)


Reviewed in the United States on April 19, 2018

For those with a few solid courses in statistics, this is the kind of book that can patch holes and lead to a solid foundation necessary to dive into machine learning and data science. I would strongly recommend it as both a desk reference and (re) training tool.

The kind of book that I would recommend anyone know cover to cover before considering themselves "seasoned".

A few negatives: the writing isn't perfect, especially in the examples, and R has also evolved since publication. It is better to treat the code snippets as the general idea, and possibly seek outside help if you have trouble implementing or even interpreting some of them.

I can see how the way code is displayed could be a disservice to beginners, which is unfortunate, as otherwise the book is both accessible and comprehensive.


Reviewed in the United States on December 29, 2016

ISLR is possibly the best book I've ever encountered on the subject of statistical learning. Throughout my studies, I spent countless days reading more complicated theoretical texts, but few have stuck, and I've never fully understood how to translate the theory to code (specifically, in R). This is probably because my background in statistics was limited to a single prerequisite undergraduate course. I have been waiting a decade to find a book like this, containing basic theory (with plenty of figures), some math, code snippets and reproducible examples. If you are completing a graduate degree in any field where data is collected en masse, I *strongly* recommend this book. I think graduate advisors should make it required reading in the first year of graduate school for students who will be performing computational data analysis.

ISLR teaches basic regression techniques for prediction and classification and formally explains sampling methods (cross-validation, bootstrapping). The figures are great and the code examples make it very easy to apply the lessons in your own studies. This book formalizes what took me years to learn by diffusion from watching seminars and reading papers and blogs. 5 stars!

Bear in mind that this book is not a solid introduction to the R programming language. To learn R, study online courses or the "swirl" package. Once you've mastered the basics, look for the book "Advanced R".
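For the "swirl" route, getting started looks roughly like this (a minimal sketch; swirl is a CRAN package that runs interactive lessons inside the R console):

```r
# One-time install of the interactive tutorial package from CRAN,
# then launch it and follow the in-console prompts to pick a course.
install.packages("swirl")
library(swirl)
swirl()
```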

