There is an old adage: give a man a fish and he will eat once, but teach a man to fish and he will eat forever. This book will definitely get you catching fish, though it perhaps leaves out how to clean and prepare the fish after it has been caught.
When flipping through the book using the preview feature, it looked like it went straight to the matter of explaining the algorithms (which is great) and giving examples of each algorithm written in Julia (more on this later). I have only read about half of the book so far, but I would say the material is written to get you up and running quickly with algorithms for optimization, and I have been impressed so far.
I will contrast this book with Nocedal and Wright (the only other optimization book that I own), and relate it to my opening paragraph. Nocedal and Wright is a really tough book to read. For better or worse, it focuses on the excruciating details of many of the algorithms. It contains many proofs, and it generally does not deliver something that you can code up quickly. This book will get you going quickly, but it skips much of the nuanced formalism of Nocedal and Wright. This might mean that if one of the algorithms from this book isn't working for you, the explanations here may not be sufficient to build a truly robust solution for your problem. So in this regard, I see these two books as complementing each other: one delivers working pseudocode, and the other provides a much more detailed description of the theory.
Another reason that I was interested in this book is that its algorithms are written in Julia, a language that seems intriguing given what I do for a living, which is signal processing and algorithm development. Because of this, I do a lot of work in Matlab. Matlab is for the most part pretty good if you know how to vectorize operations, but as soon as you need to string for loops together, you take a significant time penalty. Julia borrows syntax and concepts from Python and Matlab, and it is a JIT-compiled language as well. However, it reportedly compiles to machine code and should run pretty quickly. Given this, it piqued my interest in learning Julia. This book seemed like it would fulfill these two needs of mine at once, so win-win.
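To illustrate the point about loops (my own minimal sketch, not code from the book or a benchmark): an explicit for loop like the one below is idiomatic in Julia, because the JIT compiles it to machine code rather than interpreting it line by line.

```julia
# Sum of squares with a plain for loop. In Julia an explicit loop like
# this is compiled to machine code, so there is no need to vectorize it
# for performance the way one would in Matlab.
function sum_of_squares(xs)
    s = 0.0
    for x in xs
        s += x * x
    end
    return s
end

println(sum_of_squares([1.0, 2.0, 3.0]))  # 14.0
```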
Other features of this book:
- It looks like all the end-of-chapter problems are solved
- It provides code for many standard functions used to test optimization algorithms
- It provides a small tutorial for Julia in the appendix
It covers around 100 different algorithms for optimization. Probably more; I didn't count thoroughly. It describes algorithms and concepts with incredible clarity and extreme concision. It builds progressively from simple to complex. It provides all the background information needed beyond a basic calculus class and some basic familiarity with matrices and vectors. It provides code snippets written in Julia for all the algorithms. It includes exercises and answers, and other examples are presented throughout the text. It provides online resources that run in Jupyter notebooks with a Julia kernel.
This book refreshed my memory and introduced me to so many topics. In particular, I found the sections on automatic differentiation, computational graphs, optimization under constraints, multiobjective optimization, surrogate models, sampling plans, and expression optimization to be enlightening and in some cases revolutionary to me. Like, OMG, you can do that? Over and over I thought, "I'll just skip this section. It seems irrelevant to what I need to learn." And each time I thought that, I'd start reading the section and would get hooked. Almost every section was highly relevant and provided building blocks for a deeper understanding. The book clarified so many ideas for me: function approximation, Lagrange multipliers and their extensions, duality, Pareto optimality, uses of quasi-random sequences, surrogate models, and probabilistic grammars. All of these ideas will be useful in my current projects.
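For readers wondering what makes automatic differentiation feel so surprising, here is a minimal forward-mode sketch using dual numbers (my own illustration; the type, the operators overloaded, and the function f are hypothetical choices, not code from the book):

```julia
# A tiny forward-mode automatic differentiation sketch via dual numbers.
# Each Dual carries a value and the derivative of that value.
struct Dual
    val::Float64   # function value
    der::Float64   # derivative value
end

# Propagate derivatives through + and * using the usual calculus rules.
Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)

f(x) = x * x + x          # f(x) = x^2 + x, so f'(x) = 2x + 1

d = f(Dual(3.0, 1.0))     # seed the input's derivative with 1
println((d.val, d.der))   # (12.0, 7.0): f(3) and f'(3)
```

The derivative falls out of ordinary evaluation, with no symbolic manipulation and no finite differences, which is roughly the "OMG, you can do that?" moment.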
Julia was new to me. This language seems able to represent many loop structures and iteration processes in extremely compact form. Downloading and installing it and all the other Julia modules used by the book was straightforward (except the Vec package, which needed a bit more sleuthing).
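As an example of that compactness (my own sketch, not code from the book; the names, step size, and iteration count are illustrative choices), a complete gradient descent iteration fits in a handful of lines:

```julia
# Gradient descent on f(x) = (x - 3)^2, written compactly in Julia.
function minimize(grad; x0=0.0, lr=0.1, iters=100)
    x = x0
    for _ in 1:iters
        x -= lr * grad(x)   # one compact update per iteration
    end
    return x
end

grad(x) = 2 * (x - 3.0)     # gradient of (x - 3)^2, defined in one line

println(minimize(grad))     # converges toward the minimizer x = 3.0
```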
Don't be fooled, though. This is an introductory text, and based on the preface, it appears to be intended for undergraduate-level courses. You will not find proofs of the results presented in the book - that is not the goal of the book. Margin notes provide relevant references from the primary (and secondary!) literature. For example, I had to look up more about probabilistic prototype trees and learning algorithms for these structures; it was a snap to find the relevant primary literature. The book's real strength is in the sheer number of algorithms described.
Despite the comprehensive coverage, not all topics I was expecting were covered. I was hoping for something about expectation maximization and other latent variable methods. I also was hoping for more information about optimization with decision trees. MCMC was missing as well, although some Monte Carlo approaches were described; usually, the book advocated other methods over Monte Carlo approaches for more efficient optimization. Granted, this book is not intended as a machine learning book that might cover these missing topics in more detail. (BTW, the methods in the book can certainly be applied to machine learning problems.)
The book sort of just ends. A final synthesis chapter providing tables of the strengths, weaknesses, and areas of applicability of all the methods covered, or a chapter outlining current challenges and areas of research, would have been icing on the cake. As it stands, the reader must make this synthesis themselves; strengths and weaknesses are covered during the exposition of the various approaches, so it can be done with some discipline on the reader's part.
I've been wanting to learn the Julia language and optimization, and I learned about this book from the Julia language website. Great book! Really well made. I've been going through all the code examples to teach myself the methods and Julia at the same time. Such a great book, such a great language, and a fantastic combo.
As my university program focused on a totally different curriculum and I work on optimization problems quite frequently, I was looking for a book like this: an overview of algorithms and ideas in optimization, with short descriptions to fill in missing basics. The book doesn't go deep into topics (it's quite shallow), but at least you know what to search for, and it has references to papers, which is helpful for getting more details. This is a really nice book, and the Julia code is also really helpful.
The content and format of this book are outstanding. It covers concepts that I had previously only vaguely understood, with clear and concise explanations for a wide range of optimization and data-based modeling techniques.
This book is really good. It describes the algorithms very clearly and with decent figures. In addition, it provides a crash course in the Julia language, and the sample code in the main text is all in Julia. Highly recommended!