Optimization for Machine Learning: Finding Function Optima with Python by Jason Brownlee
- Publisher: COMPUTER SCIENCE
- Availability: In Stock
- SKU: 55576
- Number of Pages: 412
Price: Rs.875.00 (regular price Rs.1,195.00)
Tags: Artificial Intelligence, Computer Science, Data Science, Machine Learning, Optimization, Python
Optimization for Machine Learning: Finding Function Optima with Python
Optimization happens everywhere. Training a machine learning model is itself an optimization problem, and gradient descent is probably the most famous algorithm for solving it. Optimization means finding the best value of a function or model, which may be the maximum or the minimum according to some metric.
Using clear explanations, standard Python libraries, and step-by-step tutorial lessons, you will learn how to confidently find the optima of numerical functions using modern optimization algorithms.
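To make the idea concrete, here is a minimal gradient descent sketch on a one-dimensional quadratic. The function, learning rate, and iteration count are illustrative choices for this listing, not code taken from the book:

```python
# Minimal gradient descent on f(x) = x^2, whose minimum is at x = 0.

def f(x):
    return x ** 2

def df(x):
    return 2 * x  # derivative of f

x = 5.0              # starting point
learning_rate = 0.1

for _ in range(25):
    x = x - learning_rate * df(x)  # step against the gradient

print(f"minimum found near x = {x:.5f}, f(x) = {f(x):.5f}")
```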
Lessons Overview
Each lesson was designed to be completed in about 30 to 60 minutes by an average developer. Below is an overview of the 30 step-by-step tutorial lessons you will work through:
Part 1: Foundation
- Lesson 01: What is Function Optimization
- Lesson 02: Optimization and Machine Learning
- Lesson 03: How to Choose an Optimization Algorithm
Part 2: Background
- Lesson 04: No Free Lunch Theorem for Machine Learning
- Lesson 05: Local Optimization vs. Global Optimization
- Lesson 06: Premature Convergence
- Lesson 07: Creating Visualization for Function Optimization
- Lesson 08: Stochastic Optimization Algorithms
- Lesson 09: Random Search and Grid Search (see the sketch after this list)
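As a taste of Lesson 09, here is a minimal random search sketch over a one-dimensional objective. The objective function, bounds, and sample budget are illustrative assumptions, not the book's code:

```python
# Random search: sample candidates uniformly at random and keep the best.
import random

def objective(x):
    return (x - 2.0) ** 2  # minimum at x = 2

bounds = (-5.0, 5.0)
best_x, best_y = None, float("inf")

for _ in range(1000):
    x = random.uniform(*bounds)  # draw a uniform random candidate
    y = objective(x)
    if y < best_y:               # keep the best candidate seen so far
        best_x, best_y = x, y

print(f"best x = {best_x:.4f}, objective = {best_y:.6f}")
```

Random search makes no assumptions about the shape of the objective, which makes it a natural baseline to compare the more structured algorithms in later parts against.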
Part 3: Local Optimization
- Lesson 10: What is a Gradient in Machine Learning?
- Lesson 11: Univariate Function Optimization
- Lesson 12: Pattern Search: Nelder-Mead Optimization Algorithm (see the sketch after this list)
- Lesson 13: Second Order: The BFGS and L-BFGS-B Optimization Algorithms
- Lesson 14: Least Squares: Curve Fitting with SciPy
- Lesson 15: Stochastic Hill Climbing
- Lesson 16: Iterated Local Search
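Lessons 12 and 13 cover algorithms that ship ready-made in SciPy. As a sketch of what calling them looks like, here is `scipy.optimize.minimize` applied to a simple two-dimensional bowl; the objective function and starting point are illustrative assumptions:

```python
# Local optimization of a 2-D bowl with SciPy's Nelder-Mead and BFGS solvers.
import numpy as np
from scipy.optimize import minimize

def objective(v):
    x, y = v
    return x ** 2 + y ** 2  # global minimum at (0, 0)

x0 = np.array([1.5, -2.0])  # arbitrary starting point

for method in ("Nelder-Mead", "BFGS"):
    result = minimize(objective, x0, method=method)
    print(method, "->", result.x, "objective:", result.fun)
```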
Part 4: Global Optimization
- Lesson 17: Simple Genetic Algorithm from Scratch
- Lesson 18: Evolution Strategies
- Lesson 19: Differential Evolution
- Lesson 20: Simulated Annealing from Scratch (see the sketch below)
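To give a flavor of Part 4's from-scratch style, here is a bare-bones simulated annealing sketch in the spirit of Lesson 20. The objective, step size, temperature schedule, and iteration budget are illustrative assumptions, not the book's implementation:

```python
# Simulated annealing: accept worse moves with a probability that
# shrinks as the temperature falls, to escape local optima early on.
import math
import random

def objective(x):
    return x ** 2

x = random.uniform(-5.0, 5.0)  # current solution
best_x, best_y = x, objective(x)
step_size, n_iterations, initial_temp = 0.5, 1000, 10.0

for i in range(n_iterations):
    candidate = x + random.gauss(0.0, step_size)  # perturb the solution
    diff = objective(candidate) - objective(x)
    temp = initial_temp / (i + 1)                 # cooling schedule
    if diff < 0 or random.random() < math.exp(-diff / temp):
        x = candidate
    if objective(x) < best_y:
        best_x, best_y = x, objective(x)

print(f"best x = {best_x:.4f}, objective = {best_y:.6f}")
```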
Part 5: Gradient Descent
- Lesson 21: Gradient Descent Optimization from Scratch
- Lesson 22: Gradient Descent with Momentum (see the sketch after this list)
- Lesson 23: Gradient Descent with AdaGrad
- Lesson 24: Gradient Descent with RMSProp
- Lesson 25: Gradient Descent with Adadelta
- Lesson 26: Adam Optimization Algorithm
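Each Part 5 lesson extends the same basic update loop. As one example, here is a gradient descent with momentum sketch in the spirit of Lesson 22; the test function, step size, momentum factor, and iteration count are illustrative assumptions:

```python
# Gradient descent with momentum on f(x) = x^2.

def df(x):
    return 2 * x  # gradient of f(x) = x^2

x = 5.0
step_size = 0.1
momentum = 0.9
change = 0.0

for _ in range(100):
    # Blend the previous update into the new one so the search
    # accelerates along consistent directions and damps oscillation.
    change = momentum * change - step_size * df(x)
    x = x + change

print(f"x after descent: {x:.6f}")
```

AdaGrad, RMSProp, Adadelta, and Adam vary the same loop by adapting how each step is scaled, which is the thread the remaining Part 5 lessons follow.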
Part 6: Projects
- Lesson 27: Use Optimization Algorithms to Manually Fit Regression Models (see the sketch after this list)
- Lesson 28: Optimize Neural Network Models
- Lesson 29: Feature Selection using Stochastic Optimization
- Lesson 30: Manually Optimize Machine Learning Model Hyperparameters
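To hint at the shape of Lesson 27, here is a sketch that fits a line by stochastic hill climbing on mean squared error. The synthetic data, step size, and iteration budget are illustrative assumptions, not the book's project code:

```python
# Fit y = a*x + b by stochastic hill climbing on mean squared error.
import random

random.seed(1)
# Synthetic data drawn from a known line with Gaussian noise.
data = [(x, 3.0 * x + 1.0 + random.gauss(0.0, 0.5)) for x in range(20)]

def mse(a, b):
    return sum((y - (a * x + b)) ** 2 for x, y in data) / len(data)

a, b = 0.0, 0.0  # initial coefficients
best_err = mse(a, b)

for _ in range(5000):
    # Propose a small random perturbation of the coefficients.
    ca = a + random.gauss(0.0, 0.1)
    cb = b + random.gauss(0.0, 0.1)
    err = mse(ca, cb)
    if err <= best_err:  # keep the move only if it does not worsen MSE
        a, b, best_err = ca, cb, err

print(f"fit: y = {a:.3f}*x + {b:.3f}, mse = {best_err:.4f}")
```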
Appendix
- Appendix A: Getting Help
- Appendix B: How to Setup Your Python Environment
You can see that each part targets a specific learning outcome, and so does each tutorial within each part. This acts as a filter to ensure you focus only on the things you need to know to get to a specific result, without getting bogged down in the math or a near-infinite number of digressions.
The tutorials were not designed to teach you everything there is to know about each theory or technique. They were designed to give you an understanding of how they work, how to use them, and how to interpret the results in the fastest way I know how: learning by doing.
════ ⋆★⋆ ═══
Writer ✤ Jason Brownlee