What is optimization?

Optimization is about making choices that lead to the best possible outcomes. Mathematically, optimization is usually described with differential calculus, though when the functions are linear and the problem is well specified, substitution alone can yield optimal solutions. In mathematics, optimization first requires defining an objective function: a dependent function to be maximized or minimized by varying the independent variables that determine it. Solving an optimization problem therefore means identifying maximum or minimum points of the objective function and classifying each as a local or global extremum. Most computer solvers, however, do not solve optimization problems with calculus; instead they use a variety of iterative methods, such as the simplex method in linear programming or, in the machine learning context, step-based methods. That is to say, the computer takes a step toward a solution, tests whether a better nearby alternative exists, and then steps again. Eventually, the computer concludes it is as close to a solution as it can get and reports that point. In computer optimization, it is up to the programmer to determine whether the reported solution is local or global.
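The step-based idea described above can be sketched with a minimal gradient descent routine. This is an illustrative toy, not any particular solver: it minimizes the hypothetical objective f(x) = (x - 3)^2 + 2, whose global minimum sits at x = 3, by repeatedly stepping downhill and stopping when the steps become negligibly small.

```python
def f(x):
    """Objective function to minimize: a simple parabola."""
    return (x - 3) ** 2 + 2


def f_prime(x):
    """Derivative of f, used to decide which direction to step."""
    return 2 * (x - 3)


def gradient_descent(x0, learning_rate=0.1, tolerance=1e-8, max_steps=10_000):
    """Step toward the minimum until the step size falls below tolerance."""
    x = x0
    for _ in range(max_steps):
        step = learning_rate * f_prime(x)
        if abs(step) < tolerance:  # no meaningfully closer alternative: stop
            break
        x -= step                  # take a step toward the solution
    return x


x_min = gradient_descent(x0=0.0)
print(round(x_min, 4), round(f(x_min), 4))  # converges to x = 3, f(x) = 2
```

Because this parabola has a single minimum, the routine always finds the global optimum; for objectives with several valleys, the same code would settle into whichever local minimum the starting point x0 leads to, which is exactly why the programmer must judge whether a reported solution is local or global.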

How is optimization relevant?

Many modelling challenges are, at their root, optimization problems. This is the case, for example, in many process engineering challenges where production input mixes constitute the choice variables (decision variables) and are associated with different output levels. The same can be said for problems that seek to maximize return on investment or minimize risk. Optimization mathematics and programming are an integral part of data science because they bridge data collection with organizational direction change. What is the point of collecting data if you do not use it to improve your organization?
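A production-mix problem of the kind mentioned above can be written out concretely. The numbers here are entirely hypothetical: two products share 100 hours of machine time, each unit of product A takes 2 hours and earns 30, and each unit of product B takes 5 hours and earns 80. The unit counts are the decision variables, total profit is the objective, and a coarse brute-force search stands in for a real solver such as a linear programming routine.

```python
HOURS_AVAILABLE = 100  # shared machine-time constraint (hypothetical)


def profit(units_a, units_b):
    """Objective function: total profit from the chosen production mix."""
    return 30 * units_a + 80 * units_b


def hours_used(units_a, units_b):
    """Resource consumption of a candidate mix."""
    return 2 * units_a + 5 * units_b


# Enumerate every feasible mix and keep the one with the highest profit.
best = max(
    ((a, b) for a in range(51) for b in range(21)
     if hours_used(a, b) <= HOURS_AVAILABLE),
    key=lambda mix: profit(*mix),
)
print(best, profit(*best))  # (0, 20) 1600: product B earns more per hour
```

Brute force only works because the feasible set is tiny; a practical version of this problem would hand the same objective and constraint to a dedicated solver, but the structure, decision variables, objective, and constraints remain the same.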