Powell’s Method is a derivative-free optimization algorithm designed to minimize a multivariable function without computing gradients. It applies a sequence of one-dimensional minimizations (golden-section search) along a set of parameter directions, iteratively refining the solution. In this sense, it extends the golden-section method to multiple dimensions.
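Since the one-dimensional minimizer is the workhorse of the method, here is a minimal Python sketch of golden-section search, assuming the objective is unimodal on a bracketing interval \([a, b]\); the function name and tolerance are illustrative.

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal one-dimensional function f on the interval [a, b]."""
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi, approximately 0.618
    c = b - inv_phi * (b - a)         # interior point closer to a
    d = a + inv_phi * (b - a)         # interior point closer to b
    while abs(b - a) > tol:
        if f(c) < f(d):
            # The minimum lies in [a, d]; reuse c as the new d.
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:
            # The minimum lies in [c, b]; reuse d as the new c.
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2
```

Because the two interior points divide the interval in the golden ratio, one of them can be reused at every step, so each iteration costs only a single new function evaluation.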
Consider a function \( f: \mathbb{R}^n \to \mathbb{R} \) that we want to minimize. Like most optimization methods, Powell's method requires an initial guess \( X_0 \), a vector in \( \mathbb{R}^n \), where \( n \) is the dimension of the input space. The method minimizes \( f \) by minimizing it along one parameter direction at a time, and this sequential minimization over the parameter directions continues until the convergence criteria are met.
To understand the algorithm, consider a two-dimensional function whose parameter directions are \(x\) and \(y\); the algorithm generalizes to higher dimensions by adding more parameter directions. Assume that at some iteration \(k\) the current point is \(X_k = [x_k, y_k]\). Minimizing the function along the \(x\) direction yields an optimal value \(x^*\), and the current point is updated to \(X_{k+1} = [x^*, y_k]\). The algorithm then minimizes the function along the \(y\) direction starting from the updated point \(X_{k+1}\). The process continues until the convergence criteria are met or the maximum number of iterations is reached.
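As a concrete illustration, here is a minimal Python sketch of this alternating scheme, reusing the `golden_section_search` helper above. The bracketing interval, tolerance, function name, and the quadratic test function are assumptions chosen for illustration.

```python
def powell_2d(f, x0, y0, bracket=(-10.0, 10.0), tol=1e-6, max_iter=100):
    """Minimize f(x, y) by alternating golden-section line searches
    along the x and y parameter directions."""
    x, y = x0, y0
    for _ in range(max_iter):
        x_prev, y_prev = x, y
        # Line search along x with y held fixed: X_{k+1} = [x*, y_k].
        x = golden_section_search(lambda t: f(t, y), *bracket, tol=tol)
        # Line search along y starting from the updated point.
        y = golden_section_search(lambda t: f(x, t), *bracket, tol=tol)
        # Convergence criterion: the point stopped moving appreciably.
        if math.hypot(x - x_prev, y - y_prev) < tol:
            break
    return x, y

# Example: f(x, y) = (x - 1)^2 + 2 (y + 3)^2 has its minimum at (1, -3).
x_opt, y_opt = powell_2d(lambda x, y: (x - 1) ** 2 + 2 * (y + 3) ** 2, 0.0, 0.0)
print(x_opt, y_opt)  # approximately 1.0 -3.0
```

The algorithm can be summarized as follows: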