
Powell's Method

Powell's Method is a derivative-free optimization algorithm that minimizes a multivariable function without computing gradients. It applies a sequence of one-dimensional minimizations (golden-section search) along a set of parameter directions, iteratively refining the solution. In effect, it extends the golden-section search to multiple dimensions.

2. Method

Consider a function \( f: \mathbb{R}^n \to \mathbb{R} \) that we want to minimize. Like every optimization method, Powell's method requires an initial guess \( X_0 \), a vector in \( \mathbb{R}^n \), where \( n \) is the dimension of the input space. Powell's method minimizes \( f \) by minimizing it along one parameter direction at a time. This sequential minimization along each parameter direction continues until the convergence criteria are met.
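Since each directional minimization is a one-dimensional problem, a golden-section search can serve as the inner solver. Below is a minimal sketch of such a one-dimensional search; the name golden_section_search, the interval arguments, and the default tolerance are illustrative assumptions, and the routine assumes the function is unimodal on the given interval.

    import math

    def golden_section_search(g, a, b, tol=1e-6):
        """Return an approximate minimizer of g on [a, b], assuming g is unimodal there."""
        inv_phi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/phi, approximately 0.618
        c = b - inv_phi * (b - a)
        d = a + inv_phi * (b - a)
        while (b - a) > tol:
            if g(c) < g(d):
                # Minimum lies in [a, d]: shrink the interval from the right.
                b, d = d, c
                c = b - inv_phi * (b - a)
            else:
                # Minimum lies in [c, b]: shrink the interval from the left.
                a, c = c, d
                d = a + inv_phi * (b - a)
        return 0.5 * (a + b)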

3. Algorithm

To understand the Powell algorithm, consider a two-dimensional function with parameter directions \( x \) and \( y \); the algorithm generalizes to higher dimensions by adding more parameter directions. Assume that at a certain iteration \( k \), the current point is \( X_k = [x_k, y_k] \). While minimizing the function along the \( x \)-direction, the algorithm finds an optimal value \( x^* \), and the point is updated to \( X_{k+1} = [x^*, y_k] \). The algorithm then minimizes the function along the \( y \)-direction starting from the updated point \( X_{k+1} \). The process continues until the convergence criteria are met or the maximum number of iterations is reached. The algorithm can be summarized as follows (a code sketch is given after the list):

  1. Initialize:
    • Define the initial point \( X_0 = [x_0, y_0] \) and the function \( f(X) \).
    • Determine the search interval along each axis, based on the magnitude of \( x_0 \).
    • Define the maximum number of iterations \( max\_iter \) and the tolerance level \( \epsilon \).
  2. Iterate for \( k = 1, \dots, max\_iter \):
    • Evaluate the function value \( f_k = f(X) \).
    • First direction optimization (x-axis):
      • Minimize \( f(X) \) along the x-direction using golden section search.
      • Update \( X \) with the optimal value \( x^* \) found along the x-direction.
      • Check for convergence: if yes, stop.
    • Second direction optimization (y-axis):
      • Repeat golden section search along the y-direction with the updated \( X \).
      • Update \( X \) with the optimal value found along the y-direction.
      • Check for convergence: if yes, stop.
  3. Return: the final optimal point \( X \).
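Putting the steps together, the following is a minimal sketch of the two-dimensional coordinate-wise procedure described above, reusing the golden_section_search helper from the earlier sketch. The names powell_2d, span, max_iter, and eps, the fixed symmetric search interval around the current point, and the convergence test on the change in \( f \) are illustrative assumptions rather than prescribed choices.

    def powell_2d(f, x0, y0, span=10.0, max_iter=100, eps=1e-8):
        """Coordinate-wise minimization of f(x, y) using golden-section line searches."""
        x, y = x0, y0
        f_prev = f(x, y)
        for k in range(max_iter):
            # First direction optimization (x-axis): minimize f(., y) with y held fixed.
            x = golden_section_search(lambda t: f(t, y), x - span, x + span)
            # Second direction optimization (y-axis): minimize f(x, .) with the updated x.
            y = golden_section_search(lambda t: f(x, t), y - span, y + span)
            f_new = f(x, y)
            # Convergence check (an assumed criterion): stop when f barely changes.
            if abs(f_prev - f_new) < eps:
                break
            f_prev = f_new
        return x, y

    # Example usage on a quadratic bowl with minimum at (1, -2):
    # x_opt, y_opt = powell_2d(lambda x, y: (x - 1)**2 + (y + 2)**2, x0=5.0, y0=5.0)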

4. Discussion