The Golden Cut Method (also known as golden section search) is a derivative-free optimization algorithm used to find the minimum (or maximum) of a single-variable function on a bounded interval. It is a classical method in numerical optimization that provides guaranteed convergence without requiring gradient information.
Many real-world optimization problems involve functions that are continuous and unimodal over a certain interval but are either expensive or impossible to differentiate. The golden section method provides a systematic way to narrow down the search space and find the extremum using only function evaluations.
The method is based on dividing the interval into two parts so that the ratio of the whole interval to the larger part equals the ratio of the larger part to the smaller part; this common ratio is the golden ratio. As a result, each iteration reduces the search space by the same fixed factor.
The algorithm uses the mathematical constant known as the golden ratio \( \phi \), defined as:
\[ \phi = \frac{1 + \sqrt{5}}{2} \approx 1.618, \quad \text{and the derived constant} \quad \tau = 1 - \frac{1}{\phi} \approx 0.382 \]

These constants are used to divide the interval so that the reduction ratio is the same at every iteration, which avoids unnecessary function evaluations.
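A quick numerical check of these constants (plain Python, illustrative only):

```python
import math

# Golden ratio and the derived constant tau = 1 - 1/phi (= 2 - phi)
phi = (1 + math.sqrt(5)) / 2   # approximately 1.618
tau = 1 - 1 / phi              # approximately 0.382

print(phi, tau)
```

A useful identity here is \( 1/\phi = \phi - 1 \), which is why \( \tau = 2 - \phi \) gives the same value.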
Suppose we want to minimize a function \( f(x) \) over the interval \([a, c]\). First compute two interior points \( d \) and \( b \) satisfying \( a < d < b < c \):
\[ d = a + \tau(c - a), \qquad b = a + (1 - \tau)(c - a) \]

The function values at these points are \( f(d) \) and \( f(b) \). Depending on these values, one of the subintervals can be discarded:
\[ \text{If } f(d) < f(b) \text{, then set } c = b \] \[ \text{If } f(d) > f(b) \text{, then set } a = d \]

Because \( f \) is unimodal, when \( f(d) < f(b) \) the minimum cannot lie in \([b, c]\), and when \( f(d) > f(b) \) it cannot lie in \([a, d]\). This process is repeated until the interval \([a, c]\) is sufficiently small, at which point the minimum is approximated by the midpoint of the interval.
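A single interval-reduction step can be sketched as follows (a minimal illustrative snippet; the name `golden_step` is an assumption, not from the text):

```python
import math

TAU = 1 - 2 / (1 + math.sqrt(5))  # tau = 1 - 1/phi, approximately 0.382

def golden_step(f, a, c):
    """Perform one golden-section reduction of the interval [a, c].

    Returns a shorter interval that still contains the minimum of the
    unimodal function f.
    """
    d = a + TAU * (c - a)        # left interior point
    b = a + (1 - TAU) * (c - a)  # right interior point, a < d < b < c
    if f(d) < f(b):
        return a, b   # the minimum cannot lie in [b, c]; discard it
    else:
        return d, c   # the minimum cannot lie in [a, d]; discard it

# Each call shrinks the interval to roughly 61.8% of its length:
a, c = golden_step(lambda x: (x - 2) ** 2 + 1, 0.0, 4.0)
```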
At each step, the method shrinks the interval to a fraction \( 1/\phi = \phi - 1 \approx 0.618 \) of its previous length, which guarantees convergence. One of the two interior points (and its function value) is always reused in the next iteration, so only one new function evaluation is needed per step.
The algorithm for the Golden Cut Method can be summarized as follows:

1. Choose the initial interval \([a, c]\) and a tolerance \( \varepsilon \).
2. Compute the interior points \( d = a + \tau(c - a) \) and \( b = a + (1 - \tau)(c - a) \), and evaluate \( f(d) \) and \( f(b) \).
3. If \( f(d) < f(b) \), set \( c = b \); otherwise set \( a = d \).
4. Repeat steps 2 and 3, reusing the retained interior point, until \( c - a < \varepsilon \).
5. Return the midpoint \( (a + c)/2 \) as the approximate minimizer.
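These steps can be collected into a complete routine. The following is a sketch in Python; the function name `golden_section_minimize` and its signature are assumptions for illustration:

```python
import math

def golden_section_minimize(f, a, c, tol=1e-6):
    """Minimize a unimodal function f on [a, c] by golden-section search.

    Only function evaluations are used; one interior point (and its
    value) is reused at every iteration.
    """
    tau = 1 - 2 / (1 + math.sqrt(5))   # tau = 1 - 1/phi, about 0.382
    d = a + tau * (c - a)              # left interior point
    b = a + (1 - tau) * (c - a)        # right interior point
    fd, fb = f(d), f(b)
    while c - a > tol:
        if fd < fb:
            # Minimum lies in [a, b]: the old d becomes the new right point.
            c, b, fb = b, d, fd
            d = a + tau * (c - a)
            fd = f(d)
        else:
            # Minimum lies in [d, c]: the old b becomes the new left point.
            a, d, fd = d, b, fb
            b = a + (1 - tau) * (c - a)
            fb = f(b)
    return (a + c) / 2  # midpoint of the final interval

x_min = golden_section_minimize(lambda x: (x - 2) ** 2 + 1, 0.0, 4.0)
```

The point-reuse works because \( (1 - \tau)^2 = \tau \): after discarding a subinterval, one of the old interior points lands exactly on a new interior point, so each loop iteration costs a single function evaluation.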
Consider the function \(f(x) = (x - 2)^2 + 1\) over the interval \([0, 4]\). The function is unimodal and has a minimum at \(x = 2\). Using the Golden Cut Method, we can find the minimum as follows:
The first iteration of the algorithm is shown below (values rounded to four decimals):

| Iteration | a | d | b | c | f(d) | f(b) |
|---|---|---|---|---|---|---|
| 1 | 0 | 1.5279 | 2.4721 | 4 | 1.2229 | 1.2229 |

Here \( f(d) = f(b) \) because \([0, 4]\) happens to be symmetric about the minimizer \( x = 2 \), so either subinterval may be discarded; taking \( a = d \) gives the new interval \([1.5279, 4]\), which still contains the minimum.
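The first-iteration numbers for this example can be reproduced with a few lines of plain Python (illustrative only):

```python
import math

tau = 1 - 2 / (1 + math.sqrt(5))   # about 0.381966

f = lambda x: (x - 2) ** 2 + 1     # example function, minimum at x = 2
a, c = 0.0, 4.0

d = a + tau * (c - a)              # about 1.5279
b = a + (1 - tau) * (c - a)        # about 2.4721
print(f"d = {d:.4f}, b = {b:.4f}")
print(f"f(d) = {f(d):.4f}, f(b) = {f(b):.4f}")  # both about 1.2229
```

Because the starting interval is symmetric about the minimizer, \( f(d) \) and \( f(b) \) coincide on this first step; from the second iteration onward the two values generally differ and the usual discard rule applies.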