# Fibonacci method


A class of methods of one-dimensional search for an extremum of a function by successively narrowing the interval of uncertainty. The sole restriction imposed on the function in question

$$f ( x) \ \ ( x \in ( a _ {0} , b _ {0} ) \subset \mathbf R ^ {1} )$$

is the requirement of strict unimodality on the given interval $( a _ {0} , b _ {0} )$.

In the successive narrowing process the values of $f ( x)$ are computed (or measured) at a number $n$ of test points that is bounded beforehand. As a result one obtains a sequence of narrowing intervals of uncertainty containing the required extremum:

$$( a _ {0} , b _ {0} ) \supset \dots \supset ( a _ {n} , b _ {n} ).$$

In order to narrow the interval of uncertainty for an arbitrary strictly-unimodal function, one needs at least two test values of it. In the Fibonacci method one chooses exactly two test points inside each current interval of uncertainty, placed symmetrically about its midpoint. One then discards the end portion of the interval adjacent to the test point with the worse value of $f ( x)$, together with that test point, obtaining $( a _ {i + 1 } , b _ {i + 1 } )$; inside the new interval a second test point is constructed symmetrically to the old one that remains. Hence for the sequence of interval lengths

$$\Delta _ {i} = \ b _ {i} - a _ {i}$$

one obtains the recurrence relation

$$\Delta _ {i} = \ \Delta _ {i + 1 } + \Delta _ {i + 2 } .$$

(It is also assumed above that the overlap condition $\Delta _ {1} < ( 2/3) \Delta _ {0}$ is satisfied.)

Solving this recurrence under the condition $\Delta _ {n + 1 } = \Delta _ {n + 2 }$ gives

$$\Delta _ {i} = \ \frac{F _ {n + 3 - i } }{F _ {n + 2 } } \Delta _ {0} ,\ \ i = 2, \dots, n,$$

where $F _ {k}$ are the Fibonacci numbers.
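The narrowing procedure described above can be sketched as follows (a minimal Python illustration for minimization; the function name and interface are chosen for this sketch and are not part of the source). The two test points stay symmetric about the midpoint of the current interval, so each step after the first two costs a single new evaluation of $f$:

```python
def fib(m):
    """First m Fibonacci numbers F_1, ..., F_m."""
    F = [1, 1]
    while len(F) < m:
        F.append(F[-1] + F[-2])
    return F

def fibonacci_search(f, a, b, n):
    """Minimize a strictly unimodal f on (a, b) using n test points (n >= 2)."""
    F = fib(n + 2)                           # F[k] = F_{k+1}
    # two test points, symmetric about the middle of the initial interval
    x1 = a + F[n - 1] / F[n + 1] * (b - a)   # at fraction F_n / F_{n+2}
    x2 = a + F[n] / F[n + 1] * (b - a)       # at fraction F_{n+1} / F_{n+2}
    f1, f2 = f(x1), f(x2)
    for _ in range(n - 2):
        if f1 < f2:                 # minimum lies in (a, x2): discard (x2, b)
            b, x2, f2 = x2, x1, f1
            x1 = a + b - x2         # symmetric counterpart of the kept point
            f1 = f(x1)
        else:                       # minimum lies in (x1, b): discard (a, x1)
            a, x1, f1 = x1, x2, f2
            x2 = a + b - x1
            f2 = f(x2)
    # one last narrowing step using the two values already in hand
    if f1 < f2:
        b = x2
    else:
        a = x1
    return (a + b) / 2
```

With $n$ test points the final interval has length about $2 \Delta _ {0} / F _ {n + 2 }$, and the midpoint of that interval is returned as the estimate of the extremum point.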

For the extremum point one finds $x _ { \mathop{\rm opt} } \approx ( a _ {n} + b _ {n} )/2$.

In the simplest version of the Fibonacci method (in which the test points and the test values of $f ( x)$ are assumed to be determined absolutely accurately), in order to narrow the original interval of uncertainty from $\Delta$ to $\epsilon$ one needs a number $n$ of test points satisfying the inequality $2 \Delta / \epsilon \leq F _ {n + 2 }$. When corrections for the accuracy with which the function values are determined are taken into account, this estimate becomes somewhat more complicated.

The limit

$$\tau = \ \lim\limits _ {n \rightarrow \infty } \ \frac{F _ {n} }{F _ {n + 1 } } = \ { \frac{\sqrt 5 - 1 }{2} } = \ 0.618 033 989 \dots$$

exists and gives the reason for introducing the golden section method — a version of the Fibonacci method such that for all $i$, $\Delta _ {i + 1 } = \tau \Delta _ {i}$, that is, the test points make a golden section of the current interval (cf. also Golden ratio). An advantage of this method consists in the fact that the number of test points does not have to be planned in advance.
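Under the same unimodality assumption, the golden section method may be sketched as follows (Python; minimization, with an illustrative stopping tolerance `eps` in place of a preplanned number of test points):

```python
import math

def golden_section_search(f, a, b, eps):
    """Minimize a strictly unimodal f on (a, b); stop when the interval is shorter than eps."""
    tau = (math.sqrt(5) - 1) / 2      # 0.618..., the golden section ratio
    x1 = b - tau * (b - a)            # the two points make a golden section
    x2 = a + tau * (b - a)            # of the current interval
    f1, f2 = f(x1), f(x2)
    while b - a > eps:
        if f1 < f2:                   # minimum lies in (a, x2)
            b, x2, f2 = x2, x1, f1
            x1 = b - tau * (b - a)    # kept point is reused; one new evaluation
            f1 = f(x1)
        else:                         # minimum lies in (x1, b)
            a, x1, f1 = x1, x2, f2
            x2 = a + tau * (b - a)
            f2 = f(x2)
    return (a + b) / 2
```

Each iteration shrinks the interval by the fixed factor $\tau \approx 0.618$ at the cost of one new evaluation of $f$, which is why no prior choice of the number of test points is needed.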

Modifications of the Fibonacci method have been developed for functions on non-compact domains, for shortening the calculations when $f ( x)$ takes equal values at the two test points, for increasing the stability of the calculation, etc.

The Fibonacci method is significantly more efficient than that of dichotomy (see Dichotomy method). But for optimizing differentiable functions other methods are preferable (see Descent, method of; Maximization and minimization of functions).

#### References

[1] J. Kiefer, "Sequential minimax search for a maximum", Proc. Amer. Math. Soc., 4 (1953), pp. 502–506

[2] D.J. Wilde, "Optimum seeking methods", Prentice-Hall (1964)

[3] F.P. Vasil'ev, "Computational methods for solving extremum problems", Moscow (1980) (In Russian)