The majority of stochastic optimization algorithms can be written in the general form $x_{t+1} = T_t(x_t, y_t)$, where $(x_t)$ is the sequence of points and parameters transformed by the algorithm, $T_t$ are the methods of the algorithm, and $y_t$ represents the randomness of the algorithm. We extend the results of papers [11] and [14] to provide new general conditions under which the algorithm finds a global minimum with probability one.
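To illustrate the general form $x_{t+1} = T_t(x_t, y_t)$, here is a minimal sketch of a simple accept-if-better random search, where the Gaussian perturbation plays the role of $y_t$ and the accept/reject rule plays the role of $T_t$. This is only an illustrative example; the function and parameter names (random_search, scale, steps) are not from the paper.

```python
import random

def random_search(f, x0, steps=1000, scale=0.1):
    """Illustrative iteration of the form x_{t+1} = T_t(x_t, y_t):
    y_t is a Gaussian perturbation, T_t keeps the better of the two points."""
    x, fx = x0, f(x0)
    for _ in range(steps):
        y = [random.gauss(0.0, scale) for _ in x]        # randomness y_t
        candidate = [xi + yi for xi, yi in zip(x, y)]    # proposed new point
        fc = f(candidate)
        if fc < fx:                                      # T_t: accept only improvements
            x, fx = candidate, fc
    return x, fx

# Example: minimize a quadratic on R^2
best_x, best_f = random_search(lambda v: sum(vi**2 for vi in v), [1.0, -2.0])
```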
Keywords:
stochastic optimization, global optimization, Lyapunov function, weak convergence of measures
Affiliation:
Wydział Matematyki i Informatyki, Instytut Matematyki