4.6. pyopus.optimizer.nm — Unconstrained Nelder-Mead simplex optimizer
Unconstrained Nelder-Mead optimizer (PyOPUS subsystem name: NMOPT)
A very popular unconstrained optimization algorithm, first published in [nm].
Unfortunately, no general convergence theory is available. There is even a counterexample showing how the algorithm can fail to converge to a stationary point. See [mck].
[nm] Nelder, J. A., Mead, R.: A simplex method for function minimization. Computer Journal, vol. 7, pp. 308-313, 1965.
[mck] McKinnon, K. I. M.: Convergence of the Nelder-Mead Simplex Method to a Nonstationary Point. SIAM Journal on Optimization, vol. 9, pp. 148-158, 1998.
class pyopus.optimizer.nm.NelderMead(function, debug=0, fstop=None, maxiter=None, reflect=1.0, expand=2.0, outerContract=0.5, innerContract=-0.5, shrink=0.5, reltol=1e-15, ftol=1e-15, xtol=1e-09, simplex=None, looseContraction=False)

    Nelder-Mead optimizer class.
    reflect, expand, outerContract, innerContract, and shrink are step size factors for the reflection, expansion, outer contraction, inner contraction, and shrink step, respectively.

    expand must be greater than 1. reflect must be greater than 0 and smaller than expand. outerContract must be between 0 and 1, while innerContract must be between -1 and 0. shrink must be between 0 and 1.
    reltol is the relative stopping tolerance. ftol and xtol are the absolute stopping tolerances for the cost function values at the simplex points and for the simplex side lengths, respectively. See the checkFtol() and checkXtol() methods.

    simplex is the initial simplex given by an (ndim+1) times ndim array where every row corresponds to one simplex point. If simplex is not given, an initial simplex is constructed around the initial point x0. See the buildSimplex() method for details.

    If looseContraction is True, the acceptance condition for contraction steps requires that the new point is not worse than the worst point. This is the behavior of the original algorithm. If this parameter is False (which is also the default), the new point is accepted only if it is better than the worst point.

    See the Optimizer class for more information.
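The step factors can be illustrated geometrically. The following NumPy sketch (not the PyOPUS implementation) shows how the reflection, expansion, and contraction trial points are obtained from the centroid of all simplex points except the worst one; note how the negative innerContract factor produces a point between the centroid and the worst point:

```python
import numpy as np

def trial_points(simplex, reflect=1.0, expand=2.0,
                 outerContract=0.5, innerContract=-0.5):
    # simplex rows are ordered by cost, worst point last
    worst = simplex[-1]
    # centroid of all points except the worst one
    xc = simplex[:-1].mean(axis=0)
    d = xc - worst
    return {
        'reflection':       xc + reflect * d,
        'expansion':        xc + expand * d,
        'outerContraction': xc + outerContract * d,
        'innerContraction': xc + innerContract * d,
    }

# a 2D simplex with the worst point last
pts = trial_points(np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]))
```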
    buildSimplex(x0, rel=0.05, abs=0.00025)

        Builds an initial simplex around the point given by a 1-dimensional array x0. rel and abs set the relative and absolute simplex size.

        The initial simplex has its first point positioned at x0. The i-th point among the remaining ndim points is obtained by moving along the i-th coordinate direction by rel times the corresponding component of x0, or by abs if that component is zero.
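The construction can be sketched in NumPy (a hypothetical re-implementation, assuming the rel/abs rule described above; `abs_` is renamed to avoid shadowing the builtin):

```python
import numpy as np

def build_simplex(x0, rel=0.05, abs_=0.00025):
    x0 = np.asarray(x0, dtype=float)
    ndim = x0.size
    # (ndim+1) rows, all starting at x0
    simplex = np.tile(x0, (ndim + 1, 1))
    for i in range(ndim):
        # move the (i+1)-th point along the i-th coordinate direction
        step = rel * x0[i] if x0[i] != 0.0 else abs_
        simplex[i + 1, i] += step
    return simplex

s = build_simplex([1.0, 0.0])
```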
    check()

        Checks the optimization algorithm's settings and raises an exception if something is wrong.
    checkFtol()

        Returns True if the cost function values of all points in the simplex are within the tolerance given by ftol and reltol of the lowest cost function value in the simplex.
    checkXtol()

        Returns True if the components of all points in the simplex are within the tolerance given by xtol and reltol of the corresponding components of the point with the lowest cost function value.
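The two stopping tests can be sketched as follows. This is a hypothetical interpretation of the tolerances; the exact formula used by PyOPUS may differ:

```python
import numpy as np

def check_ftol(f, reltol=1e-15, ftol=1e-15):
    # f holds the cost function values at all simplex points
    fbest = f.min()
    return bool((f - fbest).max() <= reltol * abs(fbest) + ftol)

def check_xtol(simplex, reltol=1e-15, xtol=1e-9):
    # assumes the simplex is ordered so the best point comes first
    best = simplex[0]
    dev = np.abs(simplex[1:] - best)
    return bool((dev <= reltol * np.abs(best) + xtol).all())

converged = check_ftol(np.array([5.0, 5.0 + 1e-16, 5.0 + 2e-16]))
```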
    orderSimplex()

        Reorders the points and the corresponding cost function values of the simplex so that the point with the lowest cost function value comes first.
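In NumPy terms this reordering amounts to a single argsort over the cost function values (a sketch, not the PyOPUS code):

```python
import numpy as np

def order_simplex(simplex, f):
    # sort simplex rows and cost values by ascending cost
    order = np.argsort(f)
    return simplex[order], f[order]

s, f = order_simplex(np.array([[0.0, 1.0], [1.0, 0.0]]),
                     np.array([10.0, 4.0]))
```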
    reset(x0)

        Puts the optimizer in its initial state and sets the initial point to be the 1-dimensional array x0. The length of the array becomes the dimension of the optimization problem (ndim member).

        The initial simplex is built around x0 by calling the buildSimplex() method with default values for the rel and abs arguments.

        If x0 is a 2-dimensional array of size (ndim+1) times ndim, it specifies the initial simplex.
    run()

        Runs the optimization algorithm.
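To illustrate how the steps above fit together, here is a compact, self-contained Nelder-Mead loop minimizing a quadratic. This is a didactic sketch, not the PyOPUS implementation, which additionally supports separate inner/outer contraction factors, loose contraction, reporting plugins, and stopping on maxiter/fstop:

```python
import numpy as np

def nelder_mead(f, x0, maxiter=1000, ftol=1e-12,
                reflect=1.0, expand=2.0, contract=0.5, shrink=0.5):
    ndim = len(x0)
    # initial simplex: x0 plus a unit step along each coordinate
    simplex = np.tile(np.asarray(x0, dtype=float), (ndim + 1, 1))
    for i in range(ndim):
        simplex[i + 1, i] += 1.0
    fvals = np.array([f(p) for p in simplex])
    for _ in range(maxiter):
        # order the simplex, best point first
        order = np.argsort(fvals)
        simplex, fvals = simplex[order], fvals[order]
        if fvals[-1] - fvals[0] <= ftol:
            break
        xc = simplex[:-1].mean(axis=0)          # centroid without worst
        xr = xc + reflect * (xc - simplex[-1])  # reflection
        fr = f(xr)
        if fr < fvals[0]:
            xe = xc + expand * (xc - simplex[-1])  # expansion
            fe = f(xe)
            xnew, fnew = (xe, fe) if fe < fr else (xr, fr)
        elif fr < fvals[-2]:
            xnew, fnew = xr, fr                 # accept reflection
        else:
            xcon = xc + contract * (xc - simplex[-1])  # contraction
            fcon = f(xcon)
            if fcon < fvals[-1]:
                xnew, fnew = xcon, fcon
            else:
                # shrink all points toward the best one
                simplex = simplex[0] + shrink * (simplex - simplex[0])
                fvals = np.array([f(p) for p in simplex])
                continue
        simplex[-1], fvals[-1] = xnew, fnew
    i = np.argmin(fvals)
    return simplex[i], fvals[i]

x, fx = nelder_mead(lambda p: (p[0] - 1.0)**2 + (p[1] + 2.0)**2, [0.0, 0.0])
```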
-
Example file nm.py in folder demo/optimizer/
# Optimize MGH unconstrained test suite with Nelder-Mead optimizer.
from pyopus.optimizer.nm import NelderMead
from pyopus.optimizer.base import Reporter
from pyopus.problems.mgh import *
from numpy import array, sqrt
# perf_counter is a precise, monotonic timer on all platforms
# (time.clock was removed in Python 3.8)
from time import perf_counter as timer
from sys import stdout
# Custom reporter that prints dots
class MyReporter(Reporter):
    def __init__(self, name, iterSpacing=1, concise=False, printImprovement=True):
        Reporter.__init__(self)
        self.name=name
        self.iterSpacing=iterSpacing
        self.concise=concise
        self.printImprovement=printImprovement
        self.fbest=None
        self.lastPrintout=None

    def reset(self):
        # Assign to the member (was a no-op local assignment)
        self.fbest=None

    def __call__(self, x, f, opt):
        report=False
        if self.fbest is None:
            self.lastPrintout=opt.niter
            self.fbest=f
            report=True
        elif self.fbest>f:
            # Track the best value even when improvements are not printed
            self.fbest=f
            if self.printImprovement:
                report=True
        if opt.niter-self.lastPrintout>=self.iterSpacing:
            report=True
        if report:
            if self.concise:
                stdout.write(".")
                stdout.flush()
            else:
                print("%30s (%2d) iter=%-6d f=%12.4e fbest=%12.4e" % (
                    self.name[:30], x.size, opt.niter, f, self.fbest
                ))
            self.lastPrintout=opt.niter
if __name__=='__main__':
    suite=[
        [ Rosenbrock() ],
        [ FreudensteinAndRoth() ],
        [ PowellBadlyScaled() ],
        [ BrownBadlyScaled() ],
        [ Beale() ],
        [ JennrichAndSampson() ],
        [ McKinnon() ],
        [ McKinnon(), array([[0.0, 0.0], [1.0, 1.0], [(1.0+sqrt(33.0))/8.0, (1.0-sqrt(33.0))/8.0]]) ],
        [ HelicalValley() ],
        [ Bard() ],
        [ Gaussian() ],
        [ Meyer() ],
        [ GulfResearchAndDevelopement() ],
        [ Box3D() ],
        [ PowellSingular() ],
        [ Wood() ],
        [ KowalikAndOsborne() ],
        [ BrownAndDennis() ],
        [ Quadratic(4) ],
        [ PenaltyI(4) ],
        [ PenaltyII(4) ],
        [ Osborne1() ],
        [ BrownAlmostLinear(5) ],
        [ BiggsEXP6() ],
        [ ExtendedRosenbrock(6) ],
        [ BrownAlmostLinear(7) ],
        [ Quadratic(8) ],
        [ ExtendedRosenbrock(8) ],
        [ VariablyDimensioned(8) ],
        [ ExtendedPowellSingular(8) ],
        [ Watson() ],
        [ ExtendedRosenbrock(10) ],
        [ PenaltyI(10) ],
        [ PenaltyII(10) ],
        [ Trigonometric() ],
        [ Osborne2() ],
        [ ExtendedPowellSingular(12) ],
        [ Quadratic(16) ],
        [ Quadratic(24) ],
    ]
    t1=timer()
    results=[]
    for probdef in suite:
        # Take problem function.
        prob=probdef[0]
        # Write a message.
        print("\nProcessing: "+prob.name+" ("+str(prob.n)+") ")
        # Create optimizer object.
        opt=NelderMead(prob.f, maxiter=100000, reltol=1e-16, ftol=1e-16, xtol=1e-9)
        # Install custom reporter plugin. Print a dot every 1000 evaluations.
        opt.installPlugin(MyReporter(prob.name, 1000, concise=True, printImprovement=False))
        if len(probdef)==1:
            # Default initial point
            opt.reset(prob.initial)
        else:
            # Problem has a custom initial simplex (for McKinnon)
            opt.reset(probdef[1])
        # Start timing, run, measure time.
        dt=timer()
        opt.run()
        dt=timer()-dt
        # Write number of function evaluations.
        print(" %d evaluations" % opt.niter)
        # Calculate initial and final gradient.
        gini=prob.g(prob.initial)
        gend=prob.g(opt.x)
        # Store results for this problem.
        result={
            'i': opt.niter,
            'x': opt.x,
            'f': opt.f,
            'gi': sqrt((gini**2).sum()),
            'ge': sqrt((gend**2).sum()),
            't': dt
        }
        results.append(result)

    # Print summary. Last column is iterations per second.
    print("\n")
    for i in range(0, len(suite)):
        prob=suite[i][0]
        print("%30s (%2d) : ni=%6d f=%16.8e gi=%11.3e ge=%11.3e it/s=%11.3e" % (
            prob.name[:30],
            prob.initial.size,
            results[i]['i'],
            results[i]['f'],
            results[i]['gi'],
            results[i]['ge'],
            results[i]['i']/results[i]['t']
        ))
    t2=timer()
    print("\ntime=%f" % (t2-t1))