4.9. pyopus.optimizer.psade — Box constrained parallel SADE global optimizer

Inheritance diagram of pyopus.optimizer.psade

Box constrained parallel SADE global optimizer (PyOPUS subsystem name: PSADEOPT)

SADE stands for Simulated Annealing with Differential Evolution.

A provably convergent (parallel) global optimization algorithm.

The algorithm was published in

Olenšek J., Tuma T., Puhan J., Buermen A.: A new asynchronous parallel global optimization method based on simulated annealing and differential evolution. Applied Soft Computing Journal, vol. 11, pp. 1481-1489, 2011.

class pyopus.optimizer.psade.MsgEvaluatePoint(x)
A message requesting the evaluation of a normalized point x.
class pyopus.optimizer.psade.MsgEvaluateGlobal(xip, xi1, delta1, delta2, R, w, px)

A message requesting the evaluation of a global search step.

xip is the parent normalized point, xi1 is the first of the two normalized points that competed for better T and R parameters. delta1 and delta2 are the two differential vectors used by the differential evolution operator. R is the range parameter used for generating the random step. w and px are the differential operator weight and the crossover probability.

class pyopus.optimizer.psade.MsgEvaluateLocal(xa, fa, delta)

A message requesting the evaluation of a local search step.

xa and fa are a normalized point and its corresponding cost function value. delta is the search direction.

class pyopus.optimizer.psade.MsgResult(x, f, annotations=None)

A message holding the result of the evaluation of a normalized point x.

f is the result of the cost function evaluation while annotations holds the corresponding annotations.

class pyopus.optimizer.psade.MsgGlobalResult(x, f, annotations=None)

A message holding the results of a global step.

x is the evaluated normalized point and f is the corresponding cost function value. annotations (if not None) holds the annotations corresponding to the evaluated point.

class pyopus.optimizer.psade.MsgLocalResult(x, f, annotations=None)
A message holding the results of a local step. x is a tuple holding the evaluated normalized points and f is a tuple of corresponding cost function values. annotations (if not None) is a tuple holding the annotations corresponding to the evaluated points. All three tuples must be of the same length.
class pyopus.optimizer.psade.ParallelSADE(function, xlo, xhi, debug=0, fstop=None, maxiter=None, vm=None, maxSlaves=None, minSlaves=0, populationSize=20, pLocal=0.01, Tmin=1e-10, Rmin=1e-10, Rmax=1.0, wmin=0.5, wmax=1.5, pxmin=0.1, pxmax=0.9)

Parallel SADE global optimizer class

If debug is above 1, the debug option of the EventDrivenMS class is set to 1.

The lower and upper bound (xlo and xhi) must both be finite.

populationSize is the number of individuals (points) in the population.

pLocal is the probability of performing a local step.

Tmin is the minimal temperature of the annealers.

Rmin and Rmax are the lower and upper bound on the range parameter of the annealers.

wmin, wmax, pxmin, and pxmax are the lower and upper bounds for the differential evolution’s weight and crossover probability parameters.

If a virtual machine object is passed in the vm argument the algorithm runs in parallel on the virtual machine represented by vm. The algorithm can also run locally (vm set to None).

The algorithm is capable of handling notification messages received when a slave fails or a new host joins the virtual machine. In the first case the work that was performed by the failed slave is reassigned. In the second case new slaves are spawned. The latter is performed by the handler in the EventDrivenMS class. Work is assigned to new slaves when the algorithm detects that they are idle.

All operations are performed on normalized points ([0,1] interval corresponding to the range defined by the bounds).

See the BoxConstrainedOptimizer and the EventDrivenMS classes for more information.
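Because all operations are performed on normalized points, every candidate is mapped from the [xlo, xhi] box to the unit cube and back. The mapping can be sketched as follows (the helper names here are illustrative, not part of the PyOPUS API):

```python
import numpy as np

def normalize(x, xlo, xhi):
    """Map a point from the [xlo, xhi] box to the normalized [0, 1] cube."""
    return (np.asarray(x) - xlo) / (xhi - xlo)

def denormalize(y, xlo, xhi):
    """Map a normalized point from the [0, 1] cube back to [xlo, xhi]."""
    return xlo + np.asarray(y) * (xhi - xlo)

# Both bounds must be finite for this mapping to be well-defined.
xlo = np.array([-10.0, 0.0])
xhi = np.array([10.0, 5.0])
x = np.array([0.0, 2.5])
y = normalize(x, xlo, xhi)          # [0.5, 0.5]
x_back = denormalize(y, xlo, xhi)   # recovers x
```

This is also why xlo and xhi must both be finite: an unbounded box cannot be mapped onto the unit cube.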

accept(xt, ft, ip, itR)

Decides if a normalized point xt should be accepted. ft is the corresponding cost function value. ip is the index of the best point in the population. itR is the index of the point (annealer) whose temperature is used in the Metropolis criterion.

Returns a tuple (accepted, bestReplaced) where accepted is True if the point should be accepted and bestReplaced is True if accepting xt will replace the best point in the population.
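The Metropolis criterion underlying this decision can be sketched as follows (a simplified illustration; the actual accept() also tracks whether the best point in the population is replaced):

```python
import math
import random

def metropolis_accept(f_trial, f_current, T):
    """Accept an improving trial unconditionally; accept a worse trial
    with probability exp(-(f_trial - f_current) / T), where T is the
    temperature of the annealer used for the decision."""
    if f_trial <= f_current:
        return True
    return random.random() < math.exp(-(f_trial - f_current) / T)
```

At high temperatures the criterion accepts most worsening steps (global exploration); as T approaches Tmin it degenerates into pure descent.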

check()
Checks the optimization algorithm’s settings and raises an exception if something is wrong.
contest(ip)
Performs a contest between two random points in the population for better values of the temperature and range parameter. The first point’s index is ip. The second point is chosen randomly.
fillHandlerTable()
Fills the handler table of the EventDrivenMS object with the handlers that take care of the PSADE’s messages.
generateTrial(xip, xi1, delta1, delta2, R, w, px)

Generates a normalized trial point for the global search step.

A mutated normalized point is generated as

xi1 + delta1*w*random1 + delta2*w*random2

where random1 and random2 are two random numbers from the [0,1] interval.

A component-wise crossover of the mutated point and xip is performed with the crossover probability px. Then every component of the resulting point is changed by a random value generated from the Cauchy probability distribution with parameter R.

Finally the bounds are enforced by selecting a random value between xip and the violated bound for every component of the generated point that violates a bound.

Returns a normalized point.
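The trial generation described above can be sketched as follows (an illustrative reimplementation on normalized points; the exact sampling details in PSADE's generateTrial() may differ):

```python
import numpy as np

def generate_trial(xip, xi1, delta1, delta2, R, w, px, rng):
    """Sketch of the PSADE global-step trial generation.
    All points are normalized (components in [0, 1])."""
    # Differential mutation with two random weights from [0, 1].
    mutated = xi1 + delta1 * w * rng.random() + delta2 * w * rng.random()
    # Component-wise crossover with the parent xip (probability px).
    cross = rng.random(xip.size) < px
    trial = np.where(cross, mutated, xip)
    # Random step drawn from a Cauchy distribution with parameter R
    # (inverse-CDF sampling: R * tan(pi * (u - 0.5))).
    trial = trial + R * np.tan(np.pi * (rng.random(xip.size) - 0.5))
    # Enforce bounds: replace every violating component with a random
    # value between the parent component and the violated bound (0 or 1).
    low = trial < 0.0
    high = trial > 1.0
    trial[low] = xip[low] * rng.random(low.sum())
    trial[high] = xip[high] + (1.0 - xip[high]) * rng.random(high.sum())
    return trial
```

The heavy tails of the Cauchy distribution occasionally produce large jumps, which is what gives the global step its exploratory character.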

generateTrialPrerequisites()

Generates all the prerequisites for the generation of a trial point.

Chooses 5 random normalized points (xi1..xi5) from the population.

Returns a tuple comprising the normalized point xi1, and two differential vectors xi2-xi3, xi4-xi5.

handleEvaluateGlobal(source, message)
Handles MsgEvaluateGlobal messages received by slaves when the algorithm is running. Performs a global step and returns a MsgGlobalResult message with the evaluated trial point.
handleEvaluateLocal(source, message)
Handles MsgEvaluateLocal messages received by slaves when the algorithm is running. Performs a local step and returns a MsgLocalResult message with the evaluated points. If the local step fails, the points, cost function values, and annotations are all None.
handleEvaluatePoint(source, message)
Handles MsgEvaluatePoint messages received by slaves during the evaluation of the initial population. Evaluates the received point and sends back a MsgResult message.
handleGlobalResult(source, message)

Handles a MsgGlobalResult message with the result of a global step. Takes care of accepting the point into the population and decides whether a local step should be taken. If a local step is needed, it responds with a MsgEvaluateLocal message sent to the slave that performed the global search.

If no local step is taken marks the slave as idle so that new work can be assigned to it by the MsgIdle message handler.

handleIdle(source, message)

Handles a MsgIdle message.

Distributes work to the slaves.

In the beginning the initial population is evaluated in parallel (if possible) by sending MsgEvaluatePoint messages to slaves.

After the initial population has been evaluated, it distributes MsgEvaluateGlobal messages to slaves.

handleLocalResult(source, message)
Handles a MsgLocalResult message with the results of a local step. Takes care of accepting the point into the population and marks the slave that sent the message as idle so that new work can be assigned to it by the MsgIdle message handler.
handleResult(source, message)

Handles a MsgResult message and takes care of the evaluation result. These messages are used only during the evaluation of the initial population.

Marks the slave that sent the message as idle so that new work can be assigned to it by the MsgIdle message handler.

handleSlaveStarted(source, message)
Handles the MsgSlaveStarted message received by the master from a freshly started slave. Does nothing special, just calls the inherited handler from the EventDrivenMS class.
handleStartup(source, message)

Handles a MsgStartup message received by master/slave when the event loop is entered.

Performs some initializations on the master and prevents the plugins in the slave from printing messages.

handleTaskExit(source, message)

Handles a notification message reporting that a slave has failed (is down).

Stores the work that was performed at the time the slave failed so it can later be reassigned to some other slave.

initialPopulation()
Constructs and returns the initial population.
initialTempRange()
Chooses the values of the range and temperature parameters for the annealers.
localStep(xa, fa, d)

Performs a local step starting at normalized point xa with the corresponding cost function value fa in direction d.

The local step is performed with the help of a quadratic model. Two or three additional points are evaluated.

The return value is a tuple of three tuples. The first tuple lists the evaluated normalized points, the second one lists the corresponding cost function values, and the third one the corresponding annotations. All three tuples must have the same length.

Returns None if something goes wrong (like a failure to move a point within bounds).
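The quadratic model used by the local step can be illustrated in one dimension: given the cost function value at three step lengths along the search direction, the minimum of the interpolating parabola has a closed form (a generic sketch of the technique, not PSADE's exact code):

```python
def quadratic_minimum(a0, f0, a1, f1, a2, f2):
    """Return the step length minimizing the parabola through
    (a0, f0), (a1, f1), (a2, f2), or None when the fit is not
    convex (no interior minimum exists)."""
    # Fit f(a) ~ f0 + d1*(a - a0) + c2*(a - a0)*(a - a1)
    # using Newton divided differences.
    d1 = (f1 - f0) / (a1 - a0)
    d2 = (f2 - f1) / (a2 - a1)
    c2 = (d2 - d1) / (a2 - a0)
    if c2 <= 0.0:
        return None  # concave or degenerate model: no minimum
    # Setting the derivative d1 + c2*(2a - a0 - a1) to zero
    # gives the vertex of the parabola.
    return 0.5 * (a0 + a1 - d1 / c2)
```

For f(a) = (a - 2)^2 sampled at a = 0, 1, 3 this returns exactly 2.0, the true minimizer.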

reset(x0=None)

Puts the optimizer in its initial state.

If the initial point x0 is a 1-dimensional array or list, it is ignored. It must, however, match the dimension of the bounds.

If it is a 2-dimensional array or list the first index is the initial population member index while the second index is the component index. The initial population must lie within bounds xlo and xhi and have populationSize members.

If x0 is None the initial population is generated automatically. See the initialPopulation() method.

run()
Runs the optimization algorithm.
selectControlParameters()

Selects the point (annealer) whose range, temperature, differential operator weight and crossover probability will be used in the global step.

Returns the index of the point.
