An Interesting Math Problem

To illustrate how pyswarm is best utilized, we’ll start with a complete example, which is then explained step by step:

from pyswarm import pso

def banana(x):
    x1 = x[0]
    x2 = x[1]
    return x1**4 - 2*x2*x1**2 + x2**2 + x1**2 - 2*x1 + 5

def con(x):
    x1 = x[0]
    x2 = x[1]
    return [-(x1 + 0.25)**2 + 0.75*x2]

lb = [-3, -1]
ub = [2, 6]

xopt, fopt = pso(banana, lb, ub, f_ieqcons=con)

# Optimum should be around x=[0.5, 0.76] with banana(x)=4.5 and con(x)=0

Now let’s walk through each section of code. We start with any necessary imports; in this case, that is just the optimizer function pso :

from pyswarm import pso

Then we define the objective function to be minimized, which should have the signature myfunction(x, *args, **kwargs) . In other words, it takes a 1-d array-like object as its first argument, followed by any (optional) positional arguments and (again, optional) keyword arguments. The function should return a single scalar value to be minimized. In this example, that is the banana function:

def banana(x):
    x1 = x[0]
    x2 = x[1]
    return x1**4 - 2*x2*x1**2 + x2**2 + x1**2 - 2*x1 + 5
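As a quick sanity check (plain Python, not part of the pyswarm API), we can evaluate the objective at the optimum quoted in the example’s closing comment:

```python
# The objective from the example, restated as a standalone function.
def banana(x):
    x1 = x[0]
    x2 = x[1]
    return x1**4 - 2*x2*x1**2 + x2**2 + x1**2 - 2*x1 + 5

# Evaluating at the optimum quoted in the example's comment:
value = banana([0.5, 0.76])
print(round(value, 4))  # close to the quoted optimum of about 4.5
```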

Optimizing with constraints is optional, but we include one here to illustrate how it might be done. The con function has the same call syntax as the objective, but returns an array of values (even if it contains only a single value):

def con(x):
    x1 = x[0]
    x2 = x[1]
    return [-(x1 + 0.25)**2 + 0.75*x2]
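Again as a standalone check in plain Python, the constraint value at the quoted optimum is very close to zero, i.e. the constraint is essentially active there, just on the feasible (>= 0) side of the boundary:

```python
# The constraint from the example, restated as a standalone function.
def con(x):
    x1 = x[0]
    x2 = x[1]
    return [-(x1 + 0.25)**2 + 0.75*x2]

# At the quoted optimum x=[0.5, 0.76] the constraint value is about 0.0075,
# which is why the example's comment says con(x)=0 there.
print(con([0.5, 0.76]))
</ ```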

Rather than specify a starting point for the algorithm, we define the limits of the input variables that the optimizer is allowed to search within. For the sake of clarity, we have defined them prior to calling the optimizer in the objects lb and ub , which stand for lower-bound and upper-bound, respectively:

lb = [-3, -1]
ub = [2, 6]

That is really all that needs to be defined to run pso , so we then call the optimizer:

xopt, fopt = pso(banana, lb, ub, f_ieqcons=con)

Using the kwarg f_ieqcons tells the routine that there’s a single constraint function that returns an array object.

Once complete, pso returns two objects: 1) the optimal input values and 2) the optimal objective value.

The full call syntax for pso is highly customizable and is defined as follows:

pso(func, lb, ub, ieqcons=[], f_ieqcons=None, args=(), kwargs={},
    swarmsize=100, omega=0.5, phip=0.5, phig=0.5, maxiter=100,
    minstep=1e-8, minfunc=1e-8, debug=False)

where the minimum required input arguments are:

func : function
    The function to be minimized
lb : array
    The lower bounds of the design variable(s)
ub : array
    The upper bounds of the design variable(s)

and the optional input keyword-arguments are defined as:

ieqcons : list
    A list of functions of length n such that ieqcons[j](x,*args) >= 0.0 in a successfully optimized problem (Default: empty list, [])
f_ieqcons : function
    Returns a 1-D array in which each element must be greater than or equal to 0.0 in a successfully optimized problem. If f_ieqcons is specified, ieqcons is ignored (Default: None)
args : tuple
    Additional arguments passed to the objective and constraint functions (Default: empty tuple, ())
kwargs : dict
    Additional keyword arguments passed to the objective and constraint functions (Default: empty dict, {})
swarmsize : int
    The number of particles in the swarm (Default: 100)
omega : scalar
    Particle velocity scaling factor (Default: 0.5)
phip : scalar
    Scaling factor to search away from the particle’s best known position (Default: 0.5)
phig : scalar
    Scaling factor to search away from the swarm’s best known position (Default: 0.5)
maxiter : int
    The maximum number of iterations for the swarm to search (Default: 100)
minstep : scalar
    The minimum stepsize of the swarm’s best position before the search terminates (Default: 1e-8)
minfunc : scalar
    The minimum change in the swarm’s best objective value before the search terminates (Default: 1e-8)
debug : boolean
    If True, progress statements are displayed every iteration (Default: False)
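To make the relationship between ieqcons and f_ieqcons concrete, here is a small sketch (plain Python, not pyswarm’s actual internals) of the feasibility test both styles imply: a point is feasible when every constraint value is greater than or equal to zero.

```python
# Feasibility under the f_ieqcons style: one function returning an array.
def feasible_f(x, f_ieqcons):
    return all(v >= 0.0 for v in f_ieqcons(x))

# Feasibility under the ieqcons style: a list of scalar-valued functions.
def feasible_list(x, ieqcons):
    return all(f(x) >= 0.0 for f in ieqcons)

# The example's constraint, written in both styles:
con_array = lambda x: [-(x[0] + 0.25)**2 + 0.75*x[1]]
con_scalar = lambda x: -(x[0] + 0.25)**2 + 0.75*x[1]

print(feasible_f([0.5, 0.76], con_array))        # True
print(feasible_list([0.5, 0.76], [con_scalar]))  # True
```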

We could have written the constraint function to return a scalar value instead of an array-like object, like:

def con(x):
    x1 = x[0]
    x2 = x[1]
    return -(x1 + 0.25)**2 + 0.75*x2

In that case, we would use the keyword argument ieqcons , which takes a list of function handles, like:

xopt, fopt = pso(banana, lb, ub, ieqcons=[con])

The parameters args and kwargs are used to pass any additional parameters to the objective and constraint functions and are not changed during the optimization process.
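For example, a hypothetical objective with a tunable weight w (our own illustration, not part of pyswarm) would receive it through args on every evaluation:

```python
# Hypothetical parameterized objective: the extra argument w arrives via
# the args tuple, unchanged, on every call during the optimization.
def weighted_banana(x, w):
    x1 = x[0]
    x2 = x[1]
    return w * (x1**4 - 2*x2*x1**2 + x2**2 + x1**2 - 2*x1 + 5)

# With pyswarm this would be invoked as:
#   xopt, fopt = pso(weighted_banana, lb, ub, args=(2.0,))
print(weighted_banana([0.5, 0.76], 2.0))  # twice the unweighted value
```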

The parameters omega , phip and phig control the particle velocity update: omega scales a particle’s previous velocity, while phip and phig scale how strongly a particle is drawn toward its own best known position and the best known position of the whole swarm, respectively. These can take any scalar value, but values between 0 and 1 tend to work best.
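To see what these three factors do, here is a sketch of the canonical particle-velocity update they feed into (illustrative only; pyswarm’s internal implementation may differ in its details):

```python
import random

# Canonical PSO velocity update: omega damps the previous velocity,
# phip pulls toward the particle's own best position p_best, and
# phig pulls toward the swarm's best position g_best.
def update_velocity(v, x, p_best, g_best, omega=0.5, phip=0.5, phig=0.5):
    rp = random.random()  # fresh random factors for each update
    rg = random.random()
    return [omega * vi + phip * rp * (pi - xi) + phig * rg * (gi - xi)
            for vi, xi, pi, gi in zip(v, x, p_best, g_best)]

# With phip = phig = 0 and omega = 1, the velocity is simply carried over:
print(update_velocity([1.0, -2.0], [0, 0], [0, 0], [0, 0],
                      omega=1.0, phip=0.0, phig=0.0))  # [1.0, -2.0]
```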