Optimization problem

The general nonlinear optimization problem [1] can be formulated as follows: find a minimum of the objective function ϕ(x), where x lies in the intersection of the N-dimensional search space

Optimization formula 1.png

and the admissible region ℱ ⊆ ℝ^N defined by a set of equality and/or inequality constraints on x. Since an equality constraint g_s(x) = 0 can be replaced by the two inequalities g_s(x) ≤ 0 and −g_s(x) ≤ 0, the admissible region can be defined without loss of generality as

Optimization formula 2.png
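As an illustration of this formulation, the sketch below represents the admissible region by a set of inequality constraints g_s(x) ≤ 0 and shows how an equality constraint is expressed as two inequalities. The class and method names are hypothetical and do not correspond to BioUML's actual API.

<pre>
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Hypothetical helper type for illustration only; not part of the BioUML API.
public class AdmissibleRegion
{
    private final List<Function<double[], Double>> constraints = new ArrayList<>();

    // Add an inequality constraint g(x) <= 0.
    public void addInequality(Function<double[], Double> g)
    {
        constraints.add(g);
    }

    // An equality constraint g(x) = 0 is replaced by two inequalities:
    // g(x) <= 0 and -g(x) <= 0.
    public void addEquality(Function<double[], Double> g)
    {
        constraints.add(g);
        constraints.add(x -> -g.apply(x));
    }

    // A point x is admissible if every constraint is satisfied.
    public boolean contains(double[] x)
    {
        return constraints.stream().allMatch(g -> g.apply(x) <= 0.0);
    }
}
</pre>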

In order to obtain a solution lying inside ℱ, we minimize the penalty function

Optimization formula 3.png
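The exact penalty function used here is given by the formula above; as one common choice (an assumption for illustration, not necessarily the form implemented in BioUML), a quadratic penalty adds the weighted squared constraint violations to the objective:

<pre>
import java.util.List;
import java.util.function.Function;

// Illustrative quadratic penalty: phi_p(x) = phi(x) + mu * sum_s max(0, g_s(x))^2.
// This is a common textbook form, shown only as an example of the penalty approach.
public final class QuadraticPenalty
{
    public static double value(Function<double[], Double> objective,
                               List<Function<double[], Double>> constraints,
                               double mu,
                               double[] x)
    {
        double penalty = 0.0;
        for( Function<double[], Double> g : constraints )
        {
            double violation = Math.max(0.0, g.apply(x)); // only violated constraints contribute
            penalty += violation * violation;
        }
        return objective.apply(x) + mu * penalty;
    }
}
</pre>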

The problem can be solved by different optimization methods. The following methods are implemented in the BioUML software:

  • stochastic ranking evolution strategy (SRES) [1];
  • cellular genetic algorithm MOCell [2];
  • particle swarm optimization (PSO) [3];
  • deterministic method of global optimization glbSolve [4];
  • adaptive simulated annealing (ASA) [5].


The table below shows the generic scheme of the optimization process for these methods. SRES, MOCell, PSO and glbSolve run a predefined number of iterations N_it, considering a sequence of sets (populations) P_i, i = 0, …, N_it − 1, of potential solutions (guesses). For the first three methods the population size s ∈ ℕ+ is fixed, whereas in glbSolve the initial population P_0 consists of a single guess and the size s_{k+1} of the population P_{k+1} is determined during iteration k = 0, …, N_it − 1. The method ASA considers sequentially generated guesses x_k ∈ Ω, k ∈ ℕ+, and stops when the distance between x_k and x_{k+1}, defined as the Euclidean norm

Optimization formula 4.png

becomes less than a predefined accuracy ε.
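The ASA-style stopping rule described above can be sketched as follows; the generation of the next guess is left abstract, and the names are illustrative rather than taken from the BioUML code base.

<pre>
import java.util.function.UnaryOperator;

// Sketch of the stopping criterion: iterate until the Euclidean distance
// between successive guesses falls below the predefined accuracy eps.
public final class SequentialSearch
{
    public static double[] run(double[] x0, UnaryOperator<double[]> nextGuess, double eps)
    {
        double[] current = x0;
        while( true )
        {
            double[] next = nextGuess.apply(current); // x_{k+1} generated from x_k
            double sum = 0.0;
            for( int i = 0; i < current.length; i++ )
            {
                double d = next[i] - current[i];
                sum += d * d;
            }
            if( Math.sqrt(sum) < eps ) // ||x_{k+1} - x_k|| < eps
                return next;
            current = next;
        }
    }
}
</pre>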

References

  1. Runarsson T.P., Yao X. Stochastic ranking for constrained evolutionary optimization. IEEE Transactions on Evolutionary Computation. 2000. 4(3):284–294.
  2. Nebro A.J., Durillo J.J., Luna F., Dorronsoro B., Alba E. MOCell: A cellular genetic algorithm for multiobjective optimization. International Journal of Intelligent Systems. 2009. 24(7):726–746.
  3. Sierra M.R., Coello C.A. Improving PSO-Based Multi-objective Optimization Using Crowding, Mutation and ε-Dominance. Evolutionary Multi-Criterion Optimization. Lecture Notes in Computer Science. 2005. 3410:505–519.
  4. Björkman M., Holmström K. Global Optimization Using the DIRECT Algorithm in Matlab. Advanced Modeling and Optimization. 1999. 1(2):17–37.
  5. Ingber L. Adaptive simulated annealing (ASA): Lessons learned. Control and Cybernetics. 1996. 25(1):33–54.