XTREME TECHNOLOGY

BASIC PRINCIPLES

BASIC OPTIMIZATION BASED ON GENETIC ALGORITHMS ACCELERATED BY META-MODELS

Xtreme is the perfect tool to optimize your product or process. Xtreme contains optimization algorithms that combine the most advanced genetic algorithms with artificial neural networks. Moreover, the strength of OPTIMAL COMPUTING technology comes from the unique and optimized coupling between the genetic algorithm and the neural network.

Our optimization technology is based on genetic algorithms coupled with artificial neural networks to drastically accelerate (by a factor of 100) the convergence to the optimal solution. The software is best suited for large-scale optimization challenges usually found in the aeronautics and automotive industries or the energy sector.

Here are the basic steps of the algorithm:

1. A design of experiments (DOE) is generated. This means that a number of design variable vectors are chosen following a random process or any other well-known DOE method. These design variable vectors are computed using the accurate simulation program. Once the analysis is performed, the design variable values and the simulated performance are stored in a database (DB).

2. The approximate model (i.e. a neural network) is trained using the design variable values and performance values stored in the DB.

3. Then the INNER OPTIMIZATION loop is run, using a genetic algorithm as the optimization technique and the approximate model (i.e. the neural network) as the simulation program. This process finishes with a proposed approximate optimal solution (a vector of design variables). This is "only" an approximate solution because the solution evaluation is performed using the approximate model and not the accurate simulation program.

4. Then the proposed optimal solution is simulated using the accurate simulation program. This simulation usually takes anywhere from several seconds to a few minutes, or even a few hours. Once this accurate simulation is finished, the design variable values and the computed performance are stored in the DB.

5. Then another iteration of the OUTER OPTIMIZATION loop is started. This process restarts with the training of the neural network. This leads to a more accurate approximate model, as the DB now contains one more sample from the previous design loop.

This process mimics a learning mechanism. Indeed, the neural network iteratively "learns" the true shape of the accurate relation between the design variables and the simulated performance.
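The following is a minimal sketch of the surrogate-assisted loop described in the steps above, not the actual Xtreme implementation. The expensive simulation is stood in for by a cheap Python function `simulate`, the approximate model is scikit-learn's MLPRegressor, and the genetic algorithm is a deliberately simple real-coded GA; all of these are illustrative assumptions.

```python
# Sketch of steps 1-5: DOE, surrogate training, inner GA on the surrogate,
# accurate re-evaluation of the proposed optimum, and growth of the DB.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
DIM, BOUNDS = 5, (-5.0, 5.0)

def simulate(x):
    """Stand-in for the accurate (expensive) simulation program."""
    return float(np.sum(x ** 2))

def inner_ga(model, pop_size=40, generations=60):
    """Inner optimization loop: a simple GA run entirely on the surrogate."""
    pop = rng.uniform(*BOUNDS, size=(pop_size, DIM))
    for _ in range(generations):
        fitness = model.predict(pop)                 # cheap surrogate evaluations
        parents = pop[np.argsort(fitness)[: pop_size // 2]]
        # Uniform crossover followed by Gaussian mutation.
        mates = parents[rng.permutation(len(parents))]
        mask = rng.random(parents.shape) < 0.5
        children = np.where(mask, parents, mates)
        children += rng.normal(0.0, 0.1, children.shape)
        pop = np.clip(np.vstack([parents, children]), *BOUNDS)
    return pop[np.argmin(model.predict(pop))]

# Step 1: design of experiments, evaluated with the accurate simulator (the DB).
X = rng.uniform(*BOUNDS, size=(20, DIM))
y = np.array([simulate(x) for x in X])

# Steps 2-5: outer loop -- retrain the surrogate, optimize on it, verify accurately.
for it in range(15):
    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000).fit(X, y)
    candidate = inner_ga(surrogate)                  # approximate optimal solution
    f_true = simulate(candidate)                     # one accurate evaluation per loop
    X, y = np.vstack([X, candidate]), np.append(y, f_true)   # grow the DB
    print(f"outer iteration {it:2d}  best accurate value so far: {y.min():.4f}")
```

Note that only one accurate simulation is performed per outer iteration; all the remaining evaluations happen on the surrogate, which is what produces the claimed reduction in expensive function evaluations.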

Figure: Optimization With Approximation

EXAMPLES FOR SINGLE-OBJECTIVE OPTIMIZATION

QUADRATIC 50D

50 design variables - 100 times faster than a standard Genetic Algorithm

This example shows the performance of the FAST Genetic Algorithm on a simple quadratic test function with 50 design variables. A standard genetic algorithm (green curve on the left graph) requires much more than 10 000 function evaluations to approach the true optimum, while the FAST GA reaches full convergence in less than 100 function evaluations.

Figures: Quadratic 50D FAST GA; Quadratic 50D FAST GA versus GA
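The exact quadratic used in this benchmark is not given in the text; a sketch of a plausible 50-variable convex quadratic (a sphere function with its optimum at the origin) is shown below, together with a small wrapper that counts accurate evaluations, since the number of function evaluations is the metric compared in the graphs. Both the function and the wrapper are assumptions for illustration only.

```python
import numpy as np

DIM = 50

def quadratic(x):
    """Assumed convex quadratic test function: optimum f = 0 at x = 0."""
    return float(np.sum(np.asarray(x, dtype=float) ** 2))

class CountedFunction:
    """Wraps an objective and counts how many accurate evaluations were spent."""
    def __init__(self, fn):
        self.fn, self.calls = fn, 0
    def __call__(self, x):
        self.calls += 1
        return self.fn(x)

objective = CountedFunction(quadratic)
x0 = np.random.default_rng(1).uniform(-5.0, 5.0, DIM)
print(objective(x0), objective.calls)   # one accurate evaluation recorded
```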

KEANE

Hard multi-modal constrained problem

This example shows the performance of the FAST Genetic Algorithm on a highly multi-modal test function (Keane's function). The optimization algorithm is run 5 times with different starting points and designs of experiments. The convergence history shows that Xtreme reaches the true optimum in all cases in less than 100 function evaluations.

Figures: Keane multi-modal test function; Keane FAST GA convergence history
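Assuming the benchmark refers to Keane's bump function, the standard multi-modal constrained test problem, a sketch of its usual formulation is given below: a maximization over 0 < x_i < 10 subject to prod(x_i) > 0.75 and sum(x_i) < 7.5 n. The dimension and sample point are arbitrary choices for illustration.

```python
import numpy as np

def keane_bump(x):
    """Keane's bump function (to be maximized over 0 < x_i < 10)."""
    x = np.asarray(x, dtype=float)
    num = np.abs(np.sum(np.cos(x) ** 4) - 2.0 * np.prod(np.cos(x) ** 2))
    den = np.sqrt(np.sum(np.arange(1, len(x) + 1) * x ** 2))
    return num / den

def feasible(x):
    """Standard constraints: product > 0.75 and sum < 7.5 * n."""
    x = np.asarray(x, dtype=float)
    return np.prod(x) > 0.75 and np.sum(x) < 7.5 * len(x)

x = np.full(10, 3.0)          # an arbitrary feasible point in 10 dimensions
print(feasible(x), keane_bump(x))
```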

More examples

More examples of hard test functions tested with Xtreme are available on demand. Please register on our web site and request more information, or simply send us an e-mail at info@optimalcomputing.be. You can also download our evaluation license to test it yourself for 15 days.