To experiment with the configuration settings of Dakota's
moga optimizer, it helps to have a "sandbox" optimization case whose results are obtained quickly and visualized easily.
Attached is such a demonstration case: a two-objective, three-parameter optimization with external evaluation (in Python), configured and extensively commented. Bash and Python scripts post-process the simulation output and display what is happening: the performance of the individuals in each generation and the size of the corresponding populations.
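For context, an external-evaluation driver for such a case is typically a small Python script that reads the parameters file Dakota writes, evaluates the objectives, and writes a results file back. The sketch below assumes Dakota's default "standard" parameters format (one `value label` pair per line) and hypothetical variable labels `x1`–`x3` and objective tags `f1`/`f2`; the two objective functions are placeholders, not the ones in the attached case.

```python
import sys

def read_params(path):
    """Parse a Dakota params file in the default 'standard' format:
    each line holds a value followed by its label."""
    values = {}
    with open(path) as f:
        for line in f:
            tokens = line.split()
            if len(tokens) >= 2:
                try:
                    values[tokens[1]] = float(tokens[0])
                except ValueError:
                    pass  # skip non-numeric header tokens
    return values

def objectives(x1, x2, x3):
    """Two competing placeholder objectives: distance to the origin
    versus distance to the point (1, 1, 1)."""
    f1 = x1**2 + x2**2 + x3**2
    f2 = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2
    return f1, f2

def main(params_path, results_path):
    p = read_params(params_path)
    # Labels x1..x3 are assumptions; they must match the Dakota input file.
    f1, f2 = objectives(p["x1"], p["x2"], p["x3"])
    with open(results_path, "w") as f:
        f.write(f"{f1:.10e} f1\n{f2:.10e} f2\n")

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])
```

Dakota invokes this script once per evaluation via the `analysis_driver` keyword, passing the parameters and results file names as arguments.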
Varying the main settings of the mutation and crossover operators, and watching the effect on the performance and population-size graphs, is a great way to learn how they work.
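For reference, the operator settings to vary live in the `method` block of the Dakota input file. The fragment below is a representative sketch, not the attached case's actual input; the keyword names (`crossover_type shuffle_random`, `mutation_type replace_uniform`, etc.) and values are illustrative, so check them against the Dakota reference manual for your version.

```
method
  moga
    population_size = 50
    crossover_type shuffle_random
      crossover_rate = 0.8       # fraction of the population crossed over
    mutation_type replace_uniform
      mutation_rate = 0.1        # probability of mutating a design variable
```

Raising `mutation_rate` tends to spread individuals more widely across the parameter space, while the crossover settings control how aggressively parents are recombined; both effects show up directly in the per-generation graphs produced by the post-processing scripts.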