Reproducibility of MOOP by setting the seed of random number generators
Hello everyone,
I have a question concerning multi-objective optimization with the jMetal framework. I used "NSGAIITSPRunner.java" (from the package org.uma.jmetal.runner.multiobjective in jMetal 5.1), and to make the run reproducible I added the line
JMetalRandom.getInstance().setSeed(10L);
at the beginning of the main-method. So the main-method looks like this:
public static void main(String[] args) throws IOException {
    JMetalRandom.getInstance().setSeed(10L);

    PermutationProblem<PermutationSolution<Integer>> problem;
    Algorithm<List<PermutationSolution<Integer>>> algorithm;
    CrossoverOperator<PermutationSolution<Integer>> crossover;
    MutationOperator<PermutationSolution<Integer>> mutation;
    SelectionOperator<List<PermutationSolution<Integer>>, PermutationSolution<Integer>> selection;

    problem = new MultiobjectiveTSP("/tspInstances/kroA100.tsp", "/tspInstances/kroB100.tsp");
    [...]
Furthermore, I used the data files (kroA100.tsp and kroB100.tsp) that are provided with the framework. I then ran the program once, and my FUN.tsv started with the line
“93328.0 132453.0”.
After I ran the program another time the FUN.tsv started with the line
“145839.0 101293.0”.
However, I was wondering why this happens. I thought that setting the seed to the same value would make both runs identical, since the random values generated during selection/mutation/… would be the same (and this worked for me for non-permutation problems like ZDT1). Can you help me clarify my misunderstanding or my error?
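For illustration, reproducibility breaks as soon as any component draws from its own `java.util.Random` instead of the shared, seeded generator. The following is a minimal standalone sketch (plain Java, not jMetal code; the helper names are made up for this example). Note that `Collections.shuffle(list)` with no explicit `Random` argument uses a hidden internal generator, so a permutation built that way varies between runs even when a separate "global" generator was seeded:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

public class SeedPitfall {

    // Reproducible: the permutation is built from an explicitly seeded generator
    // (the role JMetalRandom is supposed to play for all jMetal components).
    static List<Integer> shuffledWithSharedRng(int n, Random rng) {
        List<Integer> p = new ArrayList<>();
        for (int i = 0; i < n; i++) p.add(i);
        Collections.shuffle(p, rng); // explicit RNG -> deterministic for a fixed seed
        return p;
    }

    // Not reproducible: the no-argument overload uses its own internal Random,
    // so the result differs between runs regardless of any seed set elsewhere.
    static List<Integer> shuffledWithOwnRng(int n) {
        List<Integer> p = new ArrayList<>();
        for (int i = 0; i < n; i++) p.add(i);
        Collections.shuffle(p); // hidden RNG -> non-deterministic across runs
        return p;
    }

    public static void main(String[] args) {
        // Two "runs" that use the same seed produce identical permutations.
        List<Integer> run1 = shuffledWithSharedRng(8, new Random(10L));
        List<Integer> run2 = shuffledWithSharedRng(8, new Random(10L));
        System.out.println("seeded runs equal: " + run1.equals(run2));
    }
}
```

So if any part of the solution-creation or operator code path builds permutations the second way, seeding the shared generator cannot make the whole run reproducible.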
Kind regards
Issue Analytics
- Created 7 years ago
- Comments: 41 (33 by maintainers)
Top GitHub Comments
It would be nice to have a general test that checks all the algorithms and passes only when reproducibility is confirmed for every one of them (failing with a message that names the non-reproducible algorithm). I opened a new issue for that.
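A check of that kind could be sketched as below. This is a hypothetical standalone skeleton, not the actual test from the issue: `runAlgorithm` here is a stand-in that just draws numbers from a seeded generator, where a real jMetal test would call `JMetalRandom.getInstance().setSeed(seed)`, build the algorithm, run it, and collect the objective values of the resulting front:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class ReproducibilityCheck {

    // Hypothetical stand-in for "seed the generator, run the algorithm,
    // and return the objective values of the obtained front".
    static List<Double> runAlgorithm(long seed) {
        Random rng = new Random(seed); // in jMetal: JMetalRandom.getInstance().setSeed(seed)
        List<Double> objectives = new ArrayList<>();
        for (int i = 0; i < 5; i++) objectives.add(rng.nextDouble());
        return objectives;
    }

    public static void main(String[] args) {
        List<Double> first = runAlgorithm(10L);
        List<Double> second = runAlgorithm(10L);
        // The test passes only if two runs with the same seed are identical.
        System.out.println("reproducible: " + first.equals(second));
    }
}
```

Looping this check over every registered algorithm, and reporting the algorithm name on mismatch, would give the general test described above.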
You are right. Let me check why this happens.