## Description
**TL;DR**: `GA()` and `DE()` never move away from the initial point: they declare whatever initial point they are given to be the optimum and report convergence, even though it is nowhere near the actual optimum.
I'm new to this, so this entire issue might be a misunderstanding on my part, but I can't get the algorithms to work even with default settings; I can't even get started with the most basic examples. Maybe the defaults could be adjusted somehow?
### Basic example

Try to minimize `f(x) = x^2`:
```julia
julia> Evolutionary.optimize(x->x[1]^2, [90.], GA())

 * Status: success

 * Candidate solution
    Minimizer:  [90.0]
    Minimum:    8100.0
    Iterations: 12

 * Found with
    Algorithm: GA[P=50,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0001 (vs limit Inf)
    Iterations:    12
    f(x) calls:    650
```
```julia
julia> Evolutionary.optimize(x->x[1]^2, [90.], DE())

 * Status: success

 * Candidate solution
    Minimizer:  [90.0]
    Minimum:    8100.0
    Iterations: 12

 * Found with
    Algorithm: DE/random/1/binxvr

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-10

 * Work counters
    Seconds run:   0.0006 (vs limit Inf)
    Iterations:    12
    f(x) calls:    600
```
No, `90^2 = 8100` is most definitely not the minimum of `x^2`.
I can fiddle with the optimizer's settings, but it still doesn't move from the initial point:
```julia
julia> Evolutionary.optimize(x->x[1]^2, [90.], DE(populationSize=1000)) |> Evolutionary.minimizer
1-element Vector{Float64}:
 90.0

julia> Evolutionary.optimize(x->x[1]^2, [90.], DE(populationSize=1000, n=2)) |> Evolutionary.minimizer
1-element Vector{Float64}:
 90.0

julia> Evolutionary.optimize(x->x[1]^2, [90.], DE(n=2)) |> Evolutionary.minimizer
1-element Vector{Float64}:
 90.0
```
All of these runs also report convergence; I've omitted the full output for brevity.
I eventually got the genetic algorithm to work after specifying `mutation=gaussian(), crossover=uniformbin()` (the defaults are apparently no-ops, but having no mutation and no crossover seems to defeat the purpose of having a genetic algorithm?):
```julia
julia> Evolutionary.optimize(x->x[1]^2, [90.], GA(mutation=gaussian(), crossover=uniformbin()))

 * Status: success

 * Candidate solution
    Minimizer:  [0.0006461356743545348]
    Minimum:    4.1749130967358945e-7
    Iterations: 267

 * Found with
    Algorithm: GA[P=50,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0033 (vs limit Inf)
    Iterations:    267
    f(x) calls:    13400
```
However, I couldn't get differential evolution to work, even for this simple function. Below are examples taken from the docs that don't seem to work either.
### `GA` example
This is taken from https://wildart.github.io/Evolutionary.jl/dev/tutorial/#Obtaining-results.
```julia
julia> Evolutionary.optimize(x->-sum(x), BitVector(zeros(3)), Evolutionary.GA())

 * Status: success

 * Candidate solution
    Minimizer:  [false, false, false]
    Minimum:    0
    Iterations: 11

 * Found with
    Algorithm: GA[P=50,x=0.8,μ=0.1,ɛ=0]

 * Convergence measures
    |f(x) - f(x')| = 0.0 ≤ 1.0e-12

 * Work counters
    Seconds run:   0.0066 (vs limit Inf)
    Iterations:    11
    f(x) calls:    600
```
So apparently, the minimum is `f([0,0,0]) = 0`. However, I can get a lower value: `f([1,1,1]) = -3`. In fact, `GA` seems to accept any initial value as the solution:
```julia
julia> Evolutionary.optimize(x->-sum(x), BitVector([0,0,1]), Evolutionary.GA()) |> Evolutionary.minimizer
3-element BitVector:
 0
 0
 1

julia> Evolutionary.optimize(x->-sum(x), BitVector([0,1,0]), Evolutionary.GA()) |> Evolutionary.minimizer
3-element BitVector:
 0
 1
 0

julia> Evolutionary.optimize(x->-sum(x), BitVector([0,1,1]), Evolutionary.GA()) |> Evolutionary.minimizer
3-element BitVector:
 0
 1
 1
```
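To be sure the objective itself isn't the problem, here's a sanity check in plain Julia (no Evolutionary.jl involved) showing that `[1,1,1]` strictly beats every "minimizer" reported above:

```julia
# The tutorial's objective: maximize the number of set bits by minimizing -sum(x)
f(x) = -sum(x)

f(BitVector([0, 0, 0]))  # 0, the value GA reports as the minimum
f(BitVector([1, 1, 1]))  # -3, strictly lower
```

So the true minimizer over 3-bit vectors is `[1,1,1]`, not whatever point the optimizer was seeded with.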
### `DE` algorithm
I took the target function from this page: https://wildart.github.io/Evolutionary.jl/dev/constraints/.
```julia
julia> f(x) = (x[1]+2x[2]-7)^2 + (2x[1]+x[2]-5)^2  # Booth function
```
Then I use the genetic algorithm as shown on that page:
```julia
ga = GA(populationSize=100, selection=uniformranking(3),
        mutation=gaussian(), crossover=uniformbin())
Evolutionary.optimize(f, [0., 0.], ga) |> Evolutionary.minimizer
```
If I run this multiple times (`GA` seems to be randomized, so I wanted to draw more samples), I get about `[1, 3]` on average. So far so good.
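That matches the Booth function's known global minimum, which is at `(1, 3)` with value `0`; a quick check in plain Julia:

```julia
# Booth function, as on the constraints page of the docs
f(x) = (x[1] + 2x[2] - 7)^2 + (2x[1] + x[2] - 5)^2

f([1.0, 3.0])  # 0.0, the global minimum
f([0.0, 0.0])  # 74.0, so the default starting point is far from optimal
```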
Now try the default `DE()` with the same function and the same starting point `[0., 0.]`:
```julia
julia> Evolutionary.optimize(f, [0., 0.], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 0.0
 0.0

julia> Evolutionary.optimize(f, [0., 0.], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 0.0
 0.0

julia> Evolutionary.optimize(f, [0., 0.], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 0.0
 0.0
```
Change the starting point:
```julia
julia> Evolutionary.optimize(f, [0., 100.], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
   0.0
 100.0

julia> Evolutionary.optimize(f, [90., 5.], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 90.0
  5.0

julia> Evolutionary.optimize(f, [90., 5.56375], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 90.0
  5.56375

julia> Evolutionary.optimize(f, [π, 5.56375], DE()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 3.141592653589793
 5.56375
```
`DE` doesn't seem to care and says that the starting point is the optimum, whatever the starting point is.
### `GA` again

Now try the same function as above with the default `GA`:
```julia
julia> Evolutionary.optimize(f, [90., 5.], GA()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 90.0
  5.0

julia> Evolutionary.optimize(f, [90., π], GA()) |> Evolutionary.minimizer
2-element Vector{Float64}:
 90.0
  3.141592653589793
```
Same as the default `DE`: it thinks that the initial point is the optimum, whatever the initial point is.