Optimization accuracy is not returned for non-adaptive case #15

@hkayabilisim

Description

When I use the non-adaptive case (see example below), the program does not display any measure of the "goodness" of the calculated optimum point. In the example below, rosenbrock_2d has its global optimum at [1, 1], whereas the program returns [0, 0]; yet the output gives no indication of the optimization error. For instance, the Euclidean distance between the true and estimated optimum points could be reported.

$ python src/main.py --numSamples 100 --numVariables 2 --function rosenbrock_2d 

Args:  Namespace(numSamples=100, numVariables=2, function='rosenbrock_2d', min=None, max=None, x0=None, randomInit=False, basisFunction='Cosine', legendreDegree=7, adaptive=False, numClosestPoints=100, epsilon=0.1, clip=0.9, numberOfRuns=1)
is_adaptive: False
x1_min:  -1.9963089700142151
0.16399216651916504 seconds
hdmr_opt status: [ message:  
 success: True
     fun: [[ 1.000e+00]]
       x: [0.0, 0.0]
    nfev: 100
    njev: 0
    nhev: 0]
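As a sketch of the requested metric, the error could be computed as the Euclidean distance between the known optimum and the point the optimizer returned. This is only an illustration using the values from the output above; the function name `optimization_error` and the idea of printing it alongside the result are assumptions, not part of the current code.

```python
import math

def optimization_error(x_true, x_est):
    """Euclidean distance between the true and estimated optimum points."""
    return math.dist(x_true, x_est)

# For rosenbrock_2d: true optimum [1, 1], returned point [0, 0]
err = optimization_error([1.0, 1.0], [0.0, 0.0])
print(f"optimization error: {err:.6f}")  # sqrt(2) ≈ 1.414214
```

This would only apply to benchmark functions whose true optimum is known; for user-supplied functions the metric could simply be omitted.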

Metadata

Assignees: no one assigned
Labels: enhancement (New feature or request)
Type: none
Projects: none
Milestone: none