
optimize.shgo gives unexpected TypeError

See original GitHub issue

My issue is about optimize.shgo. It gives an unexpected TypeError.

When running the following

import scipy.optimize
from scipy.optimize import rosen, rosen_der, rosen_hess

bounds = [(0, 1.6), (0, 1.6), (0, 1.4), (0, 1.4), (0, 1.4)]
result = scipy.optimize.shgo(rosen, bounds, options={'jac': rosen_der, 'hess': rosen_hess})

I get

TypeError: _minimize_slsqp() got multiple values for argument 'jac'

I believe that jac is correctly specified here (see this documentation). Did I make a mistake or is there a bug here?

Scipy/Numpy/Python version information:

SciPy 1.6.0, NumPy 1.20.0, Python 3.8.5

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 8 (5 by maintainers)

Top GitHub Comments

1 reaction
mdhaber commented, Nov 25, 2022

@Stefan-Endres I believe this was a duplicate of gh-12963 (different from above). This appears to have been fixed by gh-17140.

0 reactions
Stefan-Endres commented, Aug 5, 2021

There is a sizable basin of attraction around the expected position of the “interesting” minimum, but there is also another “uninteresting” minimum (potentially smaller, i.e. the global minimum; but it seems there are settings to explore all candidates for local minima using shgo?).

Yes, by default shgo will explore all these minima. Once it has found the local minimum in a basin of attraction, it will not search locally in that basin again (a theoretical advantage: it does not waste additional local searches in every subdomain that is a convex basin of attraction). If you keep iterating, it will explore all the local minima. Refining the complex globally guarantees that shgo will keep exploring local minima until all are found (iff the number of local minima is finite and the minima are strictly convex).
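As an illustration of this behavior (my example, not from the thread), Himmelblau’s function has four local minima, all with value 0, and shgo reports every local minimum found so far in the `xl` attribute of the result. More sampling points and iterations let it discover more of them:

```python
from scipy.optimize import shgo

def himmelblau(x):
    # Classic test function with four local minima, all with f = 0.
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

bounds = [(-6, 6), (-6, 6)]
# n=256 (a power of two suits Sobol sampling) and iters=3 give the
# global phase enough exploration to map out the separate basins.
res = shgo(himmelblau, bounds, n=256, iters=3, sampling_method='sobol')
print(res.x)     # best minimum found
print(res.xl)    # all local minima found so far (one row per minimum)
print(res.funl)  # objective values at those minima
```

The exact number of rows in `res.xl` depends on how much sampling has been done, which is the iterate-until-found guarantee described above.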

The fact that the exact shape of the basin of attraction isn’t known makes it difficult to get attracted to the minimum of interest without stepping outside it. I managed in low-dimensional (2- and 3-dimensional) cases by choosing appropriate bounds and/or setting an appropriate initial step size by hand, but this is a less than ideal solution. (The Hessian does not provide a good scale/initial step size because it starts in a non-convex region.)

This is also one of the reasons shgo was developed to supply bounds/constraints for every local basin of attraction it detects: it constructs and passes the local constraints of that basin (based on the geometry known from the global sampling data in the current iteration) and keeps the local solver inside that basin of attraction (e.g. in solution thermodynamics this is important to find all phases in equilibrium). However, this of course only works in the limit where enough global exploration has been performed (by iterating with more sampling points).

(I am looking to solve problems with 3-10 parameters.)

This is perfect for shgo.

This actually made me curious about what exactly the f_min option does. Would it be useful to me?

Yes, when you supply f_min the global algorithm stops once a solution is found within f_tol of f_min. But if you would like to explore all minima with an objective function value around -0.001 or possibly lower, it would be better not to supply the f_min option.
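A minimal sketch of that stopping criterion (my example; `f_min` and `f_tol` are passed through the `options` dict per the shgo documentation). Since the Rosenbrock minimum value is known to be 0, supplying it lets the search terminate as soon as a sufficiently good candidate appears:

```python
from scipy.optimize import rosen, shgo

bounds = [(0, 1.6), (0, 1.6)]
# Stop early once a candidate is found within f_tol of the known
# minimum value f_min = 0, instead of exhausting the iteration budget.
res = shgo(rosen, bounds, options={'f_min': 0.0, 'f_tol': 1e-6})
print(res.x, res.fun)
```

This trades completeness for speed: as the comment above notes, once the criterion triggers, other local minima may remain unexplored.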

Anyway, thanks for the help. And sorry for boring you if you read all this 😉. You seem really busy, but if you do happen to have time to chat more I would be very interested. I’m a theoretical physicist who just happened to stumble on a problem that requires a tough optimization, so if I could talk to an expert like you I would be very grateful.

I would be very happy to talk more about your problem and help you further if needed. Unfortunately, I do not think GitHub is the appropriate forum to discuss use cases, but please feel free to contact me by e-mail (work or personal, either is fine). Alternatively, we can post on the scipy-user mailing list (which I believe is also appropriate), especially if you think the discussion could have value staying in the public domain.


Top Results From Across the Web

Possible bug in SciPy shgo (Unexpected TypeError)
When running the following from scipy.optimize import rosen, rosen_der, rosen_hess bounds = ...

shgo is not correctly passing jac to minimizer #12963 - GitHub
Is this a bug in shgo or am I doing something wrong? Please advise if there is a ... optimize.shgo gives unexpected TypeError...

scipy.optimize.shgo — SciPy v1.9.3 Manual
Finds the global minimum of a function using SHG optimization. SHGO stands for “simplicial homology global optimization”. Parameters: func : callable.

scipy/optimize/_shgo.py - Fossies
As a special service "Fossies" has tried to format the requested source page into HTML format using (guessed) Python source code syntax highlighting...

SciPy 1.10.0 Release Notes
The new scipy.ndimage.value_indices function provides a time-efficient method to search for the ... #14533: optimize.shgo gives unexpected TypeError.
