Optimize raises "ValueError: `x0` violates bound constraints" for parameters that are within bounds
See also: my related Stack Overflow question (still unanswered).
Hi, I have been running into a recurring issue for a while where the scipy optimize function throws a ValueError during fitting, claiming that x0 violates the bound constraints, despite it being valid and within bounds at the time optimize is called. It has been suggested to me that this error may be raised because x0 is changed during the calculation of the Jacobian, but I am not certain about the behind-the-scenes workings of optimize.
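To illustrate the suspicion (this is only a sketch of the general finite-difference problem, not scipy's actual approx_derivative code, which attempts to keep steps within the bounds): a naive forward-difference step taken from a parameter sitting exactly on its upper bound lands outside the feasible box.
import numpy as np

ub = np.array([20.0])                  # upper bound on k from the example below
x_at_bound = ub.copy()                 # parameter pushed onto its upper bound
# a typical relative finite-difference step size
step = np.sqrt(np.finfo(float).eps) * np.maximum(1.0, np.abs(x_at_bound))
print(x_at_bound + step > ub)          # [ True]: the step leaves the bounds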
My current workaround is to go in and toggle the bounds a bit for the k parameter, but this has to be done manually each time the code crashes until the full dataset is imported, and it likely introduces variations and inconsistencies in the fits that I would like to avoid. It seems that in many cases the “ideal” fit for the data involves a k value higher than the bounds allow (since I am fitting a logistic function to data that sometimes includes sharp stair-steps, which pushes toward an infinitely high k value), but I am forced to cap k to keep convergence times reasonable. I believe that the optimize function might be trying to update the parameter value to one outside the bounds.
The error appears to be OS or environment specific: other Stack Overflow users did not encounter it when running the reproduction example below, whereas it crashes with the error shown for me 100% of the time. I am running 64-bit Windows 10 Pro (Version 10.0.18363 Build 18363) and Python 3.6.6.
Reproducing code example:
import numpy as np
import scipy as sp
from scipy.special import expit, logit
import scipy.optimize
def f(x,x0,g,c,k):
y = c*expit(k*10.*(x-x0)) + g*(1.-c)
return y
# x0 g c k
p0 = np.array([8.841357069490852e-01, 4.492363462957287e-19, 5.547073496706608e-01, 7.435378446218519e+00])
bounds = np.array([[-1.,1.], [0.,1.], [0.,1.], [0.,20.]])
x = np.array([1.0, 1.0, 1.0, 1.0, 1.0, 0.8911796599834791, 1.0, 1.0, 1.0, 0.33232919909076103, 1.0])
y = np.array([0.999, 0.999, 0.999, 0.999, 0.999, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001])
s = np.array([0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9])
print([pval >= b[0] and pval <= b[1] for pval,b in zip(p0,bounds)])
fit,cov = sp.optimize.curve_fit(f,x,y,p0=p0,sigma=s,bounds=([b[0] for b in bounds],[b[1] for b in bounds]),method='dogbox',tr_solver='exact')
print(fit)
print(cov)
Error message:
c:\Users\user\Documents\LogRegProj\bin>python optimize_error.py
[True, True, True, True]
Traceback (most recent call last):
File "optimize_error.py", line 19, in <module>
fit,cov = sp.optimize.curve_fit(f,x,y,p0=p0,sigma=s,bounds=([b[0] for b in bounds],[b[1] for b in bounds]),method='dogbox',tr_solver='exact')
File "C:\Users\user\AppData\Local\Programs\Python\Python36\lib\site-packages\scipy\optimize\minpack.py", line 775, in curve_fit
**kwargs)
File "C:\Users\user\AppData\Local\Programs\Python\Python36\lib\site-packages\scipy\optimize\_lsq\least_squares.py", line 928, in least_squares
tr_solver, tr_options, verbose)
File "C:\Users\user\AppData\Local\Programs\Python\Python36\lib\site-packages\scipy\optimize\_lsq\dogbox.py", line 310, in dogbox
J = jac(x, f)
File "C:\Users\user\AppData\Local\Programs\Python\Python36\lib\site-packages\scipy\optimize\_lsq\least_squares.py", line 875, in jac_wrapped
kwargs=kwargs, sparsity=jac_sparsity)
File "C:\Users\user\AppData\Local\Programs\Python\Python36\lib\site-packages\scipy\optimize\_numdiff.py", line 362, in approx_derivative
raise ValueError("`x0` violates bound constraints.")
ValueError: `x0` violates bound constraints.
Scipy/Numpy/Python version information:
scipy: 1.4.1 numpy: 1.18.1 sys.version_info(major=3, minor=6, micro=6, releaselevel='final', serial=0)
Thanks in advance, as this error has been driving me in circles for months.
After upgrading to 1.5.0, I’m experiencing similar behavior with the SLSQP minimizer. When I revert back to 1.4.1 the issue disappears.
Version info: Python 3.7.7, NumPy 1.18.5, scipy: the various versions mentioned above, Microsoft Windows 10 Enterprise 10.0.18362 Build 18362.
With scipy 1.5.1 and Python 3.8.3 on Linux, I am experiencing a similar issue with the SLSQP minimizer as well.
For example, in this case, my initial solution is:
[-10.8623 -22.7164 11.3582 -22.7164]
The bounds are: Bounds(array([-260.3 , -260.3 , -123.159, -104.635]), array([ 12.35, -12.35, 12.35, -12.35]), keep_feasible=True)
I don’t get the error with scipy 1.4.1
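For reference, a minimal sketch of that SLSQP setup; the quadratic objective is only a placeholder (the commenter's actual function is not shown), so this may or may not reproduce the error:
import numpy as np
from scipy.optimize import minimize, Bounds

x0 = np.array([-10.8623, -22.7164, 11.3582, -22.7164])
bounds = Bounds(np.array([-260.3, -260.3, -123.159, -104.635]),
                np.array([12.35, -12.35, 12.35, -12.35]),
                keep_feasible=True)

# x0 lies strictly inside the stated bounds
res = minimize(lambda v: np.sum(v**2), x0, method='SLSQP', bounds=bounds)
print(res.status, res.message)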