
Cythonize scalar function root finders


@person142 responded in #8354 that his preference was for a Cython optimize API.

This came up before: #7242.

As you can see I was against the idea. Handling all reasonable halting conditions for a scalar solver is already a PITA, and I think the problem gets much worse for a vectorized scalar solver. IMO it is better to provide a Cython API to a scalar solver and let users handle looping over it.
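The "scalar solver plus user-managed looping" approach argued for above can be sketched in plain Python. This is a hypothetical toy bisection solver, not SciPy's implementation; the point is only that vectorization lives in the caller's loop, not in the solver. With a Cython API the same loop could run at C speed.

```python
def bisect(f, a, b, args=(), xtol=1e-12, maxiter=100):
    """Toy scalar bisection solver (illustrative only, not SciPy code).

    Assumes f(a) and f(b) bracket a root (opposite signs).
    """
    fa = f(a, *args)
    if fa == 0.0:
        return a
    for _ in range(maxiter):
        m = 0.5 * (a + b)
        fm = f(m, *args)
        if fm == 0.0 or (b - a) < xtol:
            return m
        # keep the half-interval that still brackets the sign change
        if (fa < 0) != (fm < 0):
            b = m
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

def f(x, c):
    return x * x - c

# The caller, not the solver, handles the "vectorized" case by looping:
roots = [bisect(f, 0.0, 10.0, args=(c,)) for c in (1.0, 4.0, 9.0)]
```

The solver itself only ever has to reason about one set of halting conditions for one scalar problem; that is the simplification being argued for.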

#8357 vectorized only the Newton-family methods. #8431 lets users run a “tight loop” in C over Brent, Ridder, and the other root finders.
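The “tight loop” pattern from #8431 might look roughly like the sketch below. This is a hypothetical illustration (the struct `my_args`, the callback `f`, and `solve_many` are made up for this example, and it requires compiling with Cython); the idea is that both the callback and the loop are C-level, so no Python objects are touched per iteration.

```cython
from scipy.optimize.cython_optimize cimport brentq

ctypedef struct my_args:
    double c

cdef double f(double x, void *args):
    # C-level callback: cast the opaque pointer back to our parameter struct
    return x * x - (<my_args *> args).c

def solve_many(double[:] cs, double[:] out):
    cdef Py_ssize_t i
    cdef my_args args
    for i in range(cs.shape[0]):
        args.c = cs[i]
        # tight loop: each call stays in C, no Python overhead per root
        out[i] = brentq(f, 0.0, 10.0, &args, 1e-12, 1e-12, 100, NULL)
```

Because nothing in the loop body needs the GIL, a loop like this is also a candidate for `prange`, which is the point made about `zeros_struct` and `zeros_array` below.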

We decided it was okay to split up Newton:

“Is it okay to split up Newton, Halley’s, and secant into different calls in cython_optimize?”

I am personally pro this (wish it had been done that way in optimize), but maybe others will feel more strongly that the APIs should match.

We discussed which callbacks to support, and the fact that cython_optimize should be pure C so it can release the GIL, in this comment.

This commit has links to annotated HTML showing that zeros_struct and zeros_array are pure C, with no Python, so they can release the GIL and be called from Cython prange.

The Cython optimize API in #8431 was discussed in this SciPy-Dev post, and in this one too, where I specifically asked about callback signatures.

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 21 (21 by maintainers)

Top GitHub Comments

rgommers commented, Nov 8, 2018 (2 reactions)

@mikofski did you mean to close this issue?

After reading through the whole thing again, my summary is:

It’s still hard to weigh the pros and cons of each approach, given the multiple use cases. I think we should try to write a summary document together, because my head hurts …

charris commented, Sep 7, 2018 (2 reactions)

I would definitely use Cython brentq, the other scalar root-finders might be lower priority.

If I were doing this today, I would only implement brentq and bisect. Back when I wrote these (in 2002?), I just threw everything in, just because.
