
Add SPSA optimization method

See original GitHub issue

Feature details

The Simultaneous Perturbation Stochastic Approximation (SPSA) method is a fast optimisation technique that requires far fewer measurements of the objective function than finite-difference methods.

If the number of terms being optimised is p, then the finite-difference method takes 2p measurements of the objective function at each iteration (to form one gradient approximation), while SPSA takes only two measurements.

It is also naturally suited for noisy measurements. Thus, it will be useful when simulating noisy systems.
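To make the evaluation-count difference concrete, here is a minimal plain-Python sketch (not PennyLane's implementation; the function names and parameters are illustrative) comparing a central finite-difference gradient, which costs 2p evaluations, with the simultaneous-perturbation estimate, which costs 2 evaluations regardless of p:

```python
import random

def finite_difference_grad(f, theta, eps=1e-3):
    """Central finite differences: 2 * len(theta) evaluations of f."""
    grad = []
    for i in range(len(theta)):
        plus = list(theta); plus[i] += eps
        minus = list(theta); minus[i] -= eps
        grad.append((f(plus) - f(minus)) / (2 * eps))
    return grad

def spsa_grad(f, theta, c=0.1, rng=None):
    """Simultaneous perturbation: always 2 evaluations of f,
    no matter how many parameters are being optimised."""
    rng = rng or random.Random(0)
    # Rademacher (+/-1) perturbation direction, one draw per parameter
    delta = [rng.choice((-1.0, 1.0)) for _ in theta]
    plus = [t + c * d for t, d in zip(theta, delta)]
    minus = [t - c * d for t, d in zip(theta, delta)]
    diff = (f(plus) - f(minus)) / (2 * c)
    # divide by delta_i; since delta_i is +/-1, 1/delta_i == delta_i
    return [diff * d for d in delta]

if __name__ == "__main__":
    calls = {"n": 0}
    def cost(x):
        calls["n"] += 1
        return sum(v * v for v in x)

    theta = [0.5] * 5          # p = 5 parameters
    calls["n"] = 0
    finite_difference_grad(cost, theta)
    print("finite-difference evaluations:", calls["n"])  # 2p = 10

    calls["n"] = 0
    spsa_grad(cost, theta)
    print("SPSA evaluations:", calls["n"])               # always 2
```

Because only the sign pattern of the perturbation changes between iterations, the per-iteration cost of SPSA is constant in the number of parameters, which is what makes it attractive for large or expensive-to-evaluate circuits.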

The theory (and a reference implementation) of SPSA is described at the SPSA page linked under Additional information below. SPSA is also already implemented in other libraries, for example Qiskit's qiskit.algorithms.optimizers.SPSA.

Implementation and acceptance criteria

1. Implementation

There exists a qml.SPSAOptimizer optimizer that:

2. Testing

Test the SPSAOptimizer class in the same way that the GradientDescentOptimizer and QNGOptimizer classes are tested.

3. Documentation

The documentation of the optimizer should follow the same pattern as the documentation of the QNGOptimizer.

The docstring of qml.SPSAOptimizer should:

  • Introduce the concept of SPSA and the task it solves;
  • Include and explain equations (5) and (6) from “An Overview of the Simultaneous Perturbation Method for Efficient Optimization” (page 9 of the PDF that is page 490 of the volume);
  • Mention the number of quantum device executions required with SPSA;
  • Mention the use cases for SPSA as the optimization method of a hybrid quantum-classical workflow;
  • Add examples showcasing the use of the optimizer, as is done for QNGOptimizer.
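For illustration, the standard SPSA recursion θ_{k+1} = θ_k − a_k ĝ_k(θ_k), with decaying gain sequences a_k and c_k, could be wrapped in a PennyLane-style step_and_cost interface roughly as follows. This is a plain-Python sketch under stated assumptions: the class name, default gains, and interface details are illustrative, not PennyLane's actual API.

```python
import random

class SimpleSPSAOptimizer:
    """Sketch of an SPSA optimiser with a PennyLane-style interface.

    Gain sequences follow the standard form
        a_k = a / (k + 1 + A) ** alpha,   c_k = c / (k + 1) ** gamma,
    using the commonly recommended exponents alpha=0.602 and gamma=0.101.
    """

    def __init__(self, a=0.2, c=0.2, A=10.0, alpha=0.602, gamma=0.101, seed=0):
        self.a, self.c, self.A = a, c, A
        self.alpha, self.gamma = alpha, gamma
        self.k = 0
        self.rng = random.Random(seed)

    def step_and_cost(self, cost, params):
        """One SPSA iteration: two cost evaluations for the gradient
        estimate, regardless of len(params)."""
        a_k = self.a / (self.k + 1 + self.A) ** self.alpha
        c_k = self.c / (self.k + 1) ** self.gamma
        # Rademacher (+/-1) simultaneous perturbation
        delta = [self.rng.choice((-1.0, 1.0)) for _ in params]
        plus = [p + c_k * d for p, d in zip(params, delta)]
        minus = [p - c_k * d for p, d in zip(params, delta)]
        y_plus, y_minus = cost(plus), cost(minus)
        # 1/delta_i == delta_i since delta_i is +/-1
        ghat = [(y_plus - y_minus) / (2 * c_k) * d for d in delta]
        new_params = [p - a_k * g for p, g in zip(params, ghat)]
        self.k += 1
        return new_params, cost(params)

if __name__ == "__main__":
    # Stand-in for a circuit expectation value
    cost = lambda x: sum(v * v for v in x)
    params = [1.0, -0.8, 0.5]
    opt = SimpleSPSAOptimizer()
    for _ in range(200):
        params, energy = opt.step_and_cost(cost, params)
    print("final cost:", cost(params))
```

An interface of this shape would preserve the per-iteration control that the comments below highlight: the user drives the loop and can inspect or record the parameters and cost at every step, rather than handing the whole optimisation to a black-box routine.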

How important would you say this feature is?

2: Somewhat important. Needed this quarter.

Additional information

https://www.jhuapl.edu/SPSA/

Issue Analytics

  • State: closed
  • Created: a year ago
  • Reactions: 2
  • Comments: 8 (7 by maintainers)

Top GitHub Comments

1 reaction
antalszava commented, Apr 18, 2022

Hi @ankit27kh, thank you so much for this feedback! In fact, we’ve been considering adding a dedicated SPSA optimizer to PennyLane. Good to hear that there would be value in such an addition. 👍

1 reaction
ankit27kh commented, Apr 18, 2022

What I like about PennyLane optimisers is the ability to have granular control over the iterations. This is usually missing from other libraries like qiskit and scipy. Sometimes you can use callback functions, but it’s not the same. I hope we can have SPSA in PennyLane so that we still enjoy a high degree of control over the optimisation process.


Top Results From Across the Web

Optimization using SPSA — PennyLane documentation
SPSA is an optimization method that involves approximating the gradient of the cost function at each iteration step. This technique requires ...

SPSA - qiskit.algorithms.optimizers
As an optimization method, it is appropriately suited to large-scale population models, adaptive modeling, and simulation optimization.

AN IMPROVED SPSA ALGORITHM FOR STOCHASTIC ...
Abstract. We show that the Simultaneous Perturbation Stochastic Approximation (SPSA) algorithm with projection may exhibit slow convergence in constrained ...

SPSA Algorithm
In summary, SPSA is a powerful method for optimization in challenging nonlinear problems. It has a strong theoretical foundation and is often more...

Improved SPSA optimization algorithm requiring a single ...
Abstract: The simultaneous perturbation stochastic approximation (SPSA) is a simple and effective optimization algorithm. It requires only two measurements ...
