
Performance Profile Log

In considering how best to tackle testing for #500, I think the best solution would be to add and maintain a performance profile log in the repo, similar to a coverage report.

Tests could be written with timeit, cProfile, memory_profiler, etc., and perhaps we could write something like a pytest fixture that would generate an instaviz report or similar. I’m not sure yet how to implement this (perhaps I would need to create a separate PyPI package or pytest plug-in for the project to use?). Please let me know your thoughts.
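
To make the pytest-fixture idea a little more concrete, here is a rough sketch using only the standard library (time and cProfile); the fixture name, the log file path, and the JSON layout are placeholders for illustration, not an existing plug-in:

```python
# conftest.py -- hypothetical sketch of a profiling fixture; the fixture name,
# log path, and JSON layout are placeholders, not an existing pytest plug-in.
import cProfile
import json
import pstats
import time
from pathlib import Path

import pytest

LOG_PATH = Path("performance_log.json")  # the proposed "performance profile log"


@pytest.fixture
def profile_log(request):
    """Time and profile the test body, then append the timing to the log."""
    profiler = cProfile.Profile()
    start = time.perf_counter()
    profiler.enable()
    yield  # the test body executes here
    profiler.disable()
    elapsed = time.perf_counter() - start

    # Show the slowest calls on stdout; this could instead feed an instaviz
    # report or be dumped with profiler.dump_stats() for later inspection.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)

    # Append a timing entry so the log accumulates a history per test.
    entry = {"test": request.node.nodeid, "seconds": round(elapsed, 4)}
    history = json.loads(LOG_PATH.read_text()) if LOG_PATH.exists() else []
    history.append(entry)
    LOG_PATH.write_text(json.dumps(history, indent=2))
```

A test would then opt in simply by requesting the fixture, e.g. `def test_big_mesh(profile_log): ...`, and each run would append a timing entry to the log.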

I think a log makes the most sense because:

  1. The alternative (in my mind) would be maintaining old copies of the code to test against in order to demonstrate that PRs introduce performance improvements, which seems silly and unmaintainable, and

  2. It may be desirable in the future to add a PR that reduces performance to some degree for a big benefit in readability or usability, so rather than force future PRs to meet or exceed a certain level of performance, we could reference the log and use it as a guide (similar to how coverage reports are used as a guide, but aren’t gospel).

What are your thoughts?

@akaszynski @MatthewFlamm @banesullivan I’m unaware of any Python packages or plug-ins that would implement this. If you have any suggestions, that would be greatly appreciated! I think this could be a great addition!


Goals:

(@pyvista/developers: please update/correct as needed)

  1. Verify that specific planned PRs are performance improvements


    • Create a workflow for marking a PR as a performance improvement and requiring testing against pyvista:main in pyvista/pyvista-benchmarks
    • Add documentation to CONTRIBUTING.md on how to run a performance-improvement test and reference it in the PR for reviewers
  2. Keep a rough idea of library performance and catch unexpected regressions

    • Create pyvista/pyvista-benchmarks repo
    • Add benchmark scripts (see the sketch after this list)
    • Identify version of VTK to test against
    • Identify hardware to use for test isolation and consistency
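
As a starting point for the “Add benchmark scripts” item, here is a sketch of what one standalone script in the (not yet existing) pyvista/pyvista-benchmarks repo could look like. It assumes only `pyvista.Sphere()` and `PolyData.decimate()`; the mesh resolution, repeat count, and output format are arbitrary choices for illustration:

```python
# bench_decimate.py -- sketch of one standalone benchmark script for a
# pyvista-benchmarks repo; resolutions, repeats, and output are illustrative.
import json
import timeit

import pyvista as pv


def bench_decimate_sphere() -> float:
    """Time decimating a reasonably dense sphere; return the best of 5 runs."""
    mesh = pv.Sphere(theta_resolution=400, phi_resolution=400)
    times = timeit.repeat(lambda: mesh.decimate(0.5), number=1, repeat=5)
    return min(times)


if __name__ == "__main__":
    # Emit JSON so CI (or the performance profile log) can collect results
    # from the same script run against both the PR tip and pyvista:main.
    print(json.dumps({"decimate_sphere": bench_decimate_sphere()}))
```

Running the same script on the PR branch and on main (for example in a small, manually triggered CI job) gives the two numbers to compare when a PR is labelled as a performance improvement.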

Issue Analytics

  • State: open
  • Created: a year ago
  • Comments: 17 (17 by maintainers)

Top GitHub Comments

prisae commented, Jul 14, 2022 (4 reactions)

Just as an idea: I personally like to use https://github.com/airspeed-velocity/asv for benchmarking. It is the benchmark suite used by, e.g., NumPy and SciPy (an example for SciPy from Pauli Virtanen can be seen at https://pv.github.io/scipy-bench/).
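
For reference, asv benchmarks are ordinary Python classes or functions whose names start with `time_` (or `mem_` for memory), collected from a `benchmarks/` directory next to an `asv.conf.json`. A sketch, reusing the same assumed pyvista calls as above plus `PolyData.smooth()`:

```python
# benchmarks/bench_filters.py -- sketch of an asv benchmark module; asv
# discovers methods named time_* and can run them across the commit history.
import pyvista as pv


class TimeFilters:
    def setup(self):
        # setup() runs before each timed method and is excluded from timing.
        self.mesh = pv.Sphere(theta_resolution=400, phi_resolution=400)

    def time_decimate(self):
        self.mesh.decimate(0.5)

    def time_smooth(self):
        self.mesh.smooth(n_iter=50)
```

`asv run` records results per commit, and `asv continuous <base> <tip>` compares two revisions directly, which maps well onto the goal of testing a PR against pyvista:main.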

adam-grant-hendry commented, Apr 16, 2022 (1 reaction)

> For this we could have a set of benchmark scripts in the repo, which can be run both in the PR’s tip and in the base (hopefully pyvista:main).

That is brilliant, @adeak; I wish I had thought of that. Doing this, plus the fact that

> We already have the complete history of the library in git 😃

would solve the first problem.

> The second goal is to keep a rough idea about the performance of the library, and to catch unexpected regressions. Most PRs are not performance-related; in fact, they aren’t expected to affect performance in any way. This is why I wouldn’t want to have benchmarking in CI… I think it should be enough if we catch severe regressions every once in a while. (A benchmark run every week?)

Completely agreed. We could get away with doing performance-regression testing on a limited but regular basis.
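
To make the “use the log as a guide, not a gate” idea from the original post concrete, here is a minimal sketch of a periodic check that compares fresh numbers against a stored baseline and merely warns above a generous threshold; the file name, its layout, the threshold, and the way current timings are obtained are all hypothetical:

```python
# check_regressions.py -- hypothetical periodic regression check: compare new
# timings to the stored performance log and warn (not fail) on big slowdowns.
import json
from pathlib import Path

BASELINE = Path("performance_log.json")  # hypothetical baseline: {name: seconds}
THRESHOLD = 1.5  # warn only if more than 50% slower than the recorded baseline


def find_regressions(current: dict) -> list:
    """Return human-readable warnings for benchmarks that slowed down a lot."""
    baseline = json.loads(BASELINE.read_text()) if BASELINE.exists() else {}
    warnings = []
    for name, seconds in current.items():
        old = baseline.get(name)
        if old is not None and seconds > old * THRESHOLD:
            warnings.append(f"{name}: {old:.3f}s -> {seconds:.3f}s")
    return warnings


if __name__ == "__main__":
    # 'current' would come from running the benchmark scripts; hard-coded here.
    current = {"decimate_sphere": 0.42}
    for message in find_regressions(current):
        print("possible regression:", message)
```

Run weekly (or on demand) on fixed hardware, something like this would catch severe regressions without putting benchmarking in the regular CI path, in line with the discussion above.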
