
Gaussian blur CPU performance

See original GitHub issue

I have been experimenting with PIX because it can compute image augmentations on the GPU, in contrast to torchvision, which computes them on the CPU and needs multiple workers to avoid bottlenecks. While running some very simple timeit examples, I observed surprisingly long times when performing a Gaussian blur on the CPU. I created a simple Colab notebook to demonstrate these experiments. I also tried transferring the image to the CPU before performing the blur, but it doesn’t seem to make any difference. Is this expected, meaning I should not rely on CPU computation at all, or is there something still to be optimized for the CPU backend?
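
For reference, a minimal sketch of the kind of timeit comparison described above, assuming PIX's dm_pix.gaussian_blur and standard JAX device placement; the image shape, sigma, and kernel_size below are illustrative and not taken from the notebook:

```python
import timeit

import jax
import jax.numpy as jnp
import dm_pix as pix

# Illustrative settings; the issue's notebook may use different shapes/parameters.
image = jnp.ones((512, 512, 3), dtype=jnp.float32)
blur = jax.jit(lambda img: pix.gaussian_blur(img, sigma=3.0, kernel_size=11))

def bench(device) -> float:
    """Time the jitted blur on a specific device, excluding compilation."""
    img = jax.device_put(image, device)
    blur(img).block_until_ready()  # warm-up: trigger compilation for this device
    return timeit.timeit(lambda: blur(img).block_until_ready(), number=10)

print("CPU :", bench(jax.devices("cpu")[0]))
if jax.default_backend() == "gpu":
    print("GPU :", bench(jax.devices("gpu")[0]))
```

Comparing the two timings (with block_until_ready to force JAX's asynchronous dispatch to finish) is what surfaces the CPU slowdown the issue describes.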

Issue Analytics

  • State: open
  • Created: a year ago
  • Comments: 12

Top GitHub Comments

1 reaction
claudiofantacci commented on Dec 20, 2022

Hey @ASEM000, I have not forgotten about this 😄 I’ve been quite busy and should finally be back to a normal work regime; I’ll try to look at all this asap 🚀

1 reaction
claudiofantacci commented on Nov 1, 2022

I’m finally back. I’ll try to look into this asap!

Read more comments on GitHub >

Top Results From Across the Web

  • Fastest Gaussian Blur (in linear time) - Algorithms and Stuff: This 1-D blur has complexity O(n), independent of r, but it is performed twice to get a box blur, which is performed 3… (see the sketch after this list)
  • tcoppex/cpu-gbfilter - Optimized Gaussian blur filter on CPU: Features include cache-efficient data access, multithreading using OpenMP, and vectorization using SSE 4.1 intrinsics…
  • Fastest Gaussian blur implementation (Java) - Stack Overflow: All of the libraries above will help implement a Gaussian blur faster than any plain-Java implementation on the CPU.
  • An investigation of fast real-time GPU-based image blur algorithms - Intel: However, it turns out that there is a generic blur algorithm that can have even better performance than our optimized Gaussian solution…
  • Improve Real-Time GPU-Based Image Blur Algorithms - Intel: Blurring an image is fairly trivial to do: just collect neighboring pixels, average them, and you get your new value, right?…
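
The first result above describes the classic trick of approximating a Gaussian blur with repeated box blurs, where each 1-D pass runs in O(n) regardless of the radius by using a running/prefix sum. A rough NumPy illustration of that idea (not the linked implementation) might look like this:

```python
import numpy as np

def box_blur_1d(row: np.ndarray, r: int) -> np.ndarray:
    """1-D box blur of radius r using a prefix sum: O(n), independent of r."""
    n = len(row)
    csum = np.concatenate(([0.0], np.cumsum(row, dtype=np.float64)))
    out = np.empty(n, dtype=np.float64)
    for i in range(n):
        lo, hi = max(0, i - r), min(n - 1, i + r)  # clamp the window at the edges
        out[i] = (csum[hi + 1] - csum[lo]) / (hi - lo + 1)
    return out

def approx_gaussian_1d(row: np.ndarray, r: int, passes: int = 3) -> np.ndarray:
    """Repeated box blurs converge toward a Gaussian blur (central limit theorem)."""
    for _ in range(passes):
        row = box_blur_1d(row, r)
    return row
```

Running the 1-D pass along rows and then columns gives the 2-D box blur, and the three iterations mentioned in the article are enough for a visually close Gaussian approximation.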
