Extreme memory usage in GPUInterop
Describe the bug
I’ve been using the GPUInterop sample to build my own interop layer between my D3D11 graphics library and Avalonia. And I actually got it working! So… yay!
However, after running for a few seconds, memory usage ballooned, consuming a huge amount of RAM. To make sure it wasn’t something I did wrong, I ran the stock GPUInterop sample (with DISCO turned on) and waited 5-10 seconds, and saw the same runaway memory growth (screenshot in the original issue).
In my own code, I was able to stop the leak by commenting out this line:
LastPresentation = _surface.UpdateWithKeyedMutexAsync(_imported, 1, 0);
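For context, the following is a minimal sketch of one way a caller can keep presentation tasks from piling up: await the previous update before queuing the next. This is not the sample’s actual code and not the eventual fix (that landed inside Avalonia itself; see the PR linked in the comments below). The `SwapchainImageSketch` class and its wiring are hypothetical; `_surface`, `_imported`, and `LastPresentation` mirror the line above.

```csharp
using System.Threading.Tasks;
using Avalonia.Rendering.Composition;

// Hypothetical wrapper around the fields used in the line above.
class SwapchainImageSketch
{
    private readonly CompositionDrawingSurface _surface;
    private readonly ICompositionImportedGpuImage _imported;

    public Task? LastPresentation { get; private set; }

    public SwapchainImageSketch(CompositionDrawingSurface surface,
                                ICompositionImportedGpuImage imported)
    {
        _surface = surface;
        _imported = imported;
    }

    public async Task PresentAsync()
    {
        // Don't queue a new composition update while the previous one is
        // still pending; otherwise unconsumed presentations (and whatever
        // GPU resources they reference) can accumulate frame after frame.
        if (LastPresentation is { IsCompleted: false })
            await LastPresentation;

        // Keyed-mutex acquire/release indices match the snippet above.
        LastPresentation = _surface.UpdateWithKeyedMutexAsync(_imported, 1, 0);
    }
}
```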
To Reproduce
1. Run the GPUInterop sample.
2. Turn the DISCO slider up.
3. Wait several seconds and watch the process’s memory usage.
Expected behavior
Stable memory usage.
Desktop:
- OS: Windows 11
- GPU: NVIDIA GeForce RTX 3090 (driver 527.56)
- Avalonia Version: Latest from master (pulled prior to running the GPUInterop sample).
Top GitHub Comments
@Tape-Worm a possible fix should be in master now: https://github.com/AvaloniaUI/Avalonia/pull/11643
So, I pulled the avalonia-all NuGet packages (the nightly master builds haven’t been updated since June 5th), and everything looks great. I’m able to get my stuff rendering with no memory issues.
Have a look: [screenshot of stable memory usage in the original issue]
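For anyone else who wants to test against the nightly packages, a minimal nuget.config along these lines should work. The feed URL below is the one from Avalonia’s nightly-build documentation and may change, so double-check the current docs.

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
    <!-- Avalonia nightly feed; verify the current URL in the Avalonia docs. -->
    <add key="avalonia-all" value="https://nuget-feed-all.avaloniaui.net/v3/index.json" />
  </packageSources>
</configuration>
```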