Excessive memory consumption
The network currently runs into out-of-memory issues at a low number of layers. This seems to be a problem with TensorFlow's `atrous_conv2d` operation. If I set the dilation factor to 1, which means `atrous_conv2d` simply calls `conv2d`, I can easily run with tens of layers. It could just be the additional `batch_to_space` and `space_to_batch` operations, in which case I can write a single C++ op for `atrous_conv2d`.
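To see why those extra reshaping ops matter, here is a minimal NumPy sketch (not TensorFlow's actual implementation) of the idea behind the `space_to_batch`/`batch_to_space` decomposition in one dimension: a dilated convolution is equivalent to splitting the input into `rate` interleaved sub-sequences, running a plain convolution on each, and re-interleaving the results. The helper names are illustrative.

```python
import numpy as np

def conv1d(x, w):
    """Plain 'valid' 1-D cross-correlation."""
    K = len(w)
    return np.array([np.dot(x[t:t + K], w) for t in range(len(x) - K + 1)])

def dilated_conv1d(x, w, rate):
    """Direct dilated convolution: y[t] = sum_k w[k] * x[t + k*rate]."""
    K = len(w)
    span = (K - 1) * rate + 1
    return np.array([sum(w[k] * x[t + k * rate] for k in range(K))
                     for t in range(len(x) - span + 1)])

def dilated_via_batching(x, w, rate):
    """Same result via the batching trick: split x into `rate` interleaved
    sub-sequences (the 'space_to_batch' step), convolve each normally,
    then re-interleave ('batch_to_space')."""
    assert len(x) % rate == 0  # keep the sketch simple
    subs = [conv1d(x[j::rate], w) for j in range(rate)]
    T_out = len(x) - (len(w) - 1) * rate
    return np.array([subs[t % rate][t // rate] for t in range(T_out)])
```

The intermediate `subs` arrays are the analogue of the extra tensors `atrous_conv2d` materializes, which is where the additional memory goes.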
Issue Analytics
- Created: 7 years ago
- Comments: 20 (12 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I've fixed the problem in 8add545787140187bbce93f7e63b0bb70150a843. Like `atrous_conv2d`, I swap the width dimension out into the batch dimension and perform a regular convolution, but without padding in the height dimension. It's now possible to train large stacks of dilation layers without running out of memory.

Judging from occasional garbage collection log messages, I think the issue mentioned by @jyegerlehner is also valid. It would probably make sense to cut inputs to a fixed size.
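One way to cut inputs to a fixed size, as suggested above, is to crop a random window from each waveform (or zero-pad short clips) before batching, so every training step allocates identically shaped tensors. This helper is an illustrative sketch, not the repository's actual code:

```python
import numpy as np

def crop_or_pad(audio, target_len, rng):
    """Return exactly `target_len` samples: a random crop of long clips,
    or a zero-padded copy of short ones (hypothetical helper)."""
    if len(audio) >= target_len:
        start = rng.integers(0, len(audio) - target_len + 1)
        return audio[start:start + target_len]
    out = np.zeros(target_len, dtype=audio.dtype)
    out[:len(audio)] = audio
    return out
```

Fixed-size inputs also let the runtime reuse the same buffers across steps instead of reallocating for every clip length, which should reduce the garbage-collection pressure mentioned above.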
I’m still in the process of finding good hyperparameters, and finding the cause of the generation issue in #13. After that, I’ll generate audio samples and provide statistics on how long it takes to train the model.