file stays in memory / doesn't close?
See original GitHub issue

I'm not sure what is going on exactly, but I suspect that files are being kept in memory. When reading a large number of files (~1000 files) in a loop, for example:
import numpy as np
import lasio

i = 0
for file in file_list:
    file2read = mypath + '/' + file
    las = lasio.read(file2read)        # read one LAS file
    lasDF = las.df()                   # convert to a pandas DataFrame
    lasDF['DEPTH'] = lasDF.index
    lasDF['LIN_RES'] = np.log(lasDF['RESD_FINAL'])
    output = np.concatenate((output, lasDF), axis=0)  # output is initialized before the loop
    i += 1
    print(i / len(file_list) * 100, "% complete. Read: ", file)
The memory usage seems to continuously climb. Is there a file.close() that needs to be performed by the user?
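Two things in the loop above can be tightened independently of lasio. First, you can open the file yourself in a with-block so the handle is guaranteed to close (lasio.read() accepts file-like objects as well as paths). Second, calling np.concatenate inside the loop re-copies the whole accumulated array on every iteration, so peak memory climbs even if every file is closed correctly. A minimal sketch of both changes, reusing the names (mypath, file_list, RESD_FINAL) from the snippet above:

import numpy as np
import pandas as pd
import lasio

frames = []
for i, file in enumerate(file_list, start=1):
    # Opening the file ourselves guarantees the handle closes when
    # the with-block exits; lasio.read() also takes file objects.
    with open(mypath + '/' + file) as f:
        las = lasio.read(f)
    lasDF = las.df()
    lasDF['DEPTH'] = lasDF.index
    lasDF['LIN_RES'] = np.log(lasDF['RESD_FINAL'])
    frames.append(lasDF)  # cheap list append; no per-iteration copying
    print(i / len(file_list) * 100, "% complete. Read: ", file)

# One concatenation at the end instead of np.concatenate inside the
# loop, which re-copies the whole accumulated array every iteration.
output = pd.concat(frames)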
Issue Analytics
- State:
- Created 3 years ago
- Comments: 7 (4 by maintainers)
Top Results From Across the Web
File stays in memory after being closed - python - Stack Overflow
The issue seems to be in the read() function as a pass does not cause any memory usage growth. While looping through the...
Read more >

Windows 10 not releasing standby memory when required.
My system has 16GB of RAM and a max pagefile of 1000MB. The standby memory cache stays around 7GB even when my commit...
Read more >

Memory held by the resources is not released even after ...
Hi, I created a WPF application with byte array, memory stream and BitmapImage when a main page is loaded as in the below...
Read more >

Windows 11 has a memory leak bug and here's how to fix it
If not, go ahead and close all the File Explorer instances you've opened (just right click the folder icon in the taskbar and...
Read more >

Matlab doesn't release memory when variables are cleared
you can write a script to close matlab and continue with the next function in a new file, after saving variables to disk...
Read more >
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Version is: '0.24.1'. Some additional info if it makes any difference:
- OS: Windows 10; 16 GB RAM
- Total LAS file count: 1464
- Total LAS input file size: 1.21 GB

I'm wondering whether it's getting stuck in the Python garbage collector. For example, I just ran about 455 files through and checked gc.get_count(), and it's at (557, 6, 1), so it might not be a lasio issue, but a setting on my side.
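One way to test the garbage-collector theory in the comment above is to force a full collection every so often and watch whether process memory actually drops. A minimal sketch using the standard gc module (the every-100-files interval and the del statements are illustrative choices, not requirements):

import gc
import lasio

for i, file in enumerate(file_list, start=1):
    las = lasio.read(mypath + '/' + file)
    lasDF = las.df()
    # ... process lasDF ...
    del las, lasDF  # drop the references so the objects become collectable
    if i % 100 == 0:
        unreachable = gc.collect()  # force a full collection pass
        print("after", i, "files:", unreachable, "unreachable objects;",
              "gc counts now", gc.get_count())

If memory falls after each gc.collect(), the growth is collector lag rather than unclosed file handles; if it keeps climbing, something is still holding references to the data.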
No worries, glad you have found the issue! 🚀