Java heap space error when writing > 2 GB file content as Blob into Oracle Table
See original GitHub issue

I have a scenario where I need to read a remote file of more than 2 GB into an Oracle Blob column. Whenever the remote file is larger than about 1.2 GB, I get a Java heap space error.
Below is my implementation. Can you please help?
object FileDownload : Table("FILE_DOWNLOAD") {
    private val fileID = integer("FILE_ID")
    val fileContent = blob("FILE_CONTENT")
}

val inStream = sftp.get(filePath)
transaction(db) {
    FileDownload.insert {
        it[fileContent] = ExposedBlob(inStream.readBytes())
    }
    inStream.close()
}
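The root cause is the `readBytes()` call: it buffers the entire remote stream into a single byte array on the heap before the insert even starts. A minimal sketch of the arithmetic (plain Kotlin, no library assumptions) shows why a file this size cannot fit:

```kotlin
// A JVM array is Int-indexed, so one byte[] tops out just under 2 GiB
// (Int.MAX_VALUE bytes) no matter how large the heap is. On top of that,
// readBytes() grows its buffer by copying, which transiently needs roughly
// twice the file size in heap -- which is why ~1.2 GB already fails here.
fun main() {
    val maxArrayBytes = Int.MAX_VALUE.toLong()   // hard upper bound for a byte[]
    val fileBytes = 2L * 1024 * 1024 * 1024      // the >2 GiB remote file
    println(fileBytes > maxArrayBytes)           // true: cannot fit in one array
}
```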
Issue Analytics
- State:
- Created a year ago
- Comments: 11 (5 by maintainers)
Top Results From Across the Web

Java Heap Space Error while inserting large file — oracle-tech
My code breaks if I try to insert a file bigger than a couple of MBs (I would say more than 10 MBs)...

Java Heap Space Exception, with big amount of data, any ...
The application is throwing a Java heap exception; memory consumption rises over 600m and CPU usage over 50% until the exception...

"Java heap space" error in AWS Glue
The "java.lang.OutOfMemoryError: Java heap space" error indicates that a driver or executor process is running out of memory...

Downloading/Streaming Content with ORDS (File Downloads!)
One, that the content type for the data is variable – in this case I'm storing it as a column in the table...

SQL Developer: How to prevent the "Java Heap Space" error
When you are working with a lot of worksheets and reports or try to open large SQL files, you may run into "Java Heap Space"...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
It might become better with the next release: ExposedBlob now stores an InputStream by default, so please wait for the release and check whether it helps. Don't forget to replace

ExposedBlob(inStream.readBytes())

with

ExposedBlob(inStream)
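Putting the maintainer's advice together, the insert could be sketched like this. This is a hedged sketch, not a verified implementation: it assumes an Exposed release whose `ExposedBlob` constructor accepts an `InputStream` (as the comment above describes), and it reuses the `sftp`, `filePath`, `db`, and `FileDownload` names from the question:

```kotlin
// Sketch: hand the SFTP stream straight to ExposedBlob so the JDBC driver can
// stream it into the Oracle BLOB column without materializing the file on the
// heap. `use` guarantees the stream is closed even if the insert fails.
sftp.get(filePath).use { inStream ->
    transaction(db) {
        FileDownload.insert {
            it[fileContent] = ExposedBlob(inStream)  // no readBytes(): nothing buffered
        }
    }
}
```

Closing the stream via `use` (rather than inside the transaction block) also avoids leaking it if the transaction throws before the explicit `close()` call in the original code is reached.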
But I was able to write the entire 5 GB of data back into another local file from inStream.
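That observation fits the diagnosis: copying a stream to a file works at any size because it never holds the whole payload in memory. A self-contained sketch of the same constant-memory copy (the `streamToFile` helper is a hypothetical name, not from the thread):

```kotlin
import java.io.File
import java.io.InputStream

// copyTo moves data through a small fixed buffer (8 KiB by default), so heap
// use stays constant regardless of file size -- which is why writing the same
// 5 GB stream to a local file succeeds while readBytes() does not.
fun streamToFile(input: InputStream, target: File) {
    input.use { src ->
        target.outputStream().use { dst ->
            src.copyTo(dst)  // constant-memory loop, works for multi-GB streams
        }
    }
}

fun main() {
    val tmp = File.createTempFile("blobtest", ".bin")
    streamToFile("hello".byteInputStream(), tmp)
    println(tmp.readText())  // hello
    tmp.delete()
}
```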