
Export not working for large backups

See original GitHub issue

I have configured /Export as the export folder for the alternate backup location. This works fine for small full backups (~1 GB). Now my backup is about 2 GB and the export file is 0 KB. The error in the server log:

Traceback (most recent call last):
  File "/config/custom_components/auto_backup/__init__.py", line 339, in new_snapshot
    await self.copy_snapshot(data[ATTR_NAME], slug, backup_path)
  File "/config/custom_components/auto_backup/__init__.py", line 432, in copy_snapshot
    await self.download_snapshot(slug, destination)
  File "/config/custom_components/auto_backup/__init__.py", line 452, in download_snapshot
    file.write(await request.read())
  File "/usr/local/lib/python3.7/site-packages/aiohttp/client_reqrep.py", line 973, in read
    self._body = await self.content.read()
  File "/usr/local/lib/python3.7/site-packages/aiohttp/streams.py", line 362, in read
    return b''.join(blocks)
MemoryError

Is there a way this can be fixed? I am using an RPi 4 with 4 GB of memory, and there is plenty of disk space available.
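
For context on why this fails: the traceback shows download_snapshot reading the entire response body into memory with await request.read() before writing it to disk, so a 2 GB snapshot has to fit in RAM alongside everything else. Below is a minimal sketch of a streamed download; it assumes a plain aiohttp ClientSession and a snapshot URL, and it is illustrative only, not the actual change made in auto_backup:

# Illustrative only: stream the snapshot to disk in fixed-size chunks so
# memory use stays bounded. The session, url and destination names are
# placeholders, not auto_backup's real variables.
import aiohttp

async def download_snapshot_streamed(
    session: aiohttp.ClientSession, url: str, destination: str
) -> None:
    async with session.get(url) as response:
        response.raise_for_status()
        with open(destination, "wb") as file:
            # iter_chunked() yields the body piece by piece instead of
            # buffering it all, so peak memory is roughly one chunk.
            async for chunk in response.content.iter_chunked(64 * 1024):
                file.write(chunk)

With an approach like this, the export size is limited by disk space rather than available RAM.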

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 10 (5 by maintainers)

Top GitHub Comments

1 reaction
arjen-w commented, May 13, 2020

The file is around 2 GB now and the export is working fine, so it does indeed seem to have been a file-size issue. FYI: I changed the InfluxDB retention period to 2 weeks, so that database will shrink further. I will also look at what ends up in the HA database itself, as that one is now about 1.5 GB.
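
For anyone wanting to trim InfluxDB the same way, here is a hypothetical example of shortening an InfluxDB 1.x retention policy to two weeks with the influxdb Python client; the host, database, and policy names are placeholders, not details taken from this issue:

# Hypothetical sketch; adjust host, database and policy names to your setup.
from influxdb import InfluxDBClient

client = InfluxDBClient(host="localhost", port=8086)

# Keep only two weeks of history in the (assumed) default retention policy.
client.alter_retention_policy(
    "autogen",
    database="home_assistant",
    duration="2w",
    replication=1,
    default=True,
)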

1 reaction
jcwillox commented, May 7, 2020

Thanks, I thought it would be something like that. Anyway, this should be fixed in the latest release (0.5.2); feel free to re-open if it’s not.


Top Results From Across the Web

Export Backup Taking too long (Delay in backup by ten hours)
I have a problem regarding the oracle nightly export backup. Earlier it used to take around 9 hours for 400GB database export backup...

'Get' fails for exporting large backup files [#2788995] | Drupal.org
The "Export" button correctly creates a copy of a site backup file in /data/disk/o1/backup-exports, and the "Get" button downloads that file ...

If you're unable to share an item in Final Cut Pro - Apple Support
If a backup or Final Cut Pro camera archive of the source media isn't available, you may need to record the clip again....

Export Backup Data from Salesforce
Large exports are broken up into multiple files. To download the zip file, follow the link in the email or click Data Export....

XML Backups and Space Export/Import Troubleshooting
If you are running into problems with XML backups - whether memory related ... Cannot Restore XML Backup due to Value too Large...
