
'baqup' consistently fails due to 503: Over Rate Limit

See original GitHub issue
Traceback (most recent call last):
  File "main.py", line 272, in <module>
    main()
  File "main.py", line 73, in main
    _run_backup(client, output_directory, args.root_folder_id)
  File "main.py", line 83, in _run_backup
    client, output_directory, 0)
  File "main.py", line 120, in _descend_into_folder
    client, folder_output_path, depth + 1)
  File "main.py", line 120, in _descend_into_folder
    client, folder_output_path, depth + 1)
  File "main.py", line 122, in _descend_into_folder
    thread = client.get_thread(child["thread_id"])
  File "/Users/nrser/dev/gh/quip/quip-api/samples/baqup/quip.py", line 247, in get_thread
    return self._fetch_json("threads/" + id)
  File "/Users/nrser/dev/gh/quip/quip-api/samples/baqup/quip.py", line 788, in _fetch_json
    raise QuipError(error.code, message, error)
quip.QuipError: 503: Over Rate Limit

I went in and added retry with multiplicative back-off, up to about five minutes total, and it still can’t make it through. Sometimes it gets into a state where it can’t even start up: just 503: Over Rate Limit again and again on every request.
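For context, the multiplicative back-off described above might look something like this sketch. The `fetch` and `is_rate_limited` hooks are hypothetical stand-ins for quip.py’s `_fetch_json` and its 503 `QuipError` check, not the actual code:

```python
import time

def backoff_delays(base=2.0, factor=2.0, max_total=300.0):
    """Yield multiplicative back-off delays until their total would
    exceed max_total seconds (about five minutes, as described above)."""
    total, delay = 0.0, base
    while total + delay <= max_total:
        yield delay
        total += delay
        delay *= factor

def fetch_with_retry(fetch, is_rate_limited):
    """Call fetch(), sleeping and retrying on rate-limit errors.

    `fetch` and `is_rate_limited` are hypothetical hooks; in baqup
    they would wrap the Quip client's request and 503-error check."""
    for delay in backoff_delays():
        try:
            return fetch()
        except Exception as error:
            if not is_rate_limited(error):
                raise
            time.sleep(delay)
    return fetch()  # final attempt; let any error propagate
```

As the issue describes, a fixed five-minute budget like this is not enough once the hourly limit is exhausted, which motivates the header-based approach in the comments below.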

We realized that we have a ton of very important notes in Quip and really need to back them up. We probably have somewhere between hundreds and low thousands of docs.

What’s the deal?

I’m on Python 2.7.14 and macOS 10.12.6, if it makes any difference.

Issue Analytics

  • State: open
  • Created: 5 years ago
  • Comments: 5

Top GitHub Comments

2 reactions
nrser commented, Dec 7, 2018

Hey @ogyet-stockpile, @squigsly and I hacked up a fork that works for us. The commits could use cleanup and the work should really be on a branch, but if you need something now, it’s there.

Changes:

1. Monitors rate limits and sleeps when they run out.

Rate-limit info is returned in the headers of each HTTP response from the Quip API. I think the limits are 50 requests per minute and 900 per hour, and the API includes a timestamp for when they reset, which we use to calculate how long to snooze (heads up: it can be around 40 minutes sometimes; we just let the thing run and went out for the evening).

You can check the code here:

https://github.com/nrser/quip-api/blob/master/python/quip.py#L788

To enable the feature, construct quip.QuipClient with our new use_rate_limiting=True flag.
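The header-based snooze can be sketched like this. The header names below are assumptions based on the fork; check the responses your API version actually returns:

```python
import time

# Header names are an assumption based on the fork linked above;
# verify them against your API's actual responses.
RESET_HEADER = "X-Ratelimit-Reset"          # epoch seconds when the window resets
REMAINING_HEADER = "X-Ratelimit-Remaining"  # requests left in the window

def seconds_until_reset(headers, now=None):
    """How long to snooze: the time from `now` until the reset
    timestamp the API reported, or 0 if requests remain."""
    now = time.time() if now is None else now
    remaining = int(headers.get(REMAINING_HEADER, 1))
    if remaining > 0:
        return 0.0
    reset_at = float(headers.get(RESET_HEADER, now))
    return max(0.0, reset_at - now)
```

Sleeping until the reported reset, rather than guessing with back-off, is what lets the script survive the hourly limit (and is why the snooze can stretch to 40 minutes).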

2. Adds a half-assed resume to baqup via a read-request file cache

Our version of the baqup script adds a new --cache_directory=PATH argument. When it’s present, every new read from the API is written to a file in the PATH directory, and subsequent reads check there first, using the cached data if present instead of hitting the API.

This lets baqup skip the API hits and tear through entities it has already backed up when re-run. Together with (1), which the baqup script enables via a new --use_rate_limiting flag, we finally got through a full back-up: about 2.5K requests for ~100 MB of data. You also get all the API responses, blob contents, and header metadata in the cache directory, in case you want to back that up as well or manipulate it locally.

To bust the cache, manually delete the cache directory or change the PATH value. You can also go through it by hand, or with a script, if you want to selectively remove items.
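A minimal read-through cache in the spirit of --cache_directory might look like this sketch. The `cached_fetch` helper and its `key` convention are illustrative, not the fork’s actual code:

```python
import json
import os

def cached_fetch(cache_directory, key, fetch):
    """Read-through cache: return the JSON stored under `key` if
    present, otherwise call fetch() and persist its result before
    returning it. `key` must be filesystem-safe (e.g. a thread id)."""
    path = os.path.join(cache_directory, key + ".json")
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    data = fetch()
    os.makedirs(cache_directory, exist_ok=True)
    with open(path, "w") as f:
        json.dump(data, f)
    return data
```

Because each entity lands in its own file, deleting the directory (or a single file) is all it takes to force a fresh fetch on the next run.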

0 reactions
nrser commented, Sep 3, 2019

@jrgoodner Yeah, there’s a link to one of the files in my comment above; the repo is at:

https://github.com/nrser/quip-api

No promises about code quality; it was a bit of a dirty hack to get it working.

However, it turns out that Quip has batch APIs that let you breeze through the whole thing! I don’t know if they weren’t there back in December or what. Also, I’ve only used them for the JSON data, and I’m not sure if they exist for blobs, but they make a massive difference over the one-request-per-entity approach that the Quip Python lib (at least) used to use.

If you need to back up a decent amount of data, or do it often, I’d look into hacking those in. Just FYI.
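If you do hack the batch endpoints in, the main change is requesting ids in chunks instead of one at a time. A tiny sketch (the batch size here is an assumption; check the current Quip API docs for the real limit):

```python
def chunked(ids, size=10):
    """Split a list of entity ids into batches suitable for a batch
    endpoint (e.g. fetching several threads in one request). The
    batch size is an assumption, not a documented limit."""
    return [ids[i:i + size] for i in range(0, len(ids), size)]
```

Each chunk then costs a single API request instead of `size` requests, which is where the "massive difference" against the rate limit comes from.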
