
"403 Exceeded rate limits: too many table update operations for this table" when uploading just 2 hours every hour.

See original GitHub issue
  • Programming language: Python
  • OS: Linux / Google Cloud Function
  • Language runtime version: 3.7
  • Package version: 0.14.1

Error message: pandas_gbq.gbq.GenericGBQException: Reason: 403 Exceeded rate limits: too many table update operations for this table. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas

I am uploading a very small number of rows (fewer than 20) to a BQ table every hour through a Cloud Function running on a schedule, and I’m receiving the error quite frequently. There are no other scripts writing to that table.
I get the error after a log that says:

2 out of 2 rows loaded.
1 out of 1 rows loaded.

Both those logs are for a single upload command.
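
For context, the setup described above presumably boils down to a small pandas-gbq append per run; a minimal sketch, assuming hypothetical table and project names (the issue doesn’t include the actual code):

import pandas as pd
import pandas_gbq

def hourly_upload(rows):
    """Scheduled Cloud Function body: append <20 rows to one table."""
    df = pd.DataFrame(rows)
    pandas_gbq.to_gbq(
        df,
        "my_dataset.my_table",    # hypothetical destination table
        project_id="my-project",  # hypothetical project
        if_exists="append",       # each hourly run appends to the same table
    )

Note that the two log lines quoted above suggest a single call produced two separate load jobs, each of which counts as a table update operation against the per-table quota.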

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 6 (1 by maintainers)

Top GitHub Comments

1 reaction
tswast commented, Jan 11, 2022

Thanks for getting back to us.

There is a feature request open at https://github.com/googleapis/python-bigquery-pandas/issues/300 to use the streaming API for writes, which avoids some rate limits with load jobs (but at the expense of some complexity around writing to new/recreated tables). This hasn’t been implemented in pandas-gbq yet, but is available in the google-cloud-bigquery library with the insert_rows_from_dataframe method.
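
To illustrate that workaround, here is a rough sketch using google-cloud-bigquery directly; the table and project names are hypothetical, and the table must already exist, since streaming inserts don’t create it (and newly created or recreated tables may not accept streamed rows right away, which is the complexity mentioned above):

import pandas as pd
from google.cloud import bigquery

df = pd.DataFrame({"id": [1, 2], "value": ["a", "b"]})  # example rows

client = bigquery.Client(project="my-project")   # hypothetical project
table = client.get_table("my_dataset.my_table")  # fetch schema; table must exist

# insert_rows_from_dataframe streams rows through the insertAll API
# instead of running a load job, so per-table update limits don't apply.
errors = client.insert_rows_from_dataframe(table, df)
if any(errors):  # returns one (possibly empty) error list per chunk
    raise RuntimeError("Streaming insert reported errors: %s" % errors)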

I’ll close this issue as it isn’t reproducible, but feel free to follow #300 or open a new issue if you have a reproducible example.

0 reactions
nissankarkifm commented, Jan 11, 2022

I don’t recall which of my pipelines this was. I believe I moved on to Google’s Datastore because of the rate-limit errors. What I can tell you is that the schema was not changing: I was appending about 20 rows, pulled from an API, to a BigQuery table every hour or so. Sadly, there is no way for me to reproduce this at the moment.
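
For anyone who hits the same wall, one possible mitigation sketch (not code from this thread): collect everything gathered in a run into a single DataFrame so each run issues exactly one load job, and retry with exponential backoff if the quota error still appears. The table name, project, and backoff policy are assumptions based on Google’s quota troubleshooting guidance.

import time
import pandas as pd
import pandas_gbq

def upload_once_per_run(frames, retries=4):
    # One concatenated load instead of several small ones keeps the
    # number of table update operations per run at exactly one.
    df = pd.concat(frames, ignore_index=True)
    for attempt in range(retries):
        try:
            pandas_gbq.to_gbq(
                df,
                "my_dataset.my_table",    # hypothetical table
                project_id="my-project",  # hypothetical project
                if_exists="append",
            )
            return
        except pandas_gbq.gbq.GenericGBQException as exc:
            if "Exceeded rate limits" not in str(exc) or attempt == retries - 1:
                raise
            time.sleep(2 ** attempt)  # back off: 1s, 2s, 4s between retries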

Read more comments on GitHub >

Top Results From Across the Web

  • google bigquery - GoogleBigQuery - Exceeded rate limits
    If you're inserting rows in a lot of operations instead of a few ... Exceeded rate limits: too many table update operations for ...
  • Troubleshoot quota and limit errors | BigQuery - Google Cloud
    For example, the message field might say Exceeded rate limits: too many table update operations for this table. In general, quota limits ...
  • "403 Exceeded rate limits: too many table update operations ..." when uploading just 2 rows every hour.
  • bigquery.query throws rateLimitExceeded exception even in ...
    "code": 403, "errors": [{"domain": "global", "message": "Exceeded rate limits: too many table update operations for this table ...
  • Rate Limits - Discord Developer Portal — Documentation
    Integrate your service with Discord — whether it's a bot or a game or whatever your wildest imagination can come up with.
