"403 Exceeded rate limits: too many table update operations for this table" when uploading just 2 hours every hour.
See original GitHub issue
- Programming language: Python
- OS: Linux / Google Cloud Function
- Language runtime version: 3.7
- Package version: 0.14.1
Error message:
pandas_gbq.gbq.GenericGBQException: Reason: 403 Exceeded rate limits: too many table update operations for this table. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas
I am uploading a very small number of rows (fewer than 20) to a BigQuery table every hour through a Cloud Function running on a schedule, and I receive this error quite frequently. No other scripts write to that table.
I get the error right after logs that say:
2 out of 2 rows loaded.
1 out of 1 rows loaded.
Both those logs are for a single upload command.
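For context on what hits the quota: each `to_gbq` append runs as a separate load job, and load jobs that append to a table count against the per-table update quota, so two loads per run burns through it twice as fast. A minimal sketch (not the original pipeline code) of one common mitigation, batching the frames into a single load and backing off on the 403; the table name is a placeholder:

```python
# Batch several small DataFrames into one load job and back off when the
# per-table quota is hit. "my_dataset.my_table" is a hypothetical destination.
import time

import pandas as pd
import pandas_gbq
from pandas_gbq.gbq import GenericGBQException

TABLE = "my_dataset.my_table"  # placeholder

def upload(frames, retries=5):
    # One concatenated DataFrame -> one load job, instead of one job per
    # frame, which is what multiplies table update operations.
    batch = pd.concat(frames, ignore_index=True)
    for attempt in range(retries):
        try:
            pandas_gbq.to_gbq(batch, TABLE, if_exists="append")
            return
        except GenericGBQException as exc:
            # Retry only on the rate-limit error; re-raise anything else.
            if "Exceeded rate limits" not in str(exc) or attempt == retries - 1:
                raise
            time.sleep(30 * 2 ** attempt)
```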
Issue Analytics
- State: closed
- Created 2 years ago
- Comments: 6 (1 by maintainers)

Thanks for getting back to us.
There is a feature request open at https://github.com/googleapis/python-bigquery-pandas/issues/300 to use the streaming API for writes, which avoids some of the rate limits that apply to load jobs (at the expense of some complexity around writing to new or recreated tables). This hasn't been implemented in pandas-gbq yet, but it is available in the google-cloud-bigquery library via the insert_rows_from_dataframe method.
I'll close this issue as it isn't reproducible, but feel free to follow #300 or open a new issue if you have a reproducible example.
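For reference, a minimal sketch of that streaming path with google-cloud-bigquery (dataset and table names are placeholders, and the destination table must already exist with a matching schema):

```python
# insert_rows_from_dataframe streams rows via the tabledata.insertAll API
# rather than load jobs. "my_dataset.my_table" is a placeholder.
import pandas as pd
from google.cloud import bigquery

client = bigquery.Client()  # default project and credentials
table = client.get_table("my_dataset.my_table")  # table must already exist

df = pd.DataFrame({"ts": [pd.Timestamp.utcnow()], "value": [1]})

# Returns one list of error mappings per chunk; empty lists mean success.
errors = client.insert_rows_from_dataframe(table, df)
if any(errors):
    raise RuntimeError(f"streaming insert failed: {errors}")
```

Streamed rows sidestep the per-table load-job quota, at the cost of the streaming-buffer caveats around recently created or recreated tables that the comment above mentions.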
I don't recall which of my pipelines this was. I believe I moved on to Google's Datastore because of the rate-limit errors. What I can tell you is that the schema was not changing: I was appending about 20 rows, fetched from an API, to a BigQuery table every hour or so. Sadly, there is no way for me to reproduce this at the moment.