Query return limited to 87500 rows
Hey guys, having a problem with the following code:
from google.cloud import bigquery

# Authenticate with a service-account key file
client = bigquery.Client.from_service_account_json(google_secrets_file)
query = client.run_sync_query(query_sql_string)
query.allow_large_results = True   # don't truncate large result sets
query.use_query_cache = False
query.run()
response = query.fetch_data()
All large queries seem to return exactly 87,500 rows (even if I export to an interim table first and SELECT * from it). I can't see this number anywhere in the library or the documentation, so I have no idea where the limit comes from or how to proceed!
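For reference, the usual workaround at the time was to page through the results using the token that `fetch_data()` returns. A minimal sketch of that loop pattern, assuming `fetch_page(page_token)` returns a `(rows, next_token)` pair (a hypothetical wrapper around `query.fetch_data`, whose exact return shape varied across old library versions):

```python
def fetch_all_pages(fetch_page):
    """Accumulate rows by following page tokens until exhausted.

    fetch_page(page_token) is assumed to return (rows, next_token);
    with the old BigQuery client this would wrap query.fetch_data().
    """
    all_rows = []
    token = None
    while True:
        rows, token = fetch_page(token)
        all_rows.extend(rows)
        if token is None:
            return all_rows
```

As the later comment notes, this loop is slow for millions of rows because each page is a separate HTTP round trip.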
Issue Analytics
- State:
- Created 7 years ago
- Comments: 6 (5 by maintainers)

@fhoffa bump.
I was fully expecting to be able to download everything in one go; I assumed there would be a max_rows = None flag to set. I'm very surprised that the row limit is this small. Honestly, the ugliness doesn't concern me as much as the slowness: downloading 3M rows takes about 5 minutes using the loop method. I'm considering exporting to GCS and then downloading from there, but it feels so roundabout!
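The GCS route mentioned above can be sketched with an extract job: BigQuery shards a large table across multiple files when the destination URI contains a wildcard. A hedged sketch, where the bucket name, table ID, and key file path are all placeholders, not values from this issue:

```python
def sharded_uri(bucket, prefix):
    """Build a wildcard destination URI; BigQuery extract jobs
    split large tables across files matching the '*' pattern."""
    return "gs://{}/{}-*.csv".format(bucket, prefix)

# Hypothetical export with the modern client (google-cloud-bigquery >= 0.28):
# from google.cloud import bigquery
# client = bigquery.Client.from_service_account_json("key.json")
# client.extract_table("project.dataset.big_table",
#                      sharded_uri("my-bucket", "dump")).result()
```

After the extract completes, the shards can be downloaded from the bucket in parallel, which is typically much faster than paging row-by-row over the API.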