
File Datalake Client get_path() iterator returning no more than 5000 blob paths

See original GitHub issue

Bug Description: get_paths() returns at most 5000 blob paths, even with max_results set to a value greater than 5000.

Reproduction:

cred = ClientSecretCredential(tenant_id, client_id, client_secret)
storage_account_name = 'ADLSAccount'
path = 'dir'
service_client = DataLakeServiceClient(
    account_url='{}://{}.dfs.core.windows.net'.format('https', storage_account_name),
    credential=cred)
file_system_client = service_client.get_file_system_client(file_system='samplecontainer')
result = file_system_client.get_paths(path, recursive=True, max_results=10000)

Only 5000 paths are yielded when iterating 'result'.

Expected Behavior: The Data Lake service's pagination logic should support client-side configuration, at least in the Python client, so that more than 5000 paths can be retrieved in one listing.
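For context, Azure Storage list operations return at most 5000 items per response; retrieving more requires following continuation tokens across multiple requests, which the SDK's paged iterator is meant to do automatically. The report suggests the client stopped after the first page instead of following the token. A minimal, self-contained sketch of the pattern, using a hypothetical mock list operation rather than the real service (all names here are illustrative, not SDK API):

```python
# Sketch of the continuation-token pagination pattern used by the
# Data Lake listing API. The service caps each response at 5000
# items; the client must keep requesting pages until no
# continuation token is returned.

PAGE_CAP = 5000  # server-side maximum items per response

def mock_list_paths(all_paths, continuation=0, max_results=None):
    """Hypothetical stand-in for a single list-paths request."""
    # The cap is enforced server-side regardless of max_results.
    page_size = min(max_results or PAGE_CAP, PAGE_CAP)
    page = all_paths[continuation:continuation + page_size]
    next_token = continuation + len(page)
    if next_token >= len(all_paths):
        next_token = None  # no more pages
    return page, next_token

def list_all_paths(all_paths, max_results=None):
    """Collect every path by following continuation tokens."""
    results, token = [], 0
    while token is not None:
        page, token = mock_list_paths(all_paths, token, max_results)
        results.extend(page)
    return results

paths = [f"dir/file{i}" for i in range(12000)]

# A single request returns at most 5000 items, even with max_results=10000...
first_page, _ = mock_list_paths(paths, max_results=10000)
print(len(first_page))             # 5000

# ...but looping over pages retrieves everything.
print(len(list_all_paths(paths)))  # 12000
```

In the real SDK, get_paths() returns an ItemPaged iterator that is supposed to follow continuation tokens transparently as you iterate it, so the loop above is normally hidden inside the client library.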

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 9 (4 by maintainers)

Top GitHub Comments

1 reaction
xiangyan99 commented on Feb 8, 2021

Thanks for the feedback, we’ll investigate asap.

0 reactions
msftbot[bot] commented on Dec 11, 2021

Hi, we’re sending this friendly reminder because we haven’t heard back from you in a while. We need more information about this issue to help address it. Please be sure to give us your input within the next 7 days. If we don’t hear back from you within 14 days of this comment the issue will be automatically closed. Thank you!

