gcsfs==0.6.1|0.6.2 'walk()' method breaking dask dataframe
What happened:
While walking the root of a parquet folder initially created by PySpark, the fs.walk method returns an empty string '' in the files list:
('path/to/parquet/folder',
 ['Year=2019', 'Year=2020'],
 ['', '_SUCCESS'])
This behavior breaks dask.dataframe.read_parquet('gs://...') on multiple occasions (let me know if you want these errors); that is how I tracked the error down to fs.walk.
What you expected to happen:
The correct output should be
('path/to/parquet/folder',
 ['Year=2019', 'Year=2020'],
 ['_SUCCESS'])
Minimal Complete Verifiable Example:
import gcsfs
next(gcsfs.GCSFileSystem().walk('gs://path/to/parquet/folder/'))
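Until a fixed release is available, one possible workaround (a sketch, not part of gcsfs's API) is to drop the spurious empty entries from each files list before handing paths downstream. The buggy output is simulated here with the tuple from the report, so the snippet runs without GCS credentials:

```python
def clean_walk(walk_iter):
    """Yield walk tuples with empty-string entries removed from the files list."""
    for root, dirs, files in walk_iter:
        yield root, dirs, [f for f in files if f]

# Simulated buggy output of gcsfs 0.6.1/0.6.2 walk()
buggy = [('path/to/parquet/folder',
          ['Year=2019', 'Year=2020'],
          ['', '_SUCCESS'])]

for root, dirs, files in clean_walk(buggy):
    print(root, dirs, files)
```

In real use one would wrap `gcsfs.GCSFileSystem().walk(...)` with `clean_walk` instead of the simulated list.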
Anything else we need to know?:
Reverting to gcsfs==0.6.0 seemed to solve this problem. As far as I tested, the problem occurs with versions 0.6.1 and 0.6.2.
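Per the report, pinning the last known-good release is a temporary workaround (a version pin, not a fix), e.g. in a requirements file:

```
gcsfs==0.6.0
```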
Environment:
- Dask version: 2.21.0
- GCSFS version: 0.6.2
- Python version: 3.7.6
- Operating System: Ubuntu 18.04
- Install method (conda, pip, source): pip
Issue Analytics
- State:
- Created 3 years ago
- Comments: 16 (9 by maintainers)
Top GitHub Comments
Can you check on gcsfs master, please?
@rjurney , please open a new issue with the specific case that you are seeing