Disabling loading consolidated Parquet `_metadata`
See original GitHub issue.

For large Parquet datasets, loading the consolidated `_metadata` file, especially when it is a remote file, can be quite slow. We should consider adding a flag to optionally disable reading `_metadata`. Initially I thought setting `gather_statistics=False` would do this, but that turned out not to be the case (at least for the pyarrow-dataset and fastparquet engines).

cc @rjzamora in case you have thoughts on this topic
Issue Analytics
- State:
- Created: 2 years ago
- Comments: 10 (9 by maintainers)

@iameskild that doesn’t seem directly related. It is an error coming from parsing the `_metadata` file, so having an option to not use that file (what this issue is about) would also help in your case as a workaround. But the fact that you get an error reading this file at all points to a bug with its own cause, for which I would recommend opening a separate issue.

Closing this issue, as an `ignore_metadata_file` option was added in https://github.com/dask/dask/pull/8034, and @rjzamora opened https://github.com/dask/dask/issues/8058 to discuss improving metadata processing.