Streamlit cannot display large dataframes; it should at least show the user something useful about this.
See the original GitHub issue: "Cannot display large pandas dataframes".
User example:
Occasionally I would do something stupid like `st.write(df)` only to get this error:

```
EXCEPTION! Message ForwardMsg exceeds maximum protobuf size of 2GB: 3085985846
```

The reason: trying to display a DataFrame with millions of rows.
Solution
MVP: Clarify what the limits are. Is there a limit on the number of rows? On file size?
User suggestion:
It would be better to detect this and just do `df.head()` internally and show a warning. At least this would not crash the Streamlit process and bust the cache.
Preferred solution: No upper limits (besides what fits into RAM).
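The suggested behavior above can be approximated today on the user side. Here is a minimal sketch of such a guard; the `truncate_for_display` helper and the 100k-row threshold are illustrative assumptions (Streamlit documents no official row limit; the hard limit is the 2 GB protobuf message size):

```python
import pandas as pd

# Assumed threshold for illustration only -- not an official Streamlit limit.
MAX_DISPLAY_ROWS = 100_000

def truncate_for_display(df: pd.DataFrame, max_rows: int = MAX_DISPLAY_ROWS):
    """Return (frame_to_display, was_truncated) so the caller can warn the
    user instead of crashing on an oversized ForwardMsg."""
    if len(df) > max_rows:
        return df.head(max_rows), True
    return df, False
```

In a Streamlit app this would be used as: `frame, truncated = truncate_for_display(df)`, then `st.warning(...)` if `truncated`, and finally `st.dataframe(frame)`.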
Issue Analytics
- State:
- Created 4 years ago
- Reactions: 3
- Comments: 8 (4 by maintainers)
Top Results From Across the Web
Scalability of streamlit for pandas
Yes. Large dataframes can slow down a Streamlit app! In general, displaying more than 100k elements can start to get sluggish. Please note...
Whether streamlit can handle Big Data Analysis - Random
We use ag-grid to display the data and we use Redis caching for our Dataframes that are generated via some Jupyter Notebooks.
After upgrade to the latest version now this error is showing up ...
After upgrading to the latest streamlit version, now when I run the function it displays the below error: ArrowInvalid: ('Could not convert ...
6 tips for improving your Streamlit app performance
Avoid downloading large, static models. The solution is simple: when pushing your Streamlit app to production, bring your model and other assets ...
stlite, a port of Streamlit to Wasm, powered by Pyodide - #21 ...
And, what's amazing, is it would be the client user's deployed filesystem, ... Can't make st.dataframe() work though on Stlite Playground, ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@qutang Just circling back to say we have an improvement just merged into the development branch that should allow nearly any size of video (or audio or image) to work on Streamlit! Hopefully it’ll be packed with the next release. Thanks for your patience.
Leaving this open since @kantuni is still working on the Dataframe aspects.