Support pandas 1.0
Pandas 1.0 removed some functionality that was previously deprecated, and some features seem to be broken. For example, trying to fit a FAMD gives the following error:
~/.pyenv/versions/3.8.2/envs/bioinformatics/lib/python3.8/site-packages/prince/one_hot.py in transform(self, X)
     29
     30     def transform(self, X):
---> 31         return pd.SparseDataFrame(
     32             data=super().transform(X),
     33             columns=self.column_names_,

TypeError: SparseDataFrame() takes no arguments
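For context, `pd.SparseDataFrame` was removed in pandas 1.0; the documented replacement is a regular `DataFrame` with sparse columns, built via `pd.DataFrame.sparse.from_spmatrix`. Below is a minimal sketch of how the failing `transform` could be adapted. It assumes, based on the traceback above, that the class subclasses scikit-learn's `OneHotEncoder` (whose `transform` returns a scipy sparse matrix) and that `column_names_` was stored during `fit`; it is not the actual patch from the library.

```python
import pandas as pd
from sklearn import preprocessing


class OneHotEncoder(preprocessing.OneHotEncoder):
    """Sketch of a pandas>=1.0 compatible transform.

    Assumes the parent transform returns a scipy sparse matrix and
    that fit() stored the expanded column names in `column_names_`,
    as suggested by the traceback above.
    """

    def transform(self, X):
        # pd.SparseDataFrame was removed in pandas 1.0; build a regular
        # DataFrame with sparse columns from the scipy sparse matrix instead.
        return pd.DataFrame.sparse.from_spmatrix(
            super().transform(X),
            columns=self.column_names_,
            index=X.index if hasattr(X, "index") else None,
        )
```

With a change along these lines, fitting a FAMD under pandas 1.0 should no longer raise the `TypeError`.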

Yeah, I've got to fix this…
It's working for me as well, thank you very much @MaxHalford!