Minimum dependency requirements for Windows wheels
I got the CI from imbalanced-learn screaming at me after installing the RC. The reason is that the wheel is built with numpy 1.14.5 while conda installed numpy 1.13.3. Since our stated requirement is only numpy >= 1.8.2, numpy is not upgraded when installing scikit-learn.
@jnothman @amueller What are your thoughts?
Steps to reproduce the error:
conda create --name sklearn-test python=3.5
conda install numpy scipy -y -q
pip install --pre scikit-learn
python -c "import sklearn"
pytest --pyargs imblearn --cov-report term-missing --cov=imblearn
============================= test session starts =============================
platform win32 -- Python 3.5.2, pytest-3.7.3, py-1.5.4, pluggy-0.7.1
rootdir: C:\projects\imbalanced-learn, inifile: setup.cfg
plugins: cov-2.5.1
collected 0 items / 123 errors
=================================== ERRORS ====================================
____________________ ERROR collecting imblearn/__init__.py ____________________
__init__.pxd:998: in numpy.import_array
???
E RuntimeError: module compiled against API version 0xc but this version of numpy is 0xb
During handling of the above exception, another exception occurred:
..\imblearn\__init__.py:35: in <module>
from .base import FunctionSampler
..\imblearn\base.py:14: in <module>
from sklearn.base import BaseEstimator
C:\Miniconda35-x64\lib\site-packages\sklearn\__init__.py:64: in <module>
from .base import clone
C:\Miniconda35-x64\lib\site-packages\sklearn\base.py:13: in <module>
from .utils.fixes import signature
C:\Miniconda35-x64\lib\site-packages\sklearn\utils\__init__.py:12: in <module>
from .murmurhash import murmurhash3_32
sklearn\utils\murmurhash.pyx:26: in init sklearn.utils.murmurhash
???
__init__.pxd:1000: in numpy.import_array
???
E ImportError: numpy.core.multiarray failed to import
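A quick way to see whether an environment will hit this error is to compare the installed NumPy against the NumPy the wheel was built with: the C ABI is forward-compatible (newer runtime NumPy is fine) but not backward-compatible. This is an illustrative sketch based on the versions in the report above — the `numpy_api_compatible` helper and the 1.14.5 default are assumptions, not scikit-learn API:

```python
import re

def numpy_api_compatible(installed, built_against="1.14.5"):
    """Return True if a wheel built against `built_against` can be
    imported with `installed` NumPy (newer runtime NumPy is fine,
    older is not)."""
    def ver(v):
        # tolerate suffixes like "1.14.5rc1"
        return tuple(int(re.match(r"\d+", part).group())
                     for part in v.split(".")[:3])
    return ver(installed) >= ver(built_against)

# The situation from the traceback: conda's 1.13.3 vs the wheel's 1.14.5
print(numpy_api_compatible("1.13.3"))  # -> False
print(numpy_api_compatible("1.14.5"))  # -> True
```

With conda's 1.13.3 this returns False, which is exactly the `RuntimeError: module compiled against API version 0xc but this version of numpy is 0xb` above.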
Issue Analytics
- Created: 5 years ago
- Comments: 19 (19 by maintainers)
Top GitHub Comments
Despite our stated requirements, it's not possible to build scikit-learn on Python 3.7 with numpy 1.14.0, because numpy 1.14.0 does not support Python 3.7. I would argue the same applies to the other Python versions (3.5, 3.6): the minimal buildable versions of numpy/scipy there are still to be determined, and they are unrelated to our documented minimum requirements.
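The per-Python floors this comment describes could be recorded explicitly. A hypothetical sketch — only the Python 3.7 entry follows from the statement above, 1.14.5 as its floor is an assumption (it is the wheel's build version), and the 3.5/3.6 floors are deliberately left undetermined:

```python
# Documented runtime minimum vs. the oldest NumPy each Python version
# can actually be *built* against. Values are illustrative placeholders
# except where noted; Windows may need an even higher floor.
DOCUMENTED_MIN_NUMPY = "1.8.2"

OLDEST_BUILDABLE_NUMPY = {
    "3.5": None,      # to be determined
    "3.6": None,      # to be determined
    "3.7": "1.14.5",  # numpy 1.14.0 does not support Python 3.7
}

def build_numpy_for(python_version):
    """Pick the NumPy pin for building a wheel on `python_version`."""
    floor = OLDEST_BUILDABLE_NUMPY.get(python_version)
    return floor if floor is not None else DOCUMENTED_MIN_NUMPY
```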
On top of that, Windows has some additional complications, which explains, for instance, why the minimum numpy requirement on conda-forge is higher on Windows than on Linux.
I don’t understand all the details; I am just saying that determining the minimum numpy/scipy versions for each Python version and OS is non-trivial (or at least time-consuming), and not necessarily related to the minimal numpy/scipy versions we list in the docs.
I merged MacPython/scikit-learn-wheels#11. Sorry for the original mess; I think I was sloppy with the numpy build version to work around a failing test on an old numpy in one entry of the build matrix. The new setup, with distinct numpy versions for building and testing, is better.
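The invariant behind "distinct versions of numpy for building and testing" is that the test NumPy must be at least as new as the build NumPy, never older. A small sketch of that check — the matrix entries are illustrative, not the actual MacPython configuration:

```python
# Hypothetical wheel build matrix: build against the oldest supported
# NumPy so the wheel runs on everything newer, then test on a newer one.
MATRIX = [
    {"python": "3.5", "build_numpy": "1.8.2",  "test_numpy": "1.13.3"},
    {"python": "3.6", "build_numpy": "1.8.2",  "test_numpy": "1.14.5"},
    {"python": "3.7", "build_numpy": "1.14.5", "test_numpy": "1.14.5"},
]

def ver(v):
    return tuple(int(p) for p in v.split("."))

def check_matrix(matrix):
    """Fail if any entry would reproduce the ABI error in this issue."""
    for entry in matrix:
        assert ver(entry["test_numpy"]) >= ver(entry["build_numpy"]), entry
    return True

print(check_matrix(MATRIX))  # -> True
```

An entry that built against 1.14.5 but tested on 1.13.3 would fail this check, which is precisely the mismatch reported at the top of this issue.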