
Scrapy install times out when using Pipenv

See original GitHub issue

It times out when I install Scrapy with Pipenv:

~/project/pycharm/ImoocSpider ᐅ pipenv install scrapy
Creating a virtualenv for this project…
Using /usr/local/opt/python/bin/python3.6 (3.6.5) to create virtualenv…
Already using interpreter /usr/local/opt/python/bin/python3.6
Using base prefix '/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6'
New python executable in /Users/gaowenfeng/.virtualenvs/ImoocSpider-BHhXYODD/bin/python3.6
Also creating executable in /Users/gaowenfeng/.virtualenvs/ImoocSpider-BHhXYODD/bin/python
Installing setuptools, pip, wheel...done.

Virtualenv location: /Users/gaowenfeng/.virtualenvs/ImoocSpider-BHhXYODD
Creating a Pipfile for this project…
Installing scrapy…
Collecting scrapy
  Using cached https://files.pythonhosted.org/packages/db/9c/cb15b2dc6003a805afd21b9b396e0e965800765b51da72fe17cf340b9be2/Scrapy-1.5.0-py2.py3-none-any.whl
Collecting cssselect>=0.9 (from scrapy)
  Using cached https://files.pythonhosted.org/packages/7b/44/25b7283e50585f0b4156960691d951b05d061abf4a714078393e51929b30/cssselect-1.0.3-py2.py3-none-any.whl
Collecting PyDispatcher>=2.0.5 (from scrapy)
Collecting parsel>=1.1 (from scrapy)
  Using cached https://files.pythonhosted.org/packages/bc/b4/2fd37d6f6a7e35cbc4c2613a789221ef1109708d5d4fb9fd5f6f721a43c9/parsel-1.4.0-py2.py3-none-any.whl
Collecting Twisted>=13.1.0 (from scrapy)
Collecting w3lib>=1.17.0 (from scrapy)
  Using cached https://files.pythonhosted.org/packages/37/94/40c93ad0cadac0f8cb729e1668823c71532fd4a7361b141aec535acb68e3/w3lib-1.19.0-py2.py3-none-any.whl
Collecting lxml (from scrapy)
  Using cached https://files.pythonhosted.org/packages/a4/7c/0c333ccdaa04628b4df46d36b8a700d7810ffecd1371de796e2403fe9380/lxml-4.2.1-cp36-cp36m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting queuelib (from scrapy)
  Using cached https://files.pythonhosted.org/packages/4c/85/ae64e9145f39dd6d14f8af3fa809a270ef3729f3b90b3c0cf5aa242ab0d4/queuelib-1.5.0-py2.py3-none-any.whl
Collecting service-identity (from scrapy)
  Using cached https://files.pythonhosted.org/packages/29/fa/995e364220979e577e7ca232440961db0bf996b6edaf586a7d1bd14d81f1/service_identity-17.0.0-py2.py3-none-any.whl
Collecting six>=1.5.2 (from scrapy)
  Using cached https://files.pythonhosted.org/packages/67/4b/141a581104b1f6397bfa78ac9d43d8ad29a7ca43ea90a2d863fe3056e86a/six-1.11.0-py2.py3-none-any.whl
Collecting pyOpenSSL (from scrapy)
  Using cached https://files.pythonhosted.org/packages/96/af/9d29e6bd40823061aea2e0574ccb2fcf72bfd6130ce53d32773ec375458c/pyOpenSSL-18.0.0-py2.py3-none-any.whl
Collecting incremental>=16.10.1 (from Twisted>=13.1.0->scrapy)
  Using cached https://files.pythonhosted.org/packages/f5/1d/c98a587dc06e107115cf4a58b49de20b19222c83d75335a192052af4c4b7/incremental-17.5.0-py2.py3-none-any.whl
Collecting constantly>=15.1 (from Twisted>=13.1.0->scrapy)
  Using cached https://files.pythonhosted.org/packages/b9/65/48c1909d0c0aeae6c10213340ce682db01b48ea900a7d9fce7a7910ff318/constantly-15.1.0-py2.py3-none-any.whl
Collecting zope.interface>=4.4.2 (from Twisted>=13.1.0->scrapy)
Collecting Automat>=0.3.0 (from Twisted>=13.1.0->scrapy)
  Using cached https://files.pythonhosted.org/packages/a3/86/14c16bb98a5a3542ed8fed5d74fb064a902de3bdd98d6584b34553353c45/Automat-0.7.0-py2.py3-none-any.whl
Collecting hyperlink>=17.1.1 (from Twisted>=13.1.0->scrapy)
  Using cached https://files.pythonhosted.org/packages/a7/b6/84d0c863ff81e8e7de87cff3bd8fd8f1054c227ce09af1b679a8b17a9274/hyperlink-18.0.0-py2.py3-none-any.whl
Collecting pyasn1-modules (from service-identity->scrapy)
  Using cached https://files.pythonhosted.org/packages/e9/51/bcd96bf6231d4b2cc5e023c511bee86637ba375c44a6f9d1b4b7ad1ce4b9/pyasn1_modules-0.2.1-py2.py3-none-any.whl
Collecting attrs (from service-identity->scrapy)
  Using cached https://files.pythonhosted.org/packages/41/59/cedf87e91ed541be7957c501a92102f9cc6363c623a7666d69d51c78ac5b/attrs-18.1.0-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity->scrapy)
  Using cached https://files.pythonhosted.org/packages/a0/70/2c27740f08e477499ce19eefe05dbcae6f19fdc49e9e82ce4768be0643b9/pyasn1-0.4.3-py2.py3-none-any.whl
Collecting cryptography>=2.2.1 (from pyOpenSSL->scrapy)
  Using cached https://files.pythonhosted.org/packages/40/87/acdcf84ce6d25a7db1c113f4b9b614fd8d707b7ab56fbf17cf18cd26a627/cryptography-2.2.2-cp34-abi3-macosx_10_6_intel.whl
Requirement already satisfied: setuptools in /Users/gaowenfeng/.virtualenvs/ImoocSpider-BHhXYODD/lib/python3.6/site-packages (from zope.interface>=4.4.2->Twisted>=13.1.0->scrapy) (39.2.0)
Collecting idna>=2.5 (from hyperlink>=17.1.1->Twisted>=13.1.0->scrapy)
  Using cached https://files.pythonhosted.org/packages/4b/2a/0276479a4b3caeb8a8c1af2f8e4355746a97fab05a372e4a2c6a6b876165/idna-2.7-py2.py3-none-any.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=2.2.1->pyOpenSSL->scrapy)
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting cffi>=1.7; platform_python_implementation != "PyPy" (from cryptography>=2.2.1->pyOpenSSL->scrapy)
  Using cached https://files.pythonhosted.org/packages/8e/be/40b1bc2c3221acdefeb9dab6773d43cda7543ed0d8c8df8768f05af2d01e/cffi-1.11.5-cp36-cp36m-macosx_10_6_intel.whl
Collecting pycparser (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=2.2.1->pyOpenSSL->scrapy)
Installing collected packages: cssselect, PyDispatcher, six, w3lib, lxml, parsel, incremental, constantly, zope.interface, attrs, Automat, idna, hyperlink, Twisted, queuelib, asn1crypto, pycparser, cffi, cryptography, pyOpenSSL, pyasn1, pyasn1-modules, service-identity, scrapy
Successfully installed Automat-0.7.0 PyDispatcher-2.0.5 Twisted-18.4.0 asn1crypto-0.24.0 attrs-18.1.0 cffi-1.11.5 constantly-15.1.0 cryptography-2.2.2 cssselect-1.0.3 hyperlink-18.0.0 idna-2.7 incremental-17.5.0 lxml-4.2.1 parsel-1.4.0 pyOpenSSL-18.0.0 pyasn1-0.4.3 pyasn1-modules-0.2.1 pycparser-2.18 queuelib-1.5.0 scrapy-1.5.0 service-identity-17.0.0 six-1.11.0 w3lib-1.19.0 zope.interface-4.5.0

Adding scrapy to Pipfile's [packages]…
Pipfile.lock not found, creating…
Locking [dev-packages] dependencies…
Locking [packages] dependencies…

But it works when I install Scrapy with the system pip3:

~ ᐅ pip3 install scrapy
Collecting scrapy
  Using cached https://files.pythonhosted.org/packages/db/9c/cb15b2dc6003a805afd21b9b396e0e965800765b51da72fe17cf340b9be2/Scrapy-1.5.0-py2.py3-none-any.whl
Requirement already satisfied: parsel>=1.1 in /usr/local/lib/python3.6/site-packages (from scrapy) (1.4.0)
Requirement already satisfied: queuelib in /usr/local/lib/python3.6/site-packages (from scrapy) (1.5.0)
Requirement already satisfied: cssselect>=0.9 in /usr/local/lib/python3.6/site-packages (from scrapy) (1.0.3)
Requirement already satisfied: lxml in /usr/local/lib/python3.6/site-packages (from scrapy) (4.2.1)
Requirement already satisfied: pyOpenSSL in /usr/local/lib/python3.6/site-packages (from scrapy) (18.0.0)
Requirement already satisfied: six>=1.5.2 in /usr/local/lib/python3.6/site-packages (from scrapy) (1.11.0)
Requirement already satisfied: w3lib>=1.17.0 in /usr/local/lib/python3.6/site-packages (from scrapy) (1.19.0)
Requirement already satisfied: PyDispatcher>=2.0.5 in /usr/local/lib/python3.6/site-packages (from scrapy) (2.0.5)
Requirement already satisfied: Twisted>=13.1.0 in /usr/local/lib/python3.6/site-packages (from scrapy) (18.4.0)
Requirement already satisfied: service-identity in /usr/local/lib/python3.6/site-packages (from scrapy) (17.0.0)
Requirement already satisfied: cryptography>=2.2.1 in /usr/local/lib/python3.6/site-packages (from pyOpenSSL->scrapy) (2.2.2)
Requirement already satisfied: incremental>=16.10.1 in /usr/local/lib/python3.6/site-packages (from Twisted>=13.1.0->scrapy) (17.5.0)
Requirement already satisfied: constantly>=15.1 in /usr/local/lib/python3.6/site-packages (from Twisted>=13.1.0->scrapy) (15.1.0)
Requirement already satisfied: zope.interface>=4.4.2 in /usr/local/lib/python3.6/site-packages (from Twisted>=13.1.0->scrapy) (4.5.0)
Requirement already satisfied: Automat>=0.3.0 in /usr/local/lib/python3.6/site-packages (from Twisted>=13.1.0->scrapy) (0.7.0)
Requirement already satisfied: hyperlink>=17.1.1 in /usr/local/lib/python3.6/site-packages (from Twisted>=13.1.0->scrapy) (18.0.0)
Requirement already satisfied: pyasn1 in /usr/local/lib/python3.6/site-packages (from service-identity->scrapy) (0.4.3)
Requirement already satisfied: attrs in /usr/local/lib/python3.6/site-packages (from service-identity->scrapy) (18.1.0)
Requirement already satisfied: pyasn1-modules in /usr/local/lib/python3.6/site-packages (from service-identity->scrapy) (0.2.1)
Requirement already satisfied: idna>=2.1 in /usr/local/lib/python3.6/site-packages (from cryptography>=2.2.1->pyOpenSSL->scrapy) (2.7)
Requirement already satisfied: asn1crypto>=0.21.0 in /usr/local/lib/python3.6/site-packages (from cryptography>=2.2.1->pyOpenSSL->scrapy) (0.24.0)
Requirement already satisfied: cffi>=1.7; platform_python_implementation != "PyPy" in /usr/local/lib/python3.6/site-packages (from cryptography>=2.2.1->pyOpenSSL->scrapy) (1.11.5)
Requirement already satisfied: setuptools in /usr/local/lib/python3.6/site-packages (from zope.interface>=4.4.2->Twisted>=13.1.0->scrapy) (39.0.1)
Requirement already satisfied: pycparser in /usr/local/lib/python3.6/site-packages (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=2.2.1->pyOpenSSL->scrapy) (2.18)
Installing collected packages: scrapy
Successfully installed scrapy-1.5.0

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Comments:5 (3 by maintainers)

Top GitHub Comments

1 reaction
techalchemy commented, Jun 26, 2018

Sorry for the trouble; you can modify the timeout with the PIPENV_TIMEOUT environment variable. Let us know if you run into any more difficulty.
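
For reference, a minimal sketch of applying that suggestion; the values shown are arbitrary examples, and PIPENV_INSTALL_TIMEOUT (the per-package install limit referenced in the Stack Overflow result below) is included on the assumption that either step may be the one hitting the limit:

export PIPENV_TIMEOUT=1800          # the variable the maintainer mentions above; raises Pipenv's own operation timeout
export PIPENV_INSTALL_TIMEOUT=1800  # per-package install limit; defaults to 900 seconds per the Stack Overflow result below
pipenv install scrapy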

0 reactions
MarkGao11520 commented, Jun 24, 2018
pipenv shell
pipenv install scrapy

Thank you for your reply. I completed the installation this way.
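
A quick check (a suggested verification, not part of the original thread) that the package really landed in the project's virtualenv:

pipenv run scrapy version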

Read more comments on GitHub

Top Results From Across the Web

  • pipenv install failing due to timeout - Stack Overflow
    PIPENV_INSTALL_TIMEOUT Max number of seconds to wait for package installation. Defaults to 900 (15 minutes), a very long arbitrary time. If it's ...

  • Installation guide — Scrapy 2.7.1 documentation
    TL;DR: We recommend installing Scrapy inside a virtual environment on all platforms. Python packages can be installed either globally (a.k.a ...

  • Scraping data with Scrapy and PostgreSQL - JJ's World
    Objective. Retrieve data from website using Scrapy. Store results in a PostgreSQL database. Prerequisites. pipenv installed (or ...

  • Dominik Schwabe / wcm_group1 · GitLab
    Run pipenv install in the folders with a Pipfile before running pipenv run ... crawl only one site with scrapy crawl <spider> (...

  • scrapy-splash - PyPI
    Installation. Install scrapy-splash using pip: $ pip install scrapy-splash. Scrapy-Splash uses Splash HTTP API, so you also need ...
