SSL problem when installing pip
See original GitHub issue
I am installing TensorFlowOnSpark on a Google Dataproc cluster that already has Python and OpenSSL installed. I followed the guidance for a YARN cluster and hit the following errors when running get-pip.py:
pip is configured with locations that require TLS/SSL, however the ssl module in Python is not available.
Collecting pip
Could not fetch URL https://pypi.python.org/simple/pip/: There was a problem confirming the ssl certificate: Can't connect to HTTPS URL because the SSL module is not available. - skipping
Could not find a version that satisfies the requirement pip (from versions: )
No matching distribution found for pip
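This error means the interpreter running get-pip.py was compiled without linking against OpenSSL, so the ssl module simply does not exist in that Python. A minimal, non-Dataproc-specific sketch to confirm which situation you are in:

```python
# Quick diagnostic: was this Python built with SSL support?
try:
    import ssl
    print("ssl module present, linked against:", ssl.OPENSSL_VERSION)
except ImportError:
    # Typical remedy: install the OpenSSL development headers
    # (openssl-devel on RHEL/CentOS, libssl-dev on Debian/Ubuntu)
    # and rebuild Python from source.
    print("ssl module missing: Python was built without OpenSSL headers")
```

If the import fails, rebuilding Python after installing the OpenSSL development package is the usual remedy; pointing pip at mirrors or proxies will not help, because this interpreter cannot speak HTTPS at all.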
Issue Analytics
- State:
- Created 6 years ago
- Comments:16 (7 by maintainers)
Top Results From Across the Web

pip install fails with "connection error: [SSL - Stack Overflow
pip install is failing no matter the package. For example, > pip install scrapy also results in the SSL error. Vanilla install of...
Read more >

How to fix - Python pip install connection error SSL ... - Jhooq
1. Root Cause of the problem · 2. Add --trusted-host param into installation command · 3. Fix the error by adding host to...
Read more >

PIP connection Error : SSL CERTIFICATE VERIFY FAILED
The most common issue when installing a Python package on a company network is failure to verify the SSL certificate. Sometimes the company blocks some...
Read more >

Python - pip install SSL certificate error - Jean Snyman
How to ignore the SSL certificate errors. When you see an error like this, it's most likely that you are behind a proxy...
Read more >

Pip Install - SSL Error: Certificate_Verify_Failed - ShellHacks
3 Replies to "Pip Install - SSL Error: Certificate_Verify_Failed" ... In my case, the root cause turned out to be an incorrect system...
Read more >
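Several of the results above suggest the `--trusted-host` flag. Note this only helps when the SSL failure is certificate *verification* (for example, a corporate proxy re-signing TLS traffic); it cannot fix a Python with no ssl module at all. A hedged sketch, using the two standard PyPI hosts:

```shell
# Tell pip to skip certificate verification for the PyPI hosts only.
# 'pip' here is the pip belonging to the Python you are fixing.
pip install \
    --trusted-host pypi.org \
    --trusted-host files.pythonhosted.org \
    --upgrade pip
```

Prefer fixing the proxy's CA certificate (via `pip config set global.cert`) over `--trusted-host` where possible, since the latter disables verification for those hosts.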
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Yes, we require that each executor only runs one task at a time (and no dynamic allocation). The exact configuration depends on your Spark version/setup, but you might be able to try --executor-cores 1.
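As a hedged sketch of the maintainer's suggestion (the flag names are standard Spark configuration, but the application script and resource sizes are placeholders), a submission that pins one task per executor and disables dynamic allocation might look like:

```shell
# Placeholder app name; sized so each executor can run exactly one task.
spark-submit \
    --conf spark.dynamicAllocation.enabled=false \
    --executor-cores 1 \
    --conf spark.task.cpus=1 \
    your_tfos_app.py
```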
@leewyang I think I found the reason. In 1-worker, 1-PS mode there are 2 tasks in the same executor, so they have the same ppid and only one node gets created; as a result we cannot start both the PS and the worker. By limiting each executor to one task, the problem is solved.