Packaged script execute permissions lost from v46.1.0 onwards
We use pipenv and, as a result, are forced onto the latest version of setuptools, since pipenv internally pins it to the latest release.
We observed that executable scripts shipped as part of Python packages lose their execute flag from setuptools v46.1.0 onwards. The example below demonstrates the bug with pyspark, which includes a number of executable scripts in its package.
The issue was introduced by commit https://github.com/pypa/setuptools/commit/7843688bc33dd4e13e10130bc49da4c290fe7d7f, where copy_file() is now called with preserve_mode=False. The changelog states the reason for the change as:
> Prevent keeping files mode for package_data build. It may break a build if user’s package data has read only flag.
Unfortunately, this has the side effect of stripping all execute permissions from files, meaning users can’t use the scripts “out of the box” - they have to set the execute permissions manually.
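For reference, the difference comes down to the preserve_mode argument of distutils' copy_file(), which setuptools uses when copying package data. The snippet below is a minimal, standalone sketch (the file names are invented) showing that a copy made with preserve_mode=False does not carry the source file's execute bit across:

```python
# Minimal sketch of the behaviour change in isolation: distutils' copy_file()
# only propagates the source file's mode bits when preserve_mode is true.
import os
import stat
import tempfile

from distutils.file_util import copy_file

with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "spark-submit")
    with open(src, "w") as f:
        f.write("#!/bin/sh\necho hello\n")
    os.chmod(src, 0o755)  # the source script is executable

    preserved = os.path.join(tmp, "preserved")
    stripped = os.path.join(tmp, "stripped")
    copy_file(src, preserved, preserve_mode=True)   # setuptools < 46.1.0 behaviour
    copy_file(src, stripped, preserve_mode=False)   # setuptools >= 46.1.0 behaviour

    for path in (preserved, stripped):
        print(path, stat.filemode(os.stat(path).st_mode))
```

On a typical umask the preserved copy keeps -rwxr-xr-x while the stripped one falls back to the default non-executable mode, mirroring the expected vs. actual results shown further below.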
Demonstration Script
```bash
#!/bin/bash
set -eu

wget -nc https://files.pythonhosted.org/packages/9a/5a/271c416c1c2185b6cb0151b29a91fff6fcaed80173c8584ff6d20e46b465/pyspark-2.4.5.tar.gz

for version in "46.0.0" "46.1.0"; do
    rm -rf .venv pyspark-2.4.5
    tar xzf pyspark-2.4.5.tar.gz
    virtualenv -q -p /usr/bin/python3.7 .venv
    . .venv/bin/activate
    python3 -m pip install --upgrade "setuptools==${version}" wheel
    pushd pyspark-2.4.5
    python3 setup.py -q bdist_wheel
    pushd dist
    unzip -q pyspark-2.4.5-py2.py3-none-any.whl
    echo -e "\n\n${version}: Here are the permissions for spark-submit:\n"
    ls -l ./pyspark/bin/spark-submit
    echo -e "\n\n"
    popd
    popd
done
```
Expected result
-rwxr-xr-x 1 dave dave 1040 Feb 2 19:35 ./pyspark/bin/spark-submit
Actual Result
-rw-rw-r-- 1 dave dave 1040 Feb 2 19:35 ./pyspark/bin/spark-submit
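Until this is addressed in setuptools, one possible stop-gap for package authors is to restore the execute bits themselves after the build_py step. The following is only a sketch of that idea, not an official fix; the BuildPyWithExecBits class and the EXECUTABLES list are invented for illustration:

```python
# A possible stop-gap for a project's setup.py (illustrative only): subclass
# setuptools' build_py command and re-apply the execute bit to known scripts
# after they have been copied into the build tree.
import os
import stat

from setuptools import setup
from setuptools.command.build_py import build_py


class BuildPyWithExecBits(build_py):
    """build_py variant that restores the execute bit on packaged scripts."""

    # Paths relative to the build directory; this list is an assumption made
    # for the pyspark example above.
    EXECUTABLES = ["pyspark/bin/spark-submit"]

    def run(self):
        super().run()
        for rel_path in self.EXECUTABLES:
            path = os.path.join(self.build_lib, rel_path)
            if os.path.exists(path):
                mode = os.stat(path).st_mode
                # Add u+x, g+x, o+x on top of whatever mode the copy produced.
                os.chmod(path, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)


setup(
    name="example-package",   # placeholder metadata for the sketch
    version="0.0.0",
    cmdclass={"build_py": BuildPyWithExecBits},
)
```

Alternatively, pinning setuptools below 46.1.0 in the build environment avoids the regression entirely, as the demonstration script above shows.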
Top GitHub Comments
I’m facing this issue too on macOS 10.15.3 and Pipenv 2018.11.26. This code was working fine with setuptools 46.0.0 but started failing with 46.1.0.
I also filed SPARK-31231 on the Spark side to track this issue.