Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

onnx build script always depends on python2

See original GitHub issue

OS: Fedora 27

Build steps:

dnf install -y curl eigen3-devel gcc gcc-c++ zlib-devel cmake make ninja-build git python3-devel python3-numpy libunwind icu aria2 rsync python3-setuptools python3-wheel
export ONNX_ML=1
aria2c --download-result=hide -d /tmp/src https://github.com/onnx/onnx/archive/v1.2.2.tar.gz
tar -xf /tmp/src/onnx-1.2.2.tar.gz -C /tmp/src
cd /tmp/src/onnx-1.2.2
python3 setup.py bdist_wheel

Output:

[  5%] Running C++ protocol buffer compiler on /tmp/src/onnx-1.2.2/.setuptools-cmake-build/onnx/onnx-ml.proto
/usr/bin/env: ‘python’: No such file or directory
--mypy_out: protoc-gen-mypy: Plugin failed with status code 127.
gmake[2]: *** [CMakeFiles/gen_onnx_proto.dir/build.make:62: onnx/onnx-ml.pb.cc] Error 1
gmake[1]: *** [CMakeFiles/Makefile2:100: CMakeFiles/gen_onnx_proto.dir/all] Error 2
gmake: *** [Makefile:130: all] Error 2
Traceback (most recent call last):
  File "setup.py", line 327, in <module>
    'backend-test-tools = onnx.backend.test.cmd_tools:main',
  File "/usr/lib/python3.6/site-packages/setuptools/__init__.py", line 129, in setup
    return distutils.core.setup(**attrs)
  File "/usr/lib64/python3.6/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/usr/lib64/python3.6/distutils/dist.py", line 955, in run_commands
    self.run_command(cmd)
  File "/usr/lib64/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/usr/lib/python3.6/site-packages/wheel/bdist_wheel.py", line 199, in run
    self.run_command('build')
  File "/usr/lib64/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib64/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/usr/lib64/python3.6/distutils/command/build.py", line 135, in run
    self.run_command(cmd_name)
  File "/usr/lib64/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib64/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "setup.py", line 202, in run
    self.run_command('cmake_build')
  File "/usr/lib64/python3.6/distutils/cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "/usr/lib64/python3.6/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "setup.py", line 196, in run
    subprocess.check_call(build_args)
  File "/usr/lib64/python3.6/subprocess.py", line 291, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/cmake3', '--build', '.', '--', '-j', '8']' returned non-zero exit status 2.

I haven’t installed Python 2; I only have Python 3. Therefore I cannot build ONNX.
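The root cause is visible in the log: protoc runs ONNX’s mypy code-generation plugin, whose #!/usr/bin/env python shebang looks for an unversioned python, and env exits with status 127 when none exists. A minimal sketch to confirm this on the build machine (assuming the plugin script sits at tools/protoc-gen-mypy.py in the extracted source tree; adjust the path if it differs):

cd /tmp/src/onnx-1.2.2
head -n 1 tools/protoc-gen-mypy.py   # expected shebang: #!/usr/bin/env python
command -v python || echo "no unversioned python on PATH, so env fails with 127"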

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Reactions: 1
  • Comments: 15 (8 by maintainers)

Top GitHub Comments

2 reactions
xental commented, Sep 14, 2018

@cjtang Solved this by cp /usr/bin/python3 /usr/bin/python

1 reaction
houseroad commented, Sep 19, 2018

or use ln instead of cp
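For anyone hitting this today: a symlink is cleaner than copying the interpreter binary, and you can avoid touching /usr/bin altogether by putting a one-off shim directory first on PATH just for the build. A minimal sketch of both variants (the /tmp/pyshim path is an arbitrary choice):

sudo ln -sf /usr/bin/python3 /usr/bin/python   # symlink variant of the fix above

mkdir -p /tmp/pyshim                           # or: a per-build PATH shim
ln -sf /usr/bin/python3 /tmp/pyshim/python
PATH=/tmp/pyshim:$PATH python3 setup.py bdist_wheel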

Read more comments on GitHub >

Top Results From Across the Web

An empirical approach to speedup your BERT inference with ...
TorchScript is a way to create serializable and optimizable models ... As always, this depends on your hardware, a V100 is faster than...
Read more >
Accelerate and simplify Scikit-learn model inference with ...
This blog post introduces how to operationalize scikit-learn with ONNX, sklearn-onnx, and ONNX Runtime.
Read more >
Build for inferencing | onnxruntime
By default, ONNX Runtime's build script only generate bits for the CPU ARCH that the build machine has. If you want to do...
Read more >
Tutorial 8: Pytorch to ONNX (Experimental)
When the input model has custom op such as RoIAlign and if you want to verify the exported ONNX model, you may have...
Read more >
How to extract layer shape and type from ONNX / PyTorch?
I know how to do this using Netron by manually going to each layer, but I want to automate the process and hence...
Read more >
