
Standalone pytorch build on windows 10 fails silently on imports after execution

See original GitHub issue
  • Nuitka version, full Python version, flavor, OS, etc. as output by this command (it does more than you think, and we are adding more all the time):

    python -m nuitka --version

0.7.6
Commercial: None
Python: 3.7.9 (tags/v3.7.9:13c94747c7, Aug 17 2020, 18:58:18) [MSC v.1900 64 bit (AMD64)]
Flavor: Unknown
Executable: C:\Users\begemot\build\build_test\venv\Scripts\python.exe
OS: Windows
Arch: x86_64
WindowsRelease: 10

The latest 0.8rc7 also gives the same results.

  • How did you install Nuitka and Python

Python was installed using the official installer downloaded from python.org. Nuitka was installed in a virtualenv using pip install nuitka for 0.7.6 and pip install -U "https://github.com/Nuitka/Nuitka/archive/develop.zip" for 0.8rc7.

  • The specific PyPI names and versions

    python -m pip freeze

numpy==1.21.5
torch==1.11.0
  • Many times when you get an error from Nuitka, your setup may be special

“Hello world” works perfectly. Small sample examples using numpy also work as expected (see the numpy sketch after the torch example below).

import torch

print("hello torch")
  • Provide in your issue the Nuitka options used
python -m nuitka --remove-output --plugin-enable=torch --plugin-enable=numpy --plugin-enable=pylint-warnings --windows-force-stdout-spec=stdout.log --windows-force-stderr-spec=stderr.log --standalone main_torch_min.py
  • Note if this is a regression

This is my first attempt at using Nuitka, but from the docs and existing issues it seemed that building torch worked fine.

  • Error I get

Nuitka builds the “hello torch” example without any errors.

There is no error message when running the compiled file either; the EXE simply fails silently and nothing is printed. An “Application Error 1000” is reported in the Windows event log, which also contains no useful information. stderr.log and stdout.log are empty as well. Compiling without the --windows-force-*-spec options gives the same result.

After turning on PYTHONVERBOSE=3 and PYTHONDEBUG=1 in the environment, it appears that execution stops while importing _bootlocale.py.
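
The issue does not show exactly how those variables were set; a minimal sketch of the reproduction, assuming the dist folder produced by the build above, is:

import os
import subprocess

# run the compiled exe with verbose import tracing, mirroring
# PYTHONVERBOSE=3 and PYTHONDEBUG=1 being set in the shell environment
exe = r"C:\Users\begemot\build\build_test\main_torch_min.dist\main_torch_min.exe"
env = dict(os.environ, PYTHONVERBOSE="3", PYTHONDEBUG="1")
result = subprocess.run([exe], env=env, capture_output=True, text=True)
print(result.stdout)
print(result.stderr)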

Full log

(venv) PS C:\Users\begemot\build\build_test\main_torch_min.dist> .\main_torch_min.exe
import _frozen_importlib # frozen
import _imp # builtin
import '_thread' # <class '_frozen_importlib.BuiltinImporter'>
import '_warnings' # <class '_frozen_importlib.BuiltinImporter'>
import '_weakref' # <class '_frozen_importlib.BuiltinImporter'>
# installing zipimport hook
import 'zipimport' # <class '_frozen_importlib.BuiltinImporter'>
# installed zipimport hook
import '_frozen_importlib_external' # <class '_frozen_importlib.FrozenImporter'>
import '_io' # <class '_frozen_importlib.BuiltinImporter'>
import 'marshal' # <class '_frozen_importlib.BuiltinImporter'>
import 'nt' # <class '_frozen_importlib.BuiltinImporter'>
import _thread # previously loaded ('_thread')
import '_thread' # <class '_frozen_importlib.BuiltinImporter'>
import _weakref # previously loaded ('_weakref')
import '_weakref' # <class '_frozen_importlib.BuiltinImporter'>
import 'winreg' # <class '_frozen_importlib.BuiltinImporter'>
import '_codecs' # <class '_frozen_importlib.BuiltinImporter'>
import 'codecs' # <class '_frozen_importlib.FrozenImporter'>
import 'encodings.aliases' # <class '_frozen_importlib.FrozenImporter'>
import 'encodings' # <class '_frozen_importlib.FrozenImporter'>
import 'encodings.utf_8' # <class '_frozen_importlib.FrozenImporter'>
import '_signal' # <class '_frozen_importlib.BuiltinImporter'>
import 'encodings.latin_1' # <class '_frozen_importlib.FrozenImporter'>
import '_abc' # <class '_frozen_importlib.BuiltinImporter'>
import 'abc' # <class '_frozen_importlib.FrozenImporter'>
import 'io' # <class '_frozen_importlib.FrozenImporter'>
# trying C:\Users\begemot\build\build_test\main_torch_min.dist\_bootlocale.cp37-win_amd64.pyd
# trying C:\Users\begemot\build\build_test\main_torch_min.dist\_bootlocale.pyd
# trying C:\Users\begemot\build\build_test\main_torch_min.dist\_bootlocale.py
# trying C:\Users\begemot\build\build_test\main_torch_min.dist\_bootlocale.pyw
# trying C:\Users\begemot\build\build_test\main_torch_min.dist\_bootlocale.pyc
# trying C:\Users\begemot\build\BUILD_~1\MAIN_T~2.DIS\_bootlocale.cp37-win_amd64.pyd
# trying C:\Users\begemot\build\BUILD_~1\MAIN_T~2.DIS\_bootlocale.pyd
# trying C:\Users\begemot\build\BUILD_~1\MAIN_T~2.DIS\_bootlocale.py
# trying C:\Users\begemot\build\BUILD_~1\MAIN_T~2.DIS\_bootlocale.pyw
# trying C:\Users\begemot\build\BUILD_~1\MAIN_T~2.DIS\_bootlocale.pyc
import 'encodings.ascii' # <class '_frozen_importlib.FrozenImporter'>
# trying C:\Users\begemot\build\build_test\main_torch_min.dist\_bootlocale.cp37-win_amd64.pyd
# trying C:\Users\begemot\build\build_test\main_torch_min.dist\_bootlocale.pyd
# trying C:\Users\begemot\build\build_test\main_torch_min.dist\_bootlocale.py
# trying C:\Users\begemot\build\build_test\main_torch_min.dist\_bootlocale.pyw
# trying C:\Users\begemot\build\build_test\main_torch_min.dist\_bootlocale.pyc
# trying C:\Users\begemot\build\BUILD_~1\MAIN_T~2.DIS\_bootlocale.cp37-win_amd64.pyd
# trying C:\Users\begemot\build\BUILD_~1\MAIN_T~2.DIS\_bootlocale.pyd
# trying C:\Users\begemot\build\BUILD_~1\MAIN_T~2.DIS\_bootlocale.py
# trying C:\Users\begemot\build\BUILD_~1\MAIN_T~2.DIS\_bootlocale.pyw
# trying C:\Users\begemot\build\BUILD_~1\MAIN_T~2.DIS\_bootlocale.pyc
(venv) PS C:\Users\begemot\build\build_test\main_torch_min.dist>

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 8 (2 by maintainers)

Top GitHub Comments

1 reaction
lup- commented, Apr 5, 2022

As a workaround I tried to include torch as a source-code module with these minimal options:

python -m nuitka ^
       --mingw64 ^
       --nofollow-import-to=torch ^
       --include-package=numpy.core.multiarray ^
       --include-data-dir=C:\Users\begemot\build\build_test\venv\Lib\site-packages\torch=torch ^
       --standalone ^
       main_torch_min.py

This works smoothly, but I had to import the multiprocessing module explicitly in my example (--include-package and --include-module skipped it because it looked “unused” when torch is not followed).

import multiprocessing
import torch

print("hello torch")
0 reactions
kayhayen commented, May 24, 2022

Latest Torch is supposed to work with the plugin activated on 0.8.1

Read more comments on GitHub >

Top Results From Across the Web

PyTorch compiled from source for Windows is failing when ...
Hello, I am getting this error when compiling PyTorch from source for Windows 10. Since my GPU (GTX Titan Black) has compute capability...
Read more >
How can I fix this pytorch error on Windows ...
Try to install PyTorch using pip: First create a conda environment using: conda create -n env_pytorch python=3.6 ; Now install PyTorch using pip:...
Read more >
Deep Learning with PyTorch
Our goal with PyTorch was to build the most flexible framework possible to express deep learning algorithms. We executed with focus and had...
Read more >
PyInstaller Documentation - Read the Docs
For platforms other than Windows, GNU/Linux and macOS, you must first build the bootloader for your platform: see Building the Bootloader. After the ......
Read more >
Torch Script — PyTorch master documentation - API Manual
Tracing a function will produce a ScriptModule with a single forward method that implements that function, and that contains no parameters. Example: import...
Read more >
