Failed to compile from sources
Environment:
- System: Ubuntu 20.04
- gcc: 9.3.0
- python: 3.8
- pytorch: 1.9.1+cpu
- ipex: master
I followed the README instructions to compile ipex from source, but the build failed. I also tried building ipex against torch-1.10+cpu, and that failed too. The error message is as follows:
/root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/cpu/BatchNorm.cpp: In function ‘at::Tensor torch_ipex::autocast::frozen_batch_norm(const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&, const at::Tensor&)’:
/root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/cpu/BatchNorm.cpp:326:66: error: ‘AutocastCPU’ is not a member of ‘c10::DispatchKey’; did you mean ‘AutocastCUDA’?
326 | c10::impl::ExcludeDispatchKeyGuard no_autocastCPU(DispatchKey::AutocastCPU);
| ^~~~~~~~~~~
| AutocastCUDA
In file included from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/jit/runtime/operator.h:13,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/jit/ir/ir.h:7,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/jit/api/function_impl.h:4,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/jit/api/method.h:5,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/jit/api/object.h:6,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/jit/frontend/tracer.h:9,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/autograd/generated/variable_factories.h:12,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include/torch/types.h:7,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader_options.h:4,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/base.h:3,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader/stateful.h:3,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include/torch/data/dataloader.h:3,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include/torch/data.h:3,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include/torch/all.h:8,
from /root/anaconda3/lib/python3.8/site-packages/torch/include/torch/extension.h:4,
from /root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/cpu/BatchNorm.cpp:1:
/root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/cpu/BatchNorm.cpp: At global scope:
/root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/cpu/BatchNorm.cpp:336:32: error: ‘AutocastCPU’ is not a member of ‘c10::DispatchKey’; did you mean ‘AutocastCUDA’?
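For context: the torch 1.9.1 headers do not define c10::DispatchKey::AutocastCPU (hence the compiler's suggestion of AutocastCUDA), while the IPEX master branch guards its CPU ops with that key. Below is a minimal sketch of the pattern the failing line uses, with a hypothetical function name; it assumes PyTorch 1.10+ headers, where the CPU autocast key is available.

```cpp
// Minimal sketch of the pattern at BatchNorm.cpp:326 (hypothetical function;
// assumes PyTorch >= 1.10 headers, where c10::DispatchKey::AutocastCPU exists).
#include <iostream>
#include <torch/torch.h>
#include <c10/core/impl/LocalDispatchKeySet.h>

at::Tensor run_without_cpu_autocast(const at::Tensor& x) {
  // Temporarily exclude the CPU autocast dispatch key, as frozen_batch_norm
  // does; with 1.9.x headers this line fails with the error shown above.
  c10::impl::ExcludeDispatchKeyGuard no_autocast_cpu(c10::DispatchKey::AutocastCPU);
  return x * 2;
}

int main() {
  std::cout << run_without_cpu_autocast(torch::ones({2, 2})) << std::endl;
  return 0;
}
```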
For PyTorch 1.10, the error message is:
/root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/autocast_mode.cpp: In function ‘void torch_ipex::autocast::TORCH_LIBRARY_IMPL_init_aten_AutocastCPU_84(torch::Library&)’:
/root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/autocast_mode.cpp:164:72: error: could not convert template argument ‘& at::linalg_matrix_rank’ from ‘<unresolved overloaded function type>’ to ‘at::Tensor (*)(const at::Tensor&, double, bool)’
164 | &CPU_WrapFunction<DtypeCastPolicy::CAST_POLICY, SIG, SIG, &FUNC>:: \
| ^
/root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/autocast_mode.cpp:445:1: note: in expansion of macro ‘MAKE_REGISTER_FUNC’
445 | MAKE_REGISTER_FUNC(
| ^~~~~~~~~~~~~~~~~~
/root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/autocast_mode.cpp:165:13: error: ‘<expression error>::type’ has not been declared
165 | type::call); \
| ^~~~
/root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/autocast_mode.cpp:165:13: note: in definition of macro ‘MAKE_REGISTER_FUNC’
165 | type::call); \
| ^~~~
/root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/autocast_mode.cpp: At global scope:
/root/downloads/intel-extension-for-pytorch/torch_ipex/csrc/autocast_mode.cpp:168:15: error: template-id ‘get_op_name<at::Tensor(const at::Tensor&, double, bool), at::linalg_matrix_rank>’ for ‘std::string torch_ipex::autocast::get_op_name()’ does not match any template declaration
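The PyTorch 1.10 build hits a different symptom of the same version skew: the signature IPEX master registers for at::linalg_matrix_rank does not match any overload exposed by the 1.10 headers, so taking the address of the overloaded name cannot resolve to the expected function-pointer type. The standalone sketch below reproduces the mechanism with hypothetical functions (not the real ATen declarations):

```cpp
// Standalone illustration: taking the address of an overloaded name as a
// template argument only works if one overload matches the expected
// pointer type exactly.
#include <iostream>

// Signature the wrapper was written against...
double matrix_rank(const double& a, double tol, bool hermitian) {
  return a + tol + (hermitian ? 1.0 : 0.0);
}
// ...and an overload with a changed signature, as added upstream.
double matrix_rank(const double& a, const double* tol, bool hermitian) {
  return a + (tol ? *tol : 0.0) + (hermitian ? 1.0 : 0.0);
}

// Registration helper pinned to one signature, in the spirit of IPEX's
// CPU_WrapFunction / MAKE_REGISTER_FUNC machinery.
template <double (*Fn)(const double&, double, bool)>
double wrap(const double& a) { return Fn(a, 0.5, false); }

int main() {
  // Compiles because an overload with the expected signature still exists;
  // if only the second overload were present, this line would fail with the
  // same "<unresolved overloaded function type>" error shown above.
  std::cout << wrap<&matrix_rank>(2.0) << "\n";
  return 0;
}
```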
Hi @morgan-bc, I created a branch for the IPEX 1.10 release (release/1.10); it is built on top of PyTorch 1.10. You can install PyTorch 1.10 and then build IPEX 1.10 from that branch to fix the error. As for the error itself, the root cause is that PyTorch master changed some operators’ signatures.
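One way to surface this kind of mismatch earlier is a compile-time version check. A minimal sketch, assuming the installed PyTorch provides torch/version.h with the TORCH_VERSION_MAJOR/TORCH_VERSION_MINOR macros:

```cpp
// Hedged sketch: fail early with a clear message instead of a cryptic
// template error when the headers are older than the extension expects.
#include <torch/version.h>

static_assert(TORCH_VERSION_MAJOR > 1 ||
                  (TORCH_VERSION_MAJOR == 1 && TORCH_VERSION_MINOR >= 10),
              "This branch expects PyTorch >= 1.10; build release/1.10 "
              "against a matching PyTorch install.");
```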
I assume this issue has been fixed on release/1.10. Please feel free to reopen it if not.