Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Fail to install sparse attention and cpu-adam

See original GitHub issue

When I ran install.sh, I received the following warnings:

```
[WARNING] fused lamb is NOT installed.
[WARNING] transformer kernels are NOT installed.
[WARNING] sparse attention is NOT installed.
[WARNING] cpu-adam (used by ZeRO-offload) is NOT installed.
```

After switching from the wheel installation to a plain pip installation, fused lamb and the transformer kernels install correctly, but I still fail to install sparse attention and cpu-adam.
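For anyone hitting the same warnings, DeepSpeed's own `ds_report` utility (also mentioned in one of the results below) summarizes which ops are built and whether the environment is compatible, which makes it easier to confirm what actually got installed:

```sh
# Print DeepSpeed's environment report, including the install and
# compatibility status of each op (sparse_attn, cpu_adam, ...)
ds_report
```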

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 1
  • Comments: 17 (8 by maintainers)

Top GitHub Comments

3 reactions
arashashari commented, Oct 6, 2020

And for installing Sparse Attention, you need the following dependencies:

  • llvm: sudo apt-get install llvm-9-dev
  • cmake: sudo apt-get install cmake
  • triton>=0.2.2: this will be handled by the installer script; if there is any issue, please install it through pip

Then you need to set the install flags: DS_BUILD_SPARSE_ATTN=1 (for sparse attention) and DS_BUILD_CPU_ADAM=1 (for cpu-adam),

and finally run ./install.sh.
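Put together, the steps above look roughly like this (a sketch assuming an Ubuntu/Debian machine with apt, matching the commands above):

```sh
# Build dependencies for the sparse attention kernels
sudo apt-get install llvm-9-dev cmake

# Build DeepSpeed with the sparse attention and cpu-adam ops enabled
DS_BUILD_SPARSE_ATTN=1 DS_BUILD_CPU_ADAM=1 ./install.sh
```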

Please update us on the result.

1 reaction
szhengac commented, Oct 6, 2020

I failed to import triton after installing it with pip, so I had to compile it from source along with several other dependencies. After that, I was able to install sparse attention successfully.
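The import check is easy to reproduce; the source build below is a hypothetical sketch (the repository URL and package layout are assumptions, not from the original comment; check the triton README for the exact dependencies your version needs):

```sh
# Verify that the pip-installed triton is actually importable
python -c "import triton"

# If the import fails, building from source may help (hypothetical steps)
git clone https://github.com/openai/triton.git
cd triton/python
pip install .
```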

Read more comments on GitHub >

Top Results From Across the Web

Installation Details - DeepSpeed
The quickest way to get started with DeepSpeed is via pip; this will install the latest release of DeepSpeed, which is not tied...
Read more >
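For reference, the baseline install that snippet describes is simply:

```sh
pip install deepspeed
```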
deepspeed - PyPI
Extremely long sequence length: Sparse attention of DeepSpeed powers an order-of-magnitude longer input sequence and obtains up to 6x faster execution ...
Read more >
Understanding BigBird's Block Sparse Attention - Hugging Face
BigBird relies on block sparse attention instead of normal attention (i.e. BERT's attention) and can handle sequences up to a length of 4096 ...
Read more >
AI千集 - AI content-creation platform - openoker/DeepSpeed: DeepSpeed is a ...
We've found this report useful when debugging DeepSpeed install or compatibility issues. ds_report ... DS_BUILD_SPARSE_ATTN builds the sparse attention op ...
Read more >
