Failed to install sparse attention and cpu-adam
See original GitHub issue

When I ran install.sh, I received the following warnings:
```
[WARNING] fused lamb is NOT installed.
[WARNING] transformer kernels are NOT installed.
[WARNING] sparse attention is NOT installed.
[WARNING] cpu-adam (used by ZeRO-offload) is NOT installed.
```
After switching from the wheel installation to a pure pip installation, fused lamb and the transformer kernels could be installed, but I still fail to install sparse attention and cpu-adam.
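As a quick way to confirm which ops actually got built, DeepSpeed ships a diagnostic command; a minimal sketch (the op names and exact output format vary by version, but sparse attention and cpu-adam show up as separate entries in the report):

```bash
# Print an installation/compatibility report for DeepSpeed ops
# (including sparse_attn and cpu_adam) against the local torch/CUDA setup.
ds_report
```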
Issue Analytics
- State:
- Created 3 years ago
- Reactions: 1
- Comments: 17 (8 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
And for installing Sparse Attention, you have the following dependencies:
Then you need to set the install flags: `DS_BUILD_SPARSE_ATTN=1` (for sparse attention) and `DS_BUILD_CPU_ADAM=1` (for cpu-adam), and finally run `./install.sh`. A sketch of the invocation follows below.
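For reference, a minimal sketch of what that could look like, assuming a DeepSpeed source checkout containing install.sh; the flag names are taken from the comment above, and the pip form assumes the same DS_BUILD_* variables are honored when the PyPI package is built locally:

```bash
# Pre-build the sparse attention and cpu-adam (ZeRO-offload) ops
# when installing from a source checkout of DeepSpeed.
DS_BUILD_SPARSE_ATTN=1 DS_BUILD_CPU_ADAM=1 ./install.sh

# Rough pip-based equivalent (assumption: the DS_BUILD_* environment
# variables are also picked up when installing the package via pip).
DS_BUILD_SPARSE_ATTN=1 DS_BUILD_CPU_ADAM=1 pip install deepspeed
```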
Please update us on the result.
I failed to import triton after pip installing it, so I had to compile it from source along with several other dependencies. After that, I was able to install sparse attention successfully.
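For anyone hitting the same triton import failure, a rough sketch of the from-source route described above; the repository URL and build commands are assumptions about the triton release current at the time, so check triton's own README for the exact prerequisites (e.g. CMake and LLVM):

```bash
# Remove the broken pip-installed triton first (illustrative cleanup step).
pip uninstall -y triton

# Build triton from source; the python/ subdirectory holds the pip package.
git clone https://github.com/openai/triton.git
cd triton/python
pip install .
```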