
[Sparse] Support sparse matrix computation


Objective

  • To reduce unnecessary workload when computing convolutions over large numbers of zeros.
  • To reduce the memory/cache footprint.

TODOs

  • Step 0: Summarize the design decisions.
  • Step 1: Update the PR (#1289) to support the following features:
      • Create tvm.contrib.sparse.CSRNDArray and tvm.contrib.sparse.placeholder for creating sparse tensors.
      • Provide conversion between numpy.ndarray and tvm.contrib.sparse.CSRNDArray.
      • Implement topi.sparse.csrmv and topi.sparse.csrmm as SpMV and SpMM, and check correctness against dense tensor operations (see the CSR SpMV sketch after this list).
      • Support some other sparse tensor computations (e.g. relu, batch_norm, flatten).
  • Demonstrate sparse approximation based on MobileNetV2, and compare performance against a dense matrix computation baseline.
  • Write a blog post introducing this feature.
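
For reference, the csrmv/csrmm kernels above boil down to computation over the usual CSR triplet (data, indices, indptr). The following is a minimal NumPy sketch of SpMV in that layout, meant only to illustrate the computation being proposed, not the actual TOPI implementation; the helper name is made up for illustration.

```python
import numpy as np

def csr_spmv(data, indices, indptr, x):
    """y = A @ x for a CSR matrix A given by (data, indices, indptr).

    data    -- nonzero values in row-major order
    indices -- column index of each nonzero
    indptr  -- indptr[i]:indptr[i+1] is the slice of nonzeros in row i
    """
    num_rows = len(indptr) - 1
    y = np.zeros(num_rows, dtype=data.dtype)
    for i in range(num_rows):
        for k in range(indptr[i], indptr[i + 1]):
            y[i] += data[k] * x[indices[k]]
    return y

# Quick correctness check against a dense reference.
A = np.array([[0, 2, 0], [1, 0, 0], [0, 0, 3]], dtype="float32")
data = np.array([2, 1, 3], dtype="float32")
indices = np.array([1, 0, 2])
indptr = np.array([0, 1, 2, 3])
x = np.array([1, 2, 3], dtype="float32")
assert np.allclose(csr_spmv(data, indices, indptr, x), A @ x)
```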

Proposed API Changes

  • There will be no change to the original Tensor object (see the usage sketch below).
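
To make the proposal concrete, here is a rough usage sketch built only from the names in the TODO list (sparse.array, sparse.placeholder, topi.sparse.csrmv); the exact signatures and arguments are assumptions for illustration and may differ from what #1289 lands.

```python
# Rough usage sketch only; it follows the names in the TODO list above,
# but the exact signatures are assumptions and may differ from the final PR.
import numpy as np
import tvm
import topi
from tvm.contrib import sparse

m, n = 128, 256
dtype = "float32"

# Build a mostly-zero numpy.ndarray and convert it to a CSRNDArray.
a_np = np.random.rand(m, n).astype(dtype)
a_np[a_np < 0.95] = 0.0                      # ~95% zeros
a_nd = sparse.array(a_np, tvm.cpu(0))        # numpy.ndarray -> CSRNDArray

# Declare a sparse placeholder for A and a dense placeholder for x,
# then express y = A @ x with the proposed csrmv kernel. The dense
# Tensor type itself is untouched, so this composes with ordinary TOPI ops.
A = sparse.placeholder((m, n), nonzeros=np.count_nonzero(a_np),
                       dtype=dtype, name="A")
x = tvm.placeholder((n, 1), dtype=dtype, name="x")
y = topi.sparse.csrmv(A, x)
```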

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Reactions: 2
  • Comments: 9 (8 by maintainers)

Top GitHub Comments

2 reactions
liangfu commented, Jul 3, 2018

Hi @fredrikbk, thanks for your explanation. I think we should refer to taco for the code generation part in TVM, especially for the unified sparse tensor structure. However, we would very likely want to support runtimes for CUDA, OpenCL, and many other backends, so I think it is necessary to implement the code generation in TVM.

From my observation, there are two categories of algorithms that extensively use sparse tensor operations to speed up convolution. First, people depend on the high sparsity of the weight tensor to speed up 2D image convolution, which requires retraining the network weights to increase sparsity. Second, sparse tensors are used to represent point cloud datasets and their processing steps, where the sparse tensors represent features instead of weights in a convolution operator. In many cases, we might need to implement direct sparse convolution as well.
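
As an illustration of the "direct sparse convolution" mentioned above, the idea can be sketched as iterating only over the nonzero (pruned) weights and accumulating shifted input windows. The NumPy sketch below assumes stride 1, no padding, and a CHW layout, all chosen here for illustration only; it is not the kernel proposed in the PR.

```python
import numpy as np

def direct_sparse_conv2d(x, w):
    """2-D convolution that skips zero weights.

    x: input   of shape (C_in, H, W)
    w: weights of shape (C_out, C_in, KH, KW), mostly zero after pruning
    Returns output of shape (C_out, H - KH + 1, W - KW + 1); stride 1, no padding.
    """
    c_out, c_in, kh, kw = w.shape
    _, h, width = x.shape
    oh, ow = h - kh + 1, width - kw + 1
    y = np.zeros((c_out, oh, ow), dtype=x.dtype)
    # Visit only the nonzero weight entries; each one contributes a shifted
    # window of the corresponding input channel to the output channel.
    for oc, ic, i, j in zip(*np.nonzero(w)):
        y[oc] += w[oc, ic, i, j] * x[ic, i:i + oh, j:j + ow]
    return y
```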

0 reactions
tqchen commented, Sep 29, 2018

Partly closed by #1289; let us open a new thread for further updates.

