Linear Algebra
Backpropagatable linear algebra methods are a vital part of machine learning, including but not limited to neural nets. Here and there, requests for different LA methods pop up, and more and more duplicate PRs regarding LA are submitted. This issue is an attempt to concentrate and coordinate those efforts and, if possible, establish a roadmap. Another motivation for this issue is this unanswered question regarding the future of `tf.linalg`.
The following methods seem to be particularly useful:
- QR Decomposition
- Cholesky Decomposition
- LU Decomposition
- Solve
- LSTSQ
- SVD
- Eigen
- Useful for PCA and for determining the main "direction" modes of geometric bodies using the graph Laplacian.
- I can offer up an implementation from here, but without backpropagation.
- Determinant
- Easily computed with one of the decompositions (SVD, QR, LU or even Cholesky in the symmetric, positive definite case).
- (Moore–Penrose) Inverse
- Easily computed with one of the decompositions.
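To make the determinant point above concrete, here is a minimal plain-JavaScript sketch (not a TFJS API, and not backpropagatable) that computes the determinant via an LU decomposition with partial pivoting; the `det` helper name is hypothetical:

```javascript
// Sketch only: determinant via LU decomposition with partial pivoting.
// A is an n-by-n matrix given as an array of row arrays.
function det(A) {
  const n = A.length;
  const M = A.map(row => row.slice()); // work on a copy
  let d = 1;
  for (let k = 0; k < n; k++) {
    // Partial pivoting: pick the largest |entry| in column k at or below row k.
    let p = k;
    for (let i = k + 1; i < n; i++) {
      if (Math.abs(M[i][k]) > Math.abs(M[p][k])) p = i;
    }
    if (M[p][k] === 0) return 0; // singular matrix
    if (p !== k) {
      [M[p], M[k]] = [M[k], M[p]];
      d = -d; // each row swap flips the sign of the determinant
    }
    d *= M[k][k]; // det is the product of U's diagonal, times the swap sign
    for (let i = k + 1; i < n; i++) {
      const f = M[i][k] / M[k][k];
      for (let j = k; j < n; j++) M[i][j] -= f * M[k][j];
    }
  }
  return d;
}
```

The same factorization also yields `Solve` almost for free via forward/back substitution, which is one reason the decompositions sit at the top of the list.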
It is my impression that the TFJS team is currently too busy to focus on linear algebra, which is perfectly understandable considering how quickly TFJS is developing and progressing. On top of that, the PRs (especially mine) may not yet satisfy the TFJS standards; however, without feedback that is hard to fix.
Would it be possible to add `tf.linalg` as a future milestone to TFJS? If there are no intentions to add more `tf.linalg` methods to `tfjs-core`, would it be possible to initiate a new `tfjs-linalg` sub-project for this purpose?
Issue Analytics
- State:
- Created: 4 years ago
- Reactions: 16
- Comments: 14 (8 by maintainers)
Top GitHub Comments
@janosh Added it to the list as well.
TFJS sadly only supports `float32`, and `float32` is scarily inaccurate. The machine epsilon is only ~1.1920928955078125e-7, i.e. the next larger representable number after 4 is ~4.000000476837158. Considering that, the results seem to be within the margin of error.

I came here to ask this too…