Issue with biased estimates from QR decomposition
Hi again 😃
See issue https://github.com/google-research/google-research/issues/436, which I posted on the main repository. Using the QR decomposition without correcting its sign ambiguity produces results with significantly higher variance. There is an easy fix:
q, r = torch.qr(flattened)
# Make Q Haar-uniform by fixing the sign ambiguity of the QR decomposition,
# following https://arxiv.org/pdf/math-ph/0609050.pdf
d = torch.diag(r, 0)
ph = d.sign()
q *= ph  # flip the columns of Q where the corresponding diag(R) entry is negative
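The effect of the sign fix is easy to check empirically. Below is a minimal NumPy sketch (the same fix applies to `np.linalg.qr` as to `torch.qr`, since both wrap LAPACK): without the correction, the columns of Q are not uniformly distributed on the sphere, which in 2D shows up as entire quadrants receiving no samples. The function name `haar_qr` and the sample count are my own choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_qr(n, rng):
    """QR of a Gaussian matrix with the diag(R) sign fix applied."""
    a = rng.standard_normal((n, n))
    q, r = np.linalg.qr(a)
    # Sign fix (Mezzadri, arXiv:math-ph/0609050): make diag(R) nonnegative
    # so that Q is Haar-uniform on the orthogonal group.
    q *= np.sign(np.diag(r))
    return q

# In 2D, the first column of a Haar-uniform Q should hit all four quadrants.
cols = np.array([haar_qr(2, rng)[:, 0] for _ in range(1000)])
quadrants = {(x > 0, y > 0) for x, y in cols}
print(len(quadrants))  # → 4
```

With 1000 samples the chance of a truly uniform distribution missing a quadrant is negligible, so seeing fewer than 4 quadrants is a reliable symptom of the sign bias.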
Issue Analytics
- Created: 3 years ago
- Comments: 9 (5 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Haha, they are doing about the same as the random matrices, I would say. I haven't had any great success with either variant yet though 😃 I'm working on a task completely unrelated to NLP with them, so it's a bit of work to adapt stuff.
I'm working more on the theoretical side at the moment. I just realised when I generated these matrices that I got a biased estimate (in 2D I literally got no samples in the 4th quadrant, but maybe that was something wrong on my end). Anyway, for those experiments I reduced the variance by about 50%.
In the practical experiments I'm running myself, I'm still using learnable matrices.