
Basic linear algebra for complex numbers

🚀 Feature

Support basic linear algebra for complex numbers.

Motivation

I talked with @sw005320 about https://github.com/nttcslab-sp/dnn_wpe and it turns out that the matrix inversion implemented with real numbers is unstable. In a beamforming example, @Emrys365 observed a difference of 5 dB in signal-to-distortion ratio (SDR) when he replaced the inversion with numpy code (torch: 5 dB, numpy: 10 dB).

I tried torch.inverse and torch.solve and, interestingly, they work in 1.6.0.dev20200623+cpu (not mentioned in https://github.com/pytorch/pytorch/issues/33152). Is it possible to support torch.matmul and some other linear algebra functions?

I also tried calling backward after torch.solve and the code fails with an exception saying that matmul is not implemented. Does someone know how the gradient is defined in torch for complex numbers? Is it grad_real + j grad_imag or grad_real - j grad_imag? And how can I add/fix the gradient when I find a broken implementation?
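For reference, on a build where complex autograd works end to end, the convention can be checked empirically; a minimal sketch (the scalar 3 + 4j is arbitrary and only meant to expose the sign of the imaginary part):

import torch

# Real-valued loss of a complex scalar: L = |z|^2 = x^2 + y^2
z = torch.tensor(3.0 + 4.0j, requires_grad=True)
loss = (z * z.conj()).real
loss.backward()

# PyTorch stores dL/dx + 1j * dL/dy, i.e. grad_real + j * grad_imag
# (twice the conjugate Wirtinger derivative), so z.grad is 6 + 8j here.
print(z.grad)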

Pitch

Alternatives

Additional context

Currently, I am considering jumping between pytorch_complex and torch.autograd.Function:

import torch

def hermite(a):
    # Conjugate (Hermitian) transpose of the last two dimensions.
    return a.transpose(-2, -1).conj()

def matmul(t1, t2):
    # Complex matrix multiplication built from four real matmuls,
    # for versions where torch.matmul does not support complex tensors.
    real1, imag1 = t1.real, t1.imag
    real2, imag2 = t2.real, t2.imag
    o_real = torch.matmul(real1, real2) - torch.matmul(imag1, imag2)
    o_imag = torch.matmul(real1, imag2) + torch.matmul(imag1, real2)
    return o_real + 1j * o_imag

class Solve(torch.autograd.Function):
    @staticmethod
    def forward(ctx, A, b):
        # Solve A x = b; torch.solve returns (solution, LU factorization).
        x, _ = torch.solve(b, A)
        ctx.save_for_backward(A, x)
        return x

    @staticmethod
    def backward(ctx, grad_output):
        A, x = ctx.saved_tensors
        # gb = A^{-H} grad_output and gA = -gb x^H
        gb, _ = torch.solve(grad_output, hermite(A))
        gA = -matmul(gb, hermite(x))
        return gA, gb
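
A minimal usage sketch of the Solve function above (assuming a build where torch.solve accepts complex tensors, as reported for the 1.6 nightly; the shapes, dtype, and loss are illustrative only):

A = torch.randn(4, 4, dtype=torch.complex128) + 4 * torch.eye(4, dtype=torch.complex128)
b = torch.randn(4, 2, dtype=torch.complex128)
A.requires_grad_(True)
b.requires_grad_(True)

x = Solve.apply(A, b)              # x solves A x = b
loss = (x * x.conj()).real.sum()   # real-valued scalar loss
loss.backward()                    # exercises the custom backward above

Whether the conjugation and sign in that backward match PyTorch's convention is exactly the open question above.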

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 17 (6 by maintainers)

Top GitHub Comments

2 reactions
mthrok commented, Jul 10, 2020

@anjali411 @boeddeker I will coordinate the meeting then.

0 reactions
mthrok commented, Aug 3, 2021

In PyTorch 1.9, complex numbers are supported in most (if not all) of the linear algebra operations. Thanks for the feedback!
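
For readers landing here later, a small sketch of what works natively on PyTorch 1.9+ (shapes and dtype are illustrative; no custom Function is needed anymore):

import torch

A = torch.randn(4, 4, dtype=torch.complex128, requires_grad=True)
b = torch.randn(4, 2, dtype=torch.complex128, requires_grad=True)

x = torch.linalg.solve(A, b)                     # complex solve, differentiable
G = torch.matmul(A, A.conj().transpose(-2, -1))  # complex matmul with a Hermitian transpose
loss = (x.abs() ** 2).sum() + G.diagonal(dim1=-2, dim2=-1).real.sum()
loss.backward()                                  # complex gradients land in A.grad and b.grad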
