Here complex numbers are used for an elegant gradient calculation - you can express all sorts of operations through just three functions: `exp`, `log` and `add`, defined over the complex plane. It simplifies the code!
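For example (a rough sketch of my own, not the author's code): once `log(-1) = i*pi` is available, negation - and with it subtraction, multiplication and division - all reduce to `exp`, `log` and `add`. Zero is the obvious caveat, since `log(0)` blows up.

```python
import torch

I_PI = torch.log(torch.tensor(-1.0 + 0.0j))  # log(-1) = i*pi on the complex plane

def mul(a, b):
    return torch.exp(torch.add(torch.log(a), torch.log(b)))  # a * b

def neg(a):
    return torch.exp(torch.add(torch.log(a), I_PI))          # -a, i.e. a * exp(i*pi)

def sub(a, b):
    return torch.add(a, neg(b))                               # a - b

def div(a, b):
    return torch.exp(sub(torch.log(a), torch.log(b)))         # a / b

a = torch.tensor(3.0 + 0.0j)
b = torch.tensor(2.0 + 0.0j)
print(mul(a, b), neg(a), sub(a, b), div(a, b))  # roughly 6, -3, 1, 1.5
```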
An added benefit is that all the variables become complex. As long as your loss is real-valued, you should be able to backprop through your net and update the parameters.
PyTorch docs mention that complex variables may be used "in audio and other fields": https://pytorch.org/docs/stable/notes/autograd.html#how-is-w...
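To sanity-check the real-valued-loss claim, something like this minimal sketch (my own, not from the docs) runs fine: complex parameters, a real loss, and a plain SGD step. For a real-valued loss, `backward()` gives you the conjugate Wirtinger derivative, which is exactly what gradient descent wants.

```python
import torch

# Complex parameters, real-valued loss, ordinary optimizer step.
w = torch.randn(4, dtype=torch.cfloat, requires_grad=True)
x = torch.randn(4, dtype=torch.cfloat)
target = torch.tensor(1.0)

opt = torch.optim.SGD([w], lr=0.1)
for _ in range(100):
    opt.zero_grad()
    pred = (w * x).sum()                 # complex scalar
    loss = (pred.abs() - target) ** 2    # real-valued loss
    loss.backward()                      # gradients w.r.t. the complex parameters
    opt.step()
print(loss.item())
```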