
Generic einops function which infers which operation to call


I noticed that rearrange, repeat, and reduce can all be inferred from the pattern alone. Therefore a single generic function that dispatches to the right one is possible. I think this would be very nice for power users! For example,

einops.einop(tensor, pattern, reduction=None, **axes_lengths)

Here is how each operation can be inferred:

  • Same index names on each side? Then call rearrange.
    • e.g., i j k -> k j i, or time (i j) -> i j time are both recognized as patterns for rearrange.
  • An index given on the left side is missing on the right side? Then call reduce.
    • e.g., i j -> i, so obviously a reduction. Similarly, (h1 h2) (i j) -> h1 i.
  • An index was introduced on the right side, but was not given on the left side? Then call repeat.
    • e.g., i -> i j, so obviously a repeat. Or time i k -> time i h k
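The three inference rules above can be sketched as a small pattern classifier. This is a hypothetical helper (`infer_op` is not part of the einops API), and it ignores ellipses, anonymous axes, and other corner cases a real implementation would have to handle:

```python
import re

def infer_op(pattern: str) -> str:
    """Guess which einops operation a pattern describes (sketch only)."""
    left, right = pattern.split("->")
    # Axis names are plain identifiers; parentheses only group them,
    # so a simple identifier scan recovers the axis sets on each side.
    lhs = set(re.findall(r"[A-Za-z_]\w*", left))
    rhs = set(re.findall(r"[A-Za-z_]\w*", right))
    if lhs == rhs:
        return "rearrange"   # same index names on both sides
    if lhs - rhs:
        return "reduce"      # some axes were dropped on the right
    return "repeat"          # new axes were introduced on the right
```

For instance, `infer_op("time (i j) -> i j time")` classifies as a rearrange, while `infer_op("(h1 h2) (i j) -> h1 i")` classifies as a reduce.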

An error is raised for any of the following situations:

  • A rearrange or repeat is inferred, but a reduction operation is given
  • A reduction is inferred, but a reduction operation is missing
  • A repeat is inferred, but no values are given in axes_lengths for the new axes

These checks would prevent unintended behavior. An error could also be raised when a user passes a value in axes_lengths for an axis that does not appear in the pattern.
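Those error conditions could be checked up front before dispatching. Again a hedged sketch with a hypothetical name (`validate_einop`), covering only the simple cases; a real implementation would also have to account for repeat lengths that are inferable from axis decompositions:

```python
import re

def validate_einop(pattern: str, reduction=None, **axes_lengths) -> None:
    """Raise ValueError for the pattern/argument mismatches listed above."""
    left, right = pattern.split("->")
    lhs = set(re.findall(r"[A-Za-z_]\w*", left))
    rhs = set(re.findall(r"[A-Za-z_]\w*", right))
    dropped = lhs - rhs        # axes reduced away -> a reduce
    introduced = rhs - lhs     # axes introduced   -> a repeat
    if dropped and reduction is None:
        raise ValueError("pattern implies a reduction, but none was given")
    if not dropped and reduction is not None:
        raise ValueError("pattern implies rearrange/repeat, "
                         "but a reduction was given")
    missing = introduced - set(axes_lengths)
    if missing:
        raise ValueError(f"pattern implies repeat, but no lengths "
                         f"were given for {sorted(missing)}")
    unknown = set(axes_lengths) - (lhs | rhs)
    if unknown:
        raise ValueError(f"axes_lengths given for axes not in "
                         f"the pattern: {sorted(unknown)}")
```

With this in place, `validate_einop("i j -> j i")` passes silently, while `validate_einop("i j -> i")` (a reduce with no reduction given) raises.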

I would be happy to implement this (and also #83). Let me know if interested.

It would also be nice if #73 were included as well. Then this operation could be einop(x, [y, ], pattern, ...), with the einsum function inferred from the presence of two input tensors.

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Reactions: 16
  • Comments: 14 (12 by maintainers)

Top GitHub Comments

12 reactions · MilesCranmer commented, Mar 7, 2022

Pinging this thread again @arogozhnikov to see if your views have changed with increasing use of einops. It sounds like a lot of people would be interested in this.

Also, I have resolved the merge conflicts in the PR.

Cheers, Miles

3 reactions · cgarciae commented, Jan 22, 2021

My 2 cents:

In my initial experience with einops, the existence of multiple functions actually confused me rather than helping. As with einsum, I expected the einops “language” to just let me express how the data looks now and how I want it to look afterwards, and have it do the right thing; I believe the language is fairly intuitive.

As @MilesCranmer pointed out, the current division can sometimes be confusing. For example, say you started with code like this:

x = repeat(x, "h w c -> batch c h w", batch=32)

Here you are tiling in the batch dimension but also transposing the channel dimension. Now imagine that, for some reason, you no longer want to tile in the batch dimension, so you just delete it:

x = repeat(x, "h w c -> c h w")

This looks good to the eye, but it’s wrong, because repeat doesn’t support (or rather defends against) the base case of having zero repeat dimensions. Since you still want the transposition of the channel dimension c previously provided by repeat, you are forced to switch to rearrange:

x = rearrange(x, "h w c -> c h w")

