Einops: Powerful library for tensor operations in deep learning

Tobias and I recently gave a talk at the OPIG retreat on tips for using PyTorch. For this we created a tutorial as a Google Colab notebook (link can be found here). I remember rambling about the advantages of implementing your own models rather than using other people's code. Well, if I convinced you, einops is for you!

Basically, einops lets you perform operations on tensors using Einstein notation. The package comes with a number of advantages, a few of which I will try to summarise here:

1. It is easy to understand.

Here are two ways of doing the same operation, one in pure PyTorch and one using einops:

import torch
from einops import rearrange

x = torch.randn(4, 12, 10, 10)
y = x.view(x.shape[0], x.shape[1], -1)       # PyTorch version
y = rearrange(x, 'b c h w -> b c (h w)')     # einops version

While they both get you to the same place, in the second one you can immediately see that we are grouping the last two dimensions of the tensor x.

There are basically only three operations in einops (rearrange, reduce and repeat). However, to cover what you can do with these three operations you would need to learn the syntax of tens of functions in PyTorch. For examples of how powerful these are, see here.
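To give a flavour of the other two, here is a minimal sketch (the shapes are just illustrative, not from the talk):

import torch
from einops import reduce, repeat

x = torch.randn(4, 12, 10, 10)
y = reduce(x, 'b c h w -> b c', 'mean')   # average over the spatial dimensions -> shape (4, 12)
z = repeat(y, 'b c -> b c n', n=3)        # add and tile a new trailing axis -> shape (4, 12, 3)

The same pattern syntax covers reshaping, reduction and broadcasting, which is why so few functions are needed.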

The simplicity of einops has two advantages: it makes your code easier for other people to understand, and it makes einops itself quite easy to pick up. I found out about it when trying to understand some models implemented by lucidrains, and not only did it make my life a lot easier, but after a few days I was using it for my own models.

2. It is explicit.

When using einops you have to specify the dimensions of the input and the output. This means an operation will only work on tensors whose number of dimensions matches the pattern. In this way, you ensure your code is doing exactly what you expect it to do at every stage.

For example, in the code snippet above, if the input tensor x had only three dimensions, the PyTorch operation would still produce an output, although it may not be the output you want. By using einops, you safeguard yourself against unexpected (and potentially unwanted) behaviour.
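A quick sketch of what I mean (hypothetical shapes, just for illustration):

import torch
from einops import rearrange

x3 = torch.randn(4, 12, 100)                 # only three dimensions

y = x3.view(x3.shape[0], x3.shape[1], -1)    # PyTorch happily returns a (4, 12, 100) tensor
y = rearrange(x3, 'b c h w -> b c (h w)')    # einops raises an EinopsError: the pattern expects four axes

The failure is loud and points at the mismatch between the pattern and the tensor, rather than silently giving you a tensor of the wrong meaning.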

3. It is framework independent.

NumPy, TensorFlow, JAX, PyTorch, Keras… It works in exactly the same way for all of those and more.
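For instance, a minimal sketch of the same call on two different backends:

import numpy as np
import torch
from einops import rearrange

pattern = 'b c h w -> b c (h w)'
y_np = rearrange(np.zeros((4, 12, 10, 10)), pattern)     # returns a NumPy array of shape (4, 12, 100)
y_pt = rearrange(torch.zeros(4, 12, 10, 10), pattern)    # returns a PyTorch tensor of shape (4, 12, 100)

einops dispatches on the type of the input, so the output stays in whatever framework you passed in.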

Einops is a very powerful package that simplifies your deep learning code, so I hope I have convinced you that it is at least worth looking at. If you have read this far, I hope this was not a complete waste of your time.
