# TensorCast.jl

This package lets you work with many-dimensional arrays in index notation, by defining a few macros. The first is `@cast`, which deals both with "casting" into new shapes (including going to and from an array-of-arrays) and with broadcasting:

```julia
@cast A[row][col] := B[row, col]        # slice a matrix B into its rows, also @cast A[r] := B[r,:]

@cast C[(i,j), (k,ℓ)] := D.x[i,j,k,ℓ]   # reshape a 4-tensor D.x to give a matrix

@cast E[φ,γ] = F[φ]^2 * exp(G[γ])       # broadcast E .= F.^2 .* exp.(G') into existing E

@cast T[x,y,n] := outer(M[:,n])[x,y]    # generalised mapslices, vector -> matrix function
```
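
These macros lower to ordinary array operations, so the first three examples can be sketched in plain Base Julia (the array shapes below are chosen only for illustration):

```julia
B = rand(3, 4)
# @cast A[row][col] := B[row, col] slices B into its rows, roughly:
A = [B[r, :] for r in axes(B, 1)]   # a Vector of 3 row-slices, each of length 4

D = rand(2, 3, 4, 5)
# @cast C[(i,j), (k,ℓ)] := D[i,j,k,ℓ] combines pairs of indices (first index fastest), roughly:
C = reshape(D, 2*3, 4*5)            # a 6×20 matrix sharing D's memory

F, G = rand(3), rand(4)
# @cast E[φ,γ] := F[φ]^2 * exp(G[γ]) pairs each index with an array dimension, roughly:
E = F.^2 .* exp.(G')                # a 3×4 broadcast, G' supplies the second dimension
```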

Second, `@reduce` takes sums (or other reductions) over the indicated directions. Among such sums is matrix multiplication, which can be done more efficiently using `@matmul` instead:

```julia
@reduce K[_,b] := prod(a,c) L.field[a,b,c]           # product over dims=(1,3), and drop dims=3

@reduce S[i] = sum(n) -P[i,n] * log(P[i,n]/Q[n])     # sum!(S, @. -P*log(P/Q')) into existing S

@matmul M[i,j] := sum(k,k′) U[i,k,k′] * V[(k,k′),j]  # matrix multiplication, plus reshape
```
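
In Base Julia, these reductions look roughly like the following (again with shapes invented for illustration, and `L` standing in for the `L.field` above):

```julia
L = rand(2, 3, 4)
# @reduce K[_,b] := prod(a,c) L[a,b,c] reduces over dims 1 and 3, keeping dim 1 as size 1:
K = reshape(prod(L, dims=(1, 3)), 1, 3)     # a 1×3 row matrix

U, V = rand(2, 3, 4), rand(12, 5)
# @matmul M[i,j] := sum(k,k′) U[i,k,k′] * V[(k,k′),j] reshapes U then calls ordinary *:
M = reshape(U, 2, 3*4) * V                  # a 2×5 matrix product
```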

All of these are converted into simple Julia array commands like `reshape`, `permutedims`, and `eachslice`, plus a broadcasting expression if needed, and `sum` / `sum!` or `*` / `mul!`. This means they are very generic, and will (mostly) work well with StaticArrays, on the GPU via CuArrays, and with almost anything else. For operations with arrays of arrays, like `mapslices`, this package defines gradients for Zygote.jl (similar to those of SliceMap.jl).
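
The generalised-mapslices example above (`@cast T[x,y,n] := outer(M[:,n])[x,y]`) can also be sketched in Base Julia, here with a hypothetical `outer` function and invented shapes:

```julia
outer(v) = v * v'     # example vector -> matrix function (assumed for illustration)
M = rand(3, 5)

# Apply outer to each column of M and glue the resulting matrices along a new third dimension:
T = cat((outer(M[:, n]) for n in axes(M, 2))...; dims=3)   # a 3×3×5 array
```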

Similar notation is used by some other packages, although all of them use an implicit sum over repeated indices. TensorOperations.jl performs Einstein-convention contractions and traces:

```julia
@tensor A[i] := B[i,j] * C[j,k] * D[k]      # matrix multiplication, A = B * C * D
@tensor D[i] := 2 * E[i] + F[i,k,k]         # partial trace of F only, Dᵢ = 2Eᵢ + Σⱼ Fᵢⱼⱼ
```
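
Without loading TensorOperations.jl, the same two results can be written by hand (shapes chosen here for illustration; `D2` stands in for the second `D` above, to avoid clashing with the 4-tensor `D`):

```julia
B, C, D = rand(2, 3), rand(3, 4), rand(4)
# @tensor A[i] := B[i,j] * C[j,k] * D[k] is just a chain of matrix-vector products:
A = B * (C * D)                 # a vector of length 2

E, F = rand(3), rand(3, 4, 4)
# @tensor D[i] := 2 * E[i] + F[i,k,k] traces only F, then adds elementwise:
D2 = 2 .* E .+ [sum(F[i, k, k] for k in axes(F, 2)) for i in axes(F, 1)]
```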

More general contractions are allowed by OMEinsum.jl, but only one term:

```julia
@ein W[i,j,k] := X[i,ξ] * Y[j,ξ] * Z[k,ξ]   # star contraction
W = ein" iξ,jξ,kξ -> ijk "(X,Y,Z)           # numpy-style notation
```

Instead, Einsum.jl and Tullio.jl sum the entire right-hand side, but also allow arbitrary (element-wise) functions:

```julia
@einsum S[i] := -P[i,n] * log(P[i,n]/Q[n])  # sum over n, for each i (also with @reduce above)
@einsum G[i] := 2 * E[i] + F[i,k,k]         # the sum includes everything:  Gᵢ = Σⱼ (2Eᵢ + Fᵢⱼⱼ)
@tullio Z[i,j] := abs(A[i+x, j+y] * K[x,y]) # convolution, summing over x and y
```
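
The nested loops that such a macro writes for the first example look roughly like this (shapes invented for illustration):

```julia
P, Q = rand(3, 4), rand(4)

# @einsum S[i] := -P[i,n] * log(P[i,n]/Q[n]) expands to loops over every index:
S = zeros(3)
for i in axes(P, 1), n in axes(P, 2)
    S[i] += -P[i, n] * log(P[i, n] / Q[n])   # accumulate over the summed index n
end
```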

These produce very different code for actually doing what you request: the macros `@tensor` and `@ein` work out a sequence of basic operations (like contractions and traces), while `@einsum` and `@tullio` simply write the necessary set of nested loops.

For those who speak Python, `@cast` and `@reduce` allow similar operations to `einops` (minus the cool video, but plus broadcasting) while Einsum / TensorOperations map very roughly to `einsum` / `opt_einsum`.

## Installation

You need Julia 1.0 or later:

```
] add TensorCast
```

This began as a holiday project to learn a bit of metaprogramming, originally called `TensorSlice.jl`, but it suffered a little scope creep.
