Composition

Summing

Some compound structures and operations cannot be neatly described by a single tensor network.
For instance, a residual connection typically has a different structure than the path it bypasses.
Hence, it is easier to view the whole as a sum of compatible networks.

Networks are compatible when all of their outgoing wires match: same dimension and same 'meaning'.

Two compatible but differently-structured networks instantiating a 4th order tensor.

These boxes retain all properties of tensor networks, despite their internals being more intricate.

Even when networks could be merged natively, this logical split is often preferable for code and for reasoning. Along the same lines, to avoid clutter, common modules are denoted by a colored box.
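As a concrete illustration of summing compatible networks, the sketch below (using NumPy; all variable names are illustrative, not from the original) writes a residual connection as the sum of two differently-structured networks whose outgoing wires agree:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
x = rng.standard_normal(d)
W = rng.standard_normal((d, d))

# Branch 1: a contraction (matrix-vector product), one tensor network.
branch = np.einsum("ij,j->i", W, x)

# Branch 2: the identity path, a structurally different network.
skip = x

# Both networks expose the same single outgoing wire of dimension d,
# so they are compatible and the residual is simply their sum.
out = branch + skip
```

The two summands need not share any internal structure; compatibility is a condition on the loose wires only.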

Doubling

Throughout, many diagrams contain several identical input wires, either due to composing layers or due to the innate structure of certain modules.

Stacking layers hierarchically results in exponentially many copied inputs.
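The exponential blow-up of copied inputs can be seen numerically. In this sketch (NumPy, names illustrative), a bilinear layer uses two copies of its input; stacking a second bilinear layer on top yields four copies of `x` in the fully expanded network:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3
x = rng.standard_normal(d)
B = rng.standard_normal((d, d, d))  # bilinear map: two input wires, one output

# One layer contracts two copies of x.
layer1 = np.einsum("ijk,j,k->i", B, x, x)

# Stacking doubles the copies again: written hierarchically, layer2
# takes two copies of layer1...
layer2 = np.einsum("ijk,j,k->i", B, layer1, layer1)

# ...but fully expanded, the same network has four copies of x as inputs.
expanded = np.einsum("iab,ajk,j,k,bpq,p,q->i", B, B, x, x, B, x, x)
```

Each additional stacked layer doubles the number of `x` wires in the expanded diagram, which is exactly the symmetry that doubling notation is meant to compress.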

This produces highly symmetric networks and considerable notational clutter. To avoid this, we introduce notation that highlights such repeated structure, called doubling.

Doubling removes duplicate paths, improving visual clarity.

Doubling breaks the property that the order of a tensor can be read off as the number of loose wires in its diagram.
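The wire-counting property that doubling sacrifices is easy to check in a full diagram. A minimal sketch (NumPy, names illustrative): contracting an order-3 tensor with a vector closes one wire and leaves two loose, so the result has order 2:

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((2, 3, 4))  # order 3: three loose wires
v = rng.standard_normal(4)          # order 1: one loose wire

# Contracting one wire of T with v leaves two loose wires,
# so the resulting tensor has order 2.
M = np.einsum("ijk,k->ij", T, v)
```

In a doubled diagram the same computation would show fewer wires than the tensor's true order, which is why full diagrams remain the ground truth for formal arguments.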

Doubled diagrams are generally the most convenient to read. However, the full diagrams are sometimes required to formally verify a statement.