
Skip Connection

A shortcut path in neural networks that bypasses one or more layers, adding or concatenating the input directly to a later layer's output to mitigate vanishing gradients.

A skip connection (also called a shortcut or residual connection) bypasses one or more layers by adding or concatenating the input directly to a downstream layer's output. In ResNet, the residual formulation computes x + F(x), where F(x) is the learned transformation and x is the identity mapping.
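The residual formulation can be sketched in a few lines. This is a minimal NumPy illustration, not a trained network: the branch F is a hypothetical single linear layer with ReLU, and the weights are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def F(x, W, b):
    """Hypothetical residual branch: one linear layer followed by ReLU."""
    return np.maximum(0.0, x @ W + b)

d = 4
x = rng.standard_normal(d)        # input, also the identity shortcut
W = rng.standard_normal((d, d)) * 0.1
b = np.zeros(d)

y = x + F(x, W, b)                # residual output: identity plus branch
```

Because the shortcut is added (not concatenated), x and F(x) must share the same shape, which is why ResNet uses projection shortcuts when dimensions change between stages.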

In deep networks, gradients can shrink multiplicatively as they propagate backward through many layers. Skip connections provide an unimpeded gradient path, enabling stable training beyond 100 layers. A ResNet ensemble achieved 3.57% top-5 error on ImageNet in ILSVRC 2015, showing that depth improves accuracy when proper shortcuts are used.
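A scalar toy calculation (an illustrative assumption, not a derivation from any specific network) shows why the shortcut helps: if each plain layer contributes a small local derivative g, the backpropagated gradient through n layers is g**n and vanishes, whereas with identity shortcuts each layer's factor becomes (1 + g), which stays at or above 1.

```python
# Toy scalar model of gradient flow through n stacked layers.
# Assumed setup: every layer's branch has the same tiny local derivative g.
n, g = 100, 0.01

plain = g ** n          # plain stack: gradient factor vanishes (~1e-200)
residual = (1 + g) ** n # residual stack: the identity term keeps it alive
```

Here `plain` underflows toward zero while `residual` stays near e (about 2.7), reflecting how the identity term dy/dx = 1 + dF/dx preserves gradient signal even when the branch contributes almost nothing.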

Skip connections are ubiquitous in modern architectures for super-resolution, generation, and detection. The choice between additive merging (ResNet) and concatenative merging (DenseNet, U-Net) depends on the task and on memory constraints, since concatenation grows the feature dimension at every merge.
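The shape consequences of the two merge styles can be seen directly. A small NumPy sketch with placeholder feature arrays:

```python
import numpy as np

# Placeholder feature maps of shape (batch, channels).
x  = np.ones((2, 8))        # shortcut input
fx = np.full((2, 8), 0.5)   # branch output, same shape as x

add = x + fx                            # additive (ResNet): shape stays (2, 8)
cat = np.concatenate([x, fx], axis=-1)  # concatenative (DenseNet/U-Net): (2, 16)
```

Addition keeps memory flat, while concatenation doubles the channel count at each merge, which is the memory cost the text refers to.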
