The AI Digest
Scaling Up CNN Kernel Sizes

PeLK: Parameter-efficient Large Kernel ConvNets with Peripheral Convolution

The paper revisits recent CNN designs that use very large convolution kernels for stronger performance. Directly enlarging dense kernels is costly: parameters grow quadratically with kernel size, inflating model size and making optimization harder. The authors instead introduce peripheral convolution, a scheme inspired by human peripheral vision that keeps fine-grained weights only in the kernel's center and shares weights across wider regions toward the edges, cutting parameter counts substantially. This allows kernels to be scaled up to 101x101 while showing consistent performance improvements.
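To make the parameter-sharing idea concrete, here is a minimal PyTorch sketch of a peripheral-style depthwise convolution: the central region of the kernel keeps one learnable weight per position, while the periphery is tiled from a coarse bank of shared weights. This is an illustration under simplifying assumptions, not the paper's implementation; the class name `PeripheralSharedConv2d` and the fixed `center`/`cell` sizes are chosen here for clarity, whereas PeLK partitions the periphery into growing cells.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PeripheralSharedConv2d(nn.Module):
    """Depthwise conv whose large kernel is expanded from a small parameter bank.

    Illustrative sketch: the central `center` x `center` region keeps one learnable
    weight per position, while the periphery is tiled from coarse `cell` x `cell`
    regions that each share a single learnable weight. A fixed cell size is used
    here for clarity; it is not PeLK's exact partition.
    """

    def __init__(self, channels, kernel_size=101, center=7, cell=4):
        super().__init__()
        assert kernel_size % 2 == 1 and center % 2 == 1
        self.channels = channels
        self.kernel_size = kernel_size
        self.center = center
        self.cell = cell
        # Fine-grained weights for the central region: one parameter per position.
        self.center_weight = nn.Parameter(0.02 * torch.randn(channels, 1, center, center))
        # Coarse bank for the periphery: one parameter per cell of a down-sampled grid.
        coarse = (kernel_size + cell - 1) // cell
        self.peripheral_bank = nn.Parameter(0.02 * torch.randn(channels, 1, coarse, coarse))

    def build_kernel(self):
        k, c = self.kernel_size, self.center
        # Expand each coarse cell so one shared weight covers a cell x cell patch.
        periphery = self.peripheral_bank.repeat_interleave(self.cell, dim=2)
        periphery = periphery.repeat_interleave(self.cell, dim=3)[:, :, :k, :k]
        kernel = periphery.clone()
        # Overwrite the centre of the kernel with the fine-grained weights.
        lo = (k - c) // 2
        kernel[:, :, lo:lo + c, lo:lo + c] = self.center_weight
        return kernel

    def forward(self, x):
        # Depthwise convolution with the expanded large kernel.
        kernel = self.build_kernel()
        return F.conv2d(x, kernel, padding=self.kernel_size // 2, groups=self.channels)


x = torch.randn(1, 64, 32, 32)
out = PeripheralSharedConv2d(64, kernel_size=101)(x)
print(out.shape)  # torch.Size([1, 64, 32, 32])
```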

  • Overcomes the parameter explosion and optimization difficulties of dense large kernels (see the parameter-count sketch after this list)
  • Uses a peripheral convolution design inspired by human peripheral vision
  • Efficiently scales kernels up to an unprecedented 101x101
  • Shows improved performance over contemporary Transformers and CNNs
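To ground the parameter-savings claim, here is a rough count for a single 101x101 depthwise layer over 64 channels, comparing a dense kernel against the illustrative `PeripheralSharedConv2d` sketch above. The numbers reflect this simplified sketch, not the paper's exact scheme.

```python
# Parameter count for one 101x101 depthwise layer over 64 channels.
# Uses the illustrative PeripheralSharedConv2d defined above, not PeLK's exact scheme.
dense_params = 64 * 101 * 101  # dense kernel: one weight per kernel position
shared = PeripheralSharedConv2d(64, kernel_size=101, center=7, cell=4)
shared_params = sum(p.numel() for p in shared.parameters())
print(dense_params, shared_params)  # 652864 vs. 46400 for this sketch
```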

This research paves the way for future CNN designs that capitalize on larger kernel sizes without the usual parameter and optimization costs, potentially leading to more powerful visual processing systems.
