Image Fusion Algorithm Combining NSCT and PCNN

Resource Overview

NSCT-PCNN Fusion Framework: Multi-scale decomposition with biological neural network characteristics for high-quality image fusion

Detailed Documentation

An image fusion method integrating NSCT (Non-Subsampled Contourlet Transform) and PCNN (Pulse Coupled Neural Network)

The NSCT-PCNN fusion framework achieves high-quality image fusion through multi-scale decomposition and biological neural network properties. NSCT, as an improved contourlet transform, offers advantages of shift-invariance and directional selectivity, capable of decomposing source images into low-frequency subbands (approximation components) and multi-directional high-frequency subbands (detail components). PCNN simulates the synchronous firing characteristics of mammalian visual cortex neurons, automatically capturing image saliency features through pulse synchronization mechanisms.
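The pulse-synchronization behaviour described above can be illustrated with a minimal sketch of a simplified PCNN. This is an assumption-laden toy model, not a reference implementation: the function name, parameter values, and the 3x3 inverse-distance linking kernel are all illustrative choices, and the stimulus is assumed normalized to [0, 1].

```python
import numpy as np

def pcnn_fire_counts(S, iterations=30, beta=0.2,
                     alpha_L=0.3, alpha_E=0.3, V_L=1.0, V_E=20.0):
    """Run a simplified PCNN on stimulus S (values in [0, 1]) and
    return the per-pixel firing count accumulated over all iterations.
    All parameter defaults are illustrative, not tuned values."""
    S = np.asarray(S, dtype=float)
    h, w = S.shape
    L = np.zeros((h, w))          # linking input
    E = np.full((h, w), V_E)      # dynamic threshold, starts high
    Y = np.zeros((h, w))          # pulse output of the previous step
    T = np.zeros((h, w))          # accumulated firing counts
    # 3x3 linking kernel: inverse-distance weights, no self-connection
    W = np.array([[0.5, 1.0, 0.5],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])
    for _ in range(iterations):
        # neighbourhood sum of the previous pulses (zero-padded borders)
        Yp = np.pad(Y, 1)
        link = sum(W[i, j] * Yp[i:i + h, j:j + w]
                   for i in range(3) for j in range(3))
        L = np.exp(-alpha_L) * L + V_L * link  # linking decays, driven by neighbour pulses
        U = S * (1.0 + beta * L)               # internal activity (feeding = raw stimulus)
        Y = (U > E).astype(float)              # neuron fires when activity exceeds threshold
        E = np.exp(-alpha_E) * E + V_E * Y     # threshold decays, jumps up after a pulse
        T += Y
    return T
```

Because the threshold decays exponentially, brighter pixels (stronger stimuli) cross it earlier and fire more often, and the linking term pulls similar neighbours into firing together, which is exactly the saliency-capturing behaviour the fusion rule exploits.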

The core workflow consists of three stages:

1. Multi-scale decomposition: apply NSCT to both source images to obtain a low-frequency subband and multiple directional high-frequency subbands (the implementation requires choosing the number of decomposition levels and the directional filters). The low-frequency subband represents the overall image structure, while the high-frequency subbands carry edge and texture information.

2. Component fusion strategy: for the low-frequency subbands, apply regional energy-weighted fusion to preserve high-energy regions (this involves computing local energy maps and taking a weighted average). The high-frequency subbands are fed to the PCNN model, where neuronal firing frequency marks salient regions and the coefficient whose neuron fired more often is selected (this requires tuning the PCNN parameters and controlling the iteration count).

3. Image reconstruction: apply the inverse NSCT to the fused low- and high-frequency coefficients to reconstruct the final image.
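The two fusion rules in stage 2 can be sketched as below. This is a hedged illustration, not the paper's exact method: the function names are hypothetical, the 3x3 energy window and reflect padding are assumptions, and the NSCT decomposition and inverse transform themselves are omitted because they require a dedicated library.

```python
import numpy as np

def local_energy(band, win=3):
    """Sum of squared coefficients over a win x win neighbourhood
    (reflect padding at the borders is an illustrative choice)."""
    sq = np.asarray(band, dtype=float) ** 2
    p = win // 2
    padded = np.pad(sq, p, mode="reflect")
    h, w = sq.shape
    return sum(padded[i:i + h, j:j + w]
               for i in range(win) for j in range(win))

def fuse_lowpass(lowA, lowB, win=3):
    """Regional energy-weighted average of two low-frequency subbands."""
    eA, eB = local_energy(lowA, win), local_energy(lowB, win)
    wA = eA / (eA + eB + 1e-12)   # per-pixel weight favours the higher-energy source
    return wA * lowA + (1.0 - wA) * lowB

def fuse_highpass(highA, highB, firesA, firesB):
    """Per pixel, keep the coefficient whose PCNN neuron fired more often
    (firesA/firesB are the accumulated PCNN firing-count maps)."""
    return np.where(firesA >= firesB, highA, highB)
```

In a full pipeline, fuse_lowpass would be applied once to the low-frequency pair and fuse_highpass once per directional high-frequency subband, after which the inverse NSCT yields the fused image.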

Key advantages of this method: NSCT avoids the pseudo-Gibbs artifacts introduced by subsampling, while PCNN adaptively captures detail features with minimal manual parameter tuning. However, computational cost remains high; PCNN's iterative process in particular is time-consuming. The approach therefore suits scenarios that demand high fusion quality but tolerate longer processing times, such as medical image fusion. Future work could explore fast-converging PCNN variants or parallel implementations to improve performance.