Deep learning (DL) has transformed the field of computational imaging, offering powerful tools for boosting performance and tackling long-standing challenges. Traditional methods often rely on discrete pixel representations, which limit resolution and fail to capture the continuous and multiscale nature of physical objects. Recent research from Boston University (BU) presents a novel approach to overcoming these limitations.
As reported in Advanced Photonics Nexus, researchers from BU’s Computational Imaging Systems Lab have introduced a local conditional neural field (LCNF) network that addresses these limitations. Their scalable and generalizable LCNF system is called “neural phase retrieval,” or “NeuPh” for short.
NeuPh leverages advanced DL techniques to reconstruct high-resolution phase information from low-resolution measurements. This method employs a convolutional neural network (CNN)-based encoder to compress captured images into a compact latent-space representation.
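To make the encoding step concrete, the sketch below shows one way a CNN-based encoder can compress a low-resolution measurement into a compact latent feature map. It is written in PyTorch; the layer widths, strides, and the `LatentEncoder` name are illustrative assumptions and do not reproduce the authors’ actual NeuPh architecture.

```python
# Minimal sketch of a CNN encoder that maps a low-resolution intensity
# measurement to a compact latent-space representation, in the spirit of
# the encoder stage described above. All layer sizes are assumptions.
import torch
import torch.nn as nn


class LatentEncoder(nn.Module):
    """Compresses a single-channel low-resolution image into a latent feature map."""

    def __init__(self, in_channels: int = 1, latent_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, latent_dim, kernel_size=3, stride=1, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, H, W) low-resolution measurement
        # returns: (batch, latent_dim, H/4, W/4) compact latent representation
        return self.net(x)


if __name__ == "__main__":
    encoder = LatentEncoder()
    measurement = torch.randn(1, 1, 64, 64)  # dummy low-resolution capture
    latent = encoder(measurement)
    print(latent.shape)  # torch.Size([1, 64, 16, 16])
```

In an LCNF-style pipeline, a coordinate-conditioned decoder would then query this latent representation at continuous spatial locations to reconstruct the high-resolution phase, which is what allows the output to exceed the pixel grid of the raw measurements.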