
Universal programming of 3D point spread functions for imaging

Artistic depiction of Spatial and Spectral Engineering of 3D Point-Spread-Functions Using Diffractive Optical Processors.

GA, UNITED STATES, July 10, 2025 /EINPresswire.com/ -- UCLA researchers have introduced a framework for synthesizing arbitrary, spatially varying 3D point spread functions (PSFs) using diffractive processors. This approach enables unique imaging capabilities—such as snapshot 3D multispectral imaging—without relying on spectral filters, axial scanning, or digital reconstruction methods. The proposed framework could open up transformative possibilities for computational imaging, optical sensing and spectroscopy, as well as 3D optical information processing.

Engineers at the UCLA Samueli School of Engineering introduced a framework for universal point spread function (PSF) engineering, enabling the synthesis of arbitrary spatially varying 3D PSFs using diffractive optical processors. This development represents a significant step toward highly adaptable and programmable optical imaging systems.

PSF engineering plays a significant role in modern microscopy, spectroscopy and computational imaging. Conventional techniques typically employ static phase masks at the pupil plane, which constrain the complexity and flexibility of the achievable PSF structures. The approach developed at UCLA enables arbitrary 3D PSF engineering through a series of passive surfaces optimized using deep learning algorithms, forming a physical diffractive optical processor.
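
To make the idea concrete, here is a minimal, hypothetical sketch of how such passive surfaces can be optimized with deep-learning tools, written in PyTorch with angular-spectrum free-space propagation. It is not the paper's implementation: the grid size, sample pitch, wavelength, layer spacing, layer count, and the single off-axis Gaussian target are placeholder choices for illustration only.

```python
import numpy as np
import torch

# --- Simulation grid (illustrative values, not taken from the paper) ---
N = 128                 # samples per side
dx = 1e-6               # sample pitch [m]
wavelength = 750e-9     # illumination wavelength [m]
z = 500e-6              # axial spacing between surfaces [m]
num_layers = 3          # number of trainable phase-only surfaces

# Angular-spectrum transfer function for free-space propagation over distance z
fx = torch.fft.fftfreq(N, d=dx)
FX, FY = torch.meshgrid(fx, fx, indexing="ij")
arg = 1.0 / wavelength**2 - FX**2 - FY**2
mask = (arg > 0).to(torch.float32)              # discard evanescent components
phase_z = 2 * np.pi * z * torch.sqrt(torch.clamp(arg, min=0.0))
H = mask * torch.exp(torch.complex(torch.zeros_like(phase_z), phase_z))

def propagate(u):
    """Free-space propagation of a complex field using the angular-spectrum method."""
    return torch.fft.ifft2(torch.fft.fft2(u) * H)

# Trainable phase profiles of the passive diffractive surfaces
phases = [(2 * np.pi * torch.rand(N, N)).requires_grad_() for _ in range(num_layers)]

def forward(u_in):
    """Cascade the field through the phase-only surfaces and return output intensity."""
    u = propagate(u_in)
    for phi in phases:
        u = u * torch.exp(torch.complex(torch.zeros_like(phi), phi))  # phase-only modulation
        u = propagate(u)
    return u.abs() ** 2

# A point source at the input plane...
src = torch.zeros(N, N, dtype=torch.complex64)
src[N // 2, N // 2] = 1.0

# ...and an (arbitrary) target PSF: a small off-axis Gaussian spot
yy, xx = torch.meshgrid(torch.arange(N), torch.arange(N), indexing="ij")
target = torch.exp(-((xx - N // 2 - 20.0) ** 2 + (yy - N // 2) ** 2) / (2 * 3.0**2))
target = target / target.sum()

# Gradient-based optimization of the surface phase values
opt = torch.optim.Adam(phases, lr=0.05)
for step in range(500):
    opt.zero_grad()
    out = forward(src)
    loss = torch.mean((out / out.sum() - target) ** 2)
    loss.backward()
    opt.step()
    if step % 100 == 0:
        print(f"step {step:4d}  loss {loss.item():.3e}")
```

The same loop generalizes, in principle, to spatially varying and wavelength-dependent designs by summing the loss over many point sources placed throughout the input volume and over multiple illumination wavelengths.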

Through rigorous numerical simulations, the researchers demonstrated that such diffractive processors can approximate arbitrary linear transformations between the 3D optical intensity distributions within the input and output volumes by synthesizing arbitrary 3D PSFs. This capability allows for precise control of light distribution in three dimensions, enabling sophisticated optical functions at the diffraction limit of light.
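
In the standard notation for incoherent imaging with a spatially varying response (a generic form, not an equation quoted from the paper), this linear mapping can be written as

$$ I_{\mathrm{out}}(\mathbf{r}_o) = \int_{V_{\mathrm{in}}} h(\mathbf{r}_o;\, \mathbf{r}_i)\, I_{\mathrm{in}}(\mathbf{r}_i)\, d^3 r_i $$

where h(r_o; r_i) is the 3D intensity PSF produced at output position r_o by a point source at input position r_i. "Spatially varying" means h may change arbitrarily with r_i rather than being a shifted copy of a single fixed pattern.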

The framework enables advanced imaging modalities, such as snapshot 3D multispectral imaging, without the need for spectral filters, mechanical axial scanning, or digital reconstruction. These capabilities arise from joint spatial and spectral PSF engineering, which makes the framework highly versatile.
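
Conceptually, and again in generic notation rather than the paper's own, the joint spatial-spectral case extends the same linear model with a wavelength-dependent PSF:

$$ I_{\mathrm{out}}(\mathbf{r}_o) = \int \int_{V_{\mathrm{in}}} h_{\lambda}(\mathbf{r}_o;\, \mathbf{r}_i)\, I_{\mathrm{in}}(\mathbf{r}_i, \lambda)\, d^3 r_i\, d\lambda $$

If h_λ is engineered so that different wavelengths and depths are routed to distinct regions of the output plane, a single exposure of a standard monochrome sensor captures the spectral and depth information directly, which is the sense in which filters, axial scanning, and digital reconstruction become unnecessary.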

This work marks a significant stepping stone toward future advances in computational imaging, optical sensing, and 3D optical information processing. Potential applications include compact multispectral imagers, high-throughput 3D microscopy platforms, and novel optical data-encoding systems.

The study was conducted by Dr. Md Sadman Sakib Rahman and Dr. Aydogan Ozcan of the UCLA Electrical and Computer Engineering Department.

DOI
10.1038/s41377-025-01887-x

Original Source URL
https://doi.org/10.1038/s41377-025-01887-x

Funding information
The UCLA researchers acknowledge funding from the US Army Research Office (ARO).

Lucy Wang
BioDesign Research
