Recently, the team of Dorian Bouchet at TU Wien (Vienna University of Technology) in Austria reported a model-free estimation of the Cramér–Rao bound for deep learning microscopy in complex media. The study was published in Nature Photonics on 28 May 2025.
Artificial neural networks have become important tools for harnessing the complexity of disordered or random photonic systems. Recent applications include recovering information from light that has been scrambled during propagation through a complex scattering medium, particularly in the challenging case where the deterministic input-output transmission matrix cannot be measured. This naturally raises the question of what limit information theory imposes on this recovery process, and whether neural networks can actually reach that limit.
To answer these questions, the research team introduced a model-free approach for calculating the Cramér–Rao bound, which sets the ultimate precision limit at which artificial neural networks can operate. As an example, they applied this approach in a proof-of-principle experiment using laser light propagating through a disordered medium, demonstrating that a convolutional network approaches the ultimate precision limit in the challenging task of localizing a reflective target hidden behind a dynamically fluctuating scattering medium. This model-free method is generally applicable for benchmarking the performance of any deep learning microscope, driving algorithmic development, and pushing the precision of metrology and imaging techniques to their ultimate limit.
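To make the idea concrete, here is a minimal Python sketch of how such a bound might be computed without a physical model: it estimates the Fisher information J for a scalar parameter theta (for example, the target position) by finite differences over repeated camera frames, and the Cramér–Rao bound on the variance of any unbiased estimator is then 1/J. The Gaussian, diagonal-covariance noise assumption and all names (estimate_crb, frames_minus, frames_plus) are invented for this illustration and are not taken from the paper, whose actual estimator may differ.

```python
import numpy as np

def estimate_crb(frames_minus, frames_plus, delta_theta):
    """Illustrative model-free Cramér-Rao bound for a scalar parameter theta.

    frames_minus, frames_plus: arrays of shape (n_frames, n_pixels),
    recorded at theta - delta_theta/2 and theta + delta_theta/2.
    Assumes Gaussian noise with diagonal covariance (assumption for
    this sketch only).
    """
    # Mean intensity pattern at each of the two parameter values
    mu_minus = frames_minus.mean(axis=0)
    mu_plus = frames_plus.mean(axis=0)

    # Finite-difference derivative of the mean pattern w.r.t. theta
    dmu = (mu_plus - mu_minus) / delta_theta

    # Per-pixel noise variance, pooled over both frame stacks
    var = 0.5 * (frames_minus.var(axis=0) + frames_plus.var(axis=0))

    # Gaussian Fisher information with diagonal covariance:
    # J = sum_k (d mu_k / d theta)^2 / var_k
    fisher = np.sum(dmu ** 2 / var)

    # Cramér-Rao bound: Var(theta_hat) >= 1 / J for unbiased estimators
    return 1.0 / fisher
```

Benchmarking a deep learning microscope then amounts to comparing the empirical variance of the network's estimates, computed on frames recorded at a fixed, known parameter value, against this bound: a ratio near one indicates that the network operates close to the information-theoretic limit, as the convolutional network in the paper's experiment is shown to do.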
Appendix: Original English text
Title: Model-free estimation of the Cramér–Rao bound for deep learning microscopy in complex media
Author: Starshynov, Ilya, Weimar, Maximilian, Rachbauer, Lukas M., Hackl, Günther, Faccio, Daniele, Rotter, Stefan, Bouchet, Dorian
Published: 2025-05-28
Abstract: Artificial neural networks have become important tools to harness the complexity of disordered or random photonic systems. Recent applications include the recovery of information from light that has been scrambled during propagation through a complex scattering medium, especially in the challenging case in which the deterministic input–output transmission matrix cannot be measured. This naturally raises the question of what the limit is that information theory imposes on this recovery process, and whether neural networks can actually reach this limit. To answer these questions, we introduce a model-free approach to calculate the Cramér–Rao bound, which sets the ultimate precision limit at which artificial neural networks can operate. As an example, we apply this approach in a proof-of-principle experiment using laser light propagating through a disordered medium, evidencing that a convolutional network approaches the ultimate precision limit in the challenging task of localizing a reflective target hidden behind a dynamically fluctuating scattering medium. The model-free method introduced here is generally applicable to benchmark the performance of any deep learning microscope, to drive algorithmic developments and to push the precision of metrology and imaging techniques to their ultimate limit.
DOI: 10.1038/s41566-025-01657-6
Source: https://www.nature.com/articles/s41566-025-01657-6