PaperSwipe

Video-rate gigapixel ptychography via space-time neural field representations

Published Nov 8, 2025 · Version 1 · arXiv:2511.06126

Authors

Ruihai Wang, Qianhao Zhao, Zhixuan Hong, Qiong Ma, Tianbo Wang, Lingzhi Jiang, Liming Yang, Shaowei Jiang, Feifei Huang, Thanh D. Nguyen, Leslie Shor, Daniel Gage, Mary Lipton, Christopher Anderton, Arunima Bhattacharjee, David Brady, Guoan Zheng

Categories

physics.optics, eess.IV

Abstract

Achieving gigapixel space-bandwidth products (SBP) at video rates represents a fundamental challenge in imaging science. Here we demonstrate video-rate ptychography that overcomes this barrier by exploiting spatiotemporal correlations through neural field representations. Our approach factorizes the space-time volume into low-rank spatial and temporal features, transforming SBP scaling from sequential measurements to efficient correlation extraction. The architecture employs dual networks for decoding the real and imaginary field components, avoiding the phase-wrapping discontinuities that plague amplitude-phase representations. A gradient-domain loss on spatial derivatives ensures robust convergence. We demonstrate video-rate gigapixel imaging with centimeter-scale coverage while resolving 308-nm linewidths. Validations span from monitoring the sample dynamics of crystals, bacteria, stem cells, and microneedles to characterizing time-varying probes in extreme ultraviolet experiments, demonstrating versatility across wavelengths. By transforming temporal variations from a constraint into exploitable correlations, we establish that gigapixel video is tractable with single-sensor measurements, making ptychography a high-throughput sensing tool for monitoring mesoscale dynamics without lenses.
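The three ingredients the abstract names — a low-rank space-time factorization, dual real/imaginary decoders, and a gradient-domain loss — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the rank, array shapes, and function names below are illustrative assumptions, and the neural decoders are replaced by plain factor arrays to keep the example self-contained.

```python
import numpy as np

def factorized_field(temporal_feats, spatial_feats):
    """Low-rank space-time factorization (assumed form):
    O(t, y, x) = sum_r a_r(t) * S_r(y, x).
    temporal_feats: (T, R), spatial_feats: (R, H, W) -> (T, H, W)."""
    return np.einsum('tr,rhw->thw', temporal_feats, spatial_feats)

def gradient_loss(pred, target):
    """Loss on spatial finite differences rather than raw values,
    mimicking a gradient-domain objective on spatial derivatives."""
    dx_p, dx_t = np.diff(pred, axis=-1), np.diff(target, axis=-1)
    dy_p, dy_t = np.diff(pred, axis=-2), np.diff(target, axis=-2)
    return np.mean((dx_p - dx_t) ** 2) + np.mean((dy_p - dy_t) ** 2)

rng = np.random.default_rng(0)
R, H, W, T = 4, 8, 8, 5  # illustrative rank and video dimensions

# Dual heads: one factorization decodes the real part of the complex
# field, a second decodes the imaginary part (no phase unwrapping).
S_re = rng.normal(size=(R, H, W))
S_im = rng.normal(size=(R, H, W))
A = rng.normal(size=(T, R))  # shared temporal features

field = factorized_field(A, S_re) + 1j * factorized_field(A, S_im)
print(field.shape)  # (5, 8, 8): T frames reconstructed from T*R + 2*R*H*W numbers
print(gradient_loss(field.real[0], field.real[0]))  # 0.0 for a perfect match
```

The storage argument is visible in the shapes: a rank-R model needs `T*R + 2*R*H*W` parameters instead of `2*T*H*W` raw values, so the cost of adding frames grows with `R`, not with the gigapixel frame size.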
