The Pyramid Wavefront Sensor (PWFS) is one of the preferred choices for measuring wavefront aberrations in adaptive optics for highly sensitive astronomical applications. Despite its inherently high sensitivity, its limited linearity restricts the operational range of the phase estimation. This problem is commonly mitigated by optically modulating the PSF across the pyramid tip. However, modulation relies on movable physical parts that demand additional calibration, and it trades sensitivity for linearity. We present an End-to-End (E2E) trainable scheme that comprises the PWFS propagation model, an optical diffractive layer at the Fourier plane, and a state-of-the-art deep neural network that performs wavefront reconstruction. The physical and digital trainable elements are trained jointly over simulated atmospheric turbulence of varying strength, whose nth-order Zernike decompositions serve as references for the coefficients estimated by our model. We develop a variety of training schemes, varying the turbulence ranges and the balance between the optical and digital layers. Simulation results show an overall improvement in wavefront estimation, even beyond the trained turbulence ranges, improving linearity while largely preserving sensitivity at weak turbulence, and surpassing previous approaches that considered only a single diffractive element and linear wavefront estimation. With simulations displaying encouraging results, we are currently performing experimental closed-loop adaptive optics tests.
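The optical front end of the scheme above can be sketched as a Fourier-optics forward model: a pupil field is propagated to the Fourier plane, where the pyramid phase ramp and an additional (in the full scheme, trainable) diffractive phase mask are applied, and then propagated to the detector. The following minimal NumPy sketch illustrates this pipeline under illustrative assumptions; the grid size, ramp slope, and all names (`pwfs_intensity`, `diffractive_mask`) are hypothetical, not the paper's actual parameters, and no training is shown.

```python
import numpy as np

N = 64  # pupil grid size (assumption)
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
pupil = (x**2 + y**2 < (N // 3) ** 2).astype(float)  # circular aperture

# Four-faced pyramid: piecewise-linear phase ramps in the Fourier plane.
slope = 2 * np.pi / 8  # ramp steepness (arbitrary choice)
pyramid_phase = slope * (np.abs(x) + np.abs(y))

# Extra diffractive layer at the Fourier plane; in the E2E scheme this
# phase map would be a trainable parameter, here initialised to zero.
diffractive_mask = np.zeros((N, N))

def pwfs_intensity(wavefront_phase):
    """Propagate a pupil field through pyramid + diffractive mask."""
    field = pupil * np.exp(1j * wavefront_phase)
    focal = np.fft.fftshift(np.fft.fft2(field))          # pupil -> Fourier plane
    focal *= np.exp(1j * (pyramid_phase + diffractive_mask))
    detector = np.fft.ifft2(np.fft.ifftshift(focal))     # Fourier -> detector plane
    return np.abs(detector) ** 2

I = pwfs_intensity(np.zeros((N, N)))  # flat (aberration-free) wavefront
print(I.shape)
```

In the full E2E scheme, the intensity pattern would be fed to a neural network predicting Zernike coefficients, and gradients of the reconstruction loss would flow back through the (differentiable) propagation into both the network weights and the diffractive mask. Because both masks are phase-only, the model conserves energy between pupil and detector planes.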