In this work, we evaluate a specially crafted deep convolutional neural network that estimates the wavefront aberration modes directly from pyramidal wavefront sensor (PyWFS) images. Overall, the use of deep neural networks improves both the estimation performance and the operational range of the PyWFS, especially under strong turbulence or large D0/r0 ratios (bad seeing). Our preliminary results provide evidence that, by using neural networks instead of the classic linear estimation methods, we can keep the sensitivity response of a low-modulation PyWFS while extending its linearity range, reducing the residual variance by a factor of 1.6 when dealing with an r0 as low as a few centimeters.
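To illustrate the kind of estimation pipeline described above, the following is a minimal sketch (not the network evaluated in this work) of a convolutional regressor that maps a single PyWFS intensity frame to a vector of modal coefficients. The layer sizes, the number of output modes (`n_modes`), and the input resolution are hypothetical placeholders chosen only for the example.

```python
import torch
import torch.nn as nn

class PyWFSModalEstimator(nn.Module):
    """Toy CNN regressor: PyWFS image -> modal coefficients (hypothetical sizes)."""
    def __init__(self, n_modes: int = 50):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # halve spatial resolution
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),               # global pooling, input-size agnostic
        )
        self.head = nn.Linear(64, n_modes)         # regress the modal coefficients

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, H, W) normalized PyWFS intensity frames
        z = self.features(x).flatten(1)
        return self.head(z)

# Usage sketch: the regressor would be trained against known modal coefficients
# (e.g., from simulated phase screens) with a mean-squared-error loss.
model = PyWFSModalEstimator(n_modes=50)
frames = torch.randn(8, 1, 128, 128)               # stand-in for PyWFS images
coeffs = model(frames)                              # (8, 50) estimated mode coefficients
loss = nn.functional.mse_loss(coeffs, torch.zeros_like(coeffs))
```

Such a nonlinear regressor replaces the linear reconstruction matrix of the classic PyWFS pipeline; the specific architecture, training data, and loss used in the work are described in the body of the paper.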