Wavefront sensing and phase retrieval for astronomical imaging
Thesis Discipline: Electrical Engineering
Degree Grantor: University of Canterbury
Degree Name: Doctor of Philosophy
Images of astronomical objects captured by ground-based telescopes are distorted by the Earth's atmosphere. The atmosphere consists of random, time-varying layers of air of differing density and hence refractive index. These refractive index fluctuations aberrate wavefronts propagating through the atmosphere, resulting in a loss of resolution in the astronomical images. The wavefront aberrations induced by the atmosphere can be compensated either by real-time adaptive optics, where a deformable mirror is placed in the optical path, or by computer post-processing algorithms applied to the distorted images. In an adaptive optics system, the wavefront sensor is the element that estimates the wavefront phase aberration. The wavefront cannot be measured directly; instead, an aberration is introduced into the optical path to produce two or more intensity distributions, from which the wavefront slope or curvature can be estimated. Wavefront sensing is one of the topics of this thesis. A number of computer post-processing algorithms exist to deblur astronomical images, such as phase diversity, deconvolution from wavefront sensing (DWFS) and phase retrieval; improvements to the latter two are published in this thesis.

The pyramid wavefront sensor consists of a four-sided glass prism placed in the focal plane of the telescope, which subdivides the focal plane into four sections, and a relay lens that re-images the four sections of the focal plane to form four images of the aperture at the conjugate aperture plane. The wavefront slope is estimated as a linear combination of the aperture images. The pyramid sensor can be generalised to a class of N-sided glass prism wavefront sensors that subdivide the focal plane into N equal sections, forming N aperture images at the conjugate aperture plane. The minimum number of sides required to estimate the slope in two orthogonal directions is three, and the cone sensor is derived by letting N tend to infinity.
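The slope estimate as a linear combination of the four aperture images can be sketched as follows. This is a minimal numpy sketch; the function name and the particular pairing of images (the sign convention) are illustrative assumptions, and in practice depend on the orientation of the prism:

```python
import numpy as np

def pyramid_slopes(I1, I2, I3, I4):
    """Estimate x and y wavefront slopes as linear combinations of the four
    aperture (pupil) images of a pyramid sensor.  The pairing of the images
    below is one common convention, shown for illustration only."""
    total = I1 + I2 + I3 + I4
    safe = np.where(total > 0, total, 1.0)   # avoid dividing by zero outside the pupil
    sx = ((I1 + I2) - (I3 + I4)) / safe      # slope estimate in x
    sy = ((I1 + I3) - (I2 + I4)) / safe      # slope estimate in y
    return sx, sy
```

With equal intensity in all four aperture images, both slope estimates are zero, corresponding to a flat wavefront.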
Simulation results show that in the presence of photon noise, but not read noise, the cone sensor provides the best wavefront estimate. For the pyramid sensor, the wavefront is typically reconstructed from estimates of the wavefront slope in two orthogonal directions. Some information is inherently lost when the four measurements (aperture images) are reduced to two slope estimates. A new method is proposed to reconstruct the wavefront directly from the aperture images, removing the intermediate step of forming the slope estimates. Reconstructing the wavefront directly from the images is shown, through simulation of atmospheric phase screens, to give a better wavefront estimate than reconstructing from the slope estimates. This result holds for all pyramid-type sensors tested.

The pyramid wavefront sensor can be generalised further by placing a lenslet array at the focal plane to subdivide the complex field there into more than four sections. In this framework, the pyramid sensor can be considered the dual of the Shack-Hartmann sensor, which subdivides the aperture plane with a lenslet array: the two sensors subdivide opposite members of a Fourier pair. Both sensors estimate the wavefront slope by applying a centroid operator to low-resolution images, and in both there is a trade-off between the attainable spatial resolution and the accuracy of the slope estimates. This trade-off is set by the size of the lenslets in the array, and is inverted between the two sensors. Open-loop simulation results demonstrate that lenslet arrays at the aperture (Shack-Hartmann) and focal (pyramid) planes provide wavefront estimates of equivalent quality. The lenslet array at the focal plane, however, can be modulated to increase its linear range, and thus provides a better wavefront estimate than the Shack-Hartmann sensor in open-loop simulations.
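The centroid operator common to both sensors can be sketched as below. This is an illustrative numpy sketch for a single lenslet sub-image; measuring the spot displacement relative to the sub-image centre is an assumed convention:

```python
import numpy as np

def centroid(spot):
    """Intensity-weighted centroid of one lenslet spot, measured relative to
    the centre of the sub-image.  The displacement of the spot from the
    reference position is proportional to the local wavefront slope.
    Assumes the spot image has non-zero total intensity."""
    total = spot.sum()
    ys, xs = np.indices(spot.shape)
    cy = (ys * spot).sum() / total - (spot.shape[0] - 1) / 2  # y displacement in pixels
    cx = (xs * spot).sum() / total - (spot.shape[1] - 1) / 2  # x displacement in pixels
    return cx, cy
```

For a point-like spot one pixel to the right of centre, the operator returns a displacement of one pixel in x.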
Phase retrieval is a non-linear iterative technique used to recover the phase in the aperture plane from intensity measurements at the focal plane together with other constraints. A novel phase retrieval algorithm is proposed, which subdivides the focal plane of the telescope with a lenslet array and uses the aperture images formed at the conjugate aperture plane as a magnitude constraint. This algorithm is more heavily constrained than conventional phase retrieval, or phase retrieval in conjunction with the Shack-Hartmann sensor, with constraints applied at three Fourier planes: the aperture, focal and conjugate aperture planes. The subdivision of the focal plane removes the ambiguity, present in other phase retrieval algorithms, between an object A(x,y) and its twin A*(-x,-y), and this is supported by simulation results. Simulation results also show that the performance of the algorithm depends on the starting point, and that starting with the linear estimate from the aperture images gives a better wavefront estimate than starting with zero phase.

DWFS is a computer post-processing algorithm that combines the distorted image with wavefront sensing measurements to compensate the image for atmospheric turbulence. With DWFS, an accurate calibration of the reference positions for the centroids of the Shack-Hartmann sensor is essential for an accurate estimate of the wavefront, and hence of the astronomical object. The conventional method for estimating these reference positions is to image a laser beam through the Shack-Hartmann lenslet array but not through the atmosphere. An alternative calibration technique is to observe a single bright star and optimise the Strehl ratio with respect to the reference positions.
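As background, the iterative projection idea underlying phase retrieval can be illustrated with the classical two-plane Gerchberg-Saxton loop: magnitudes are enforced alternately in the aperture and focal planes while the phase is carried between them. This is a simplified sketch only; the thesis algorithm adds a third constraint plane via the focal-plane lenslet array, which this sketch does not implement:

```python
import numpy as np

def gerchberg_saxton(aperture_mag, focal_mag, start_phase=None, n_iter=50):
    """Classical two-plane Gerchberg-Saxton iteration (illustrative sketch).
    At each iteration the known magnitude is imposed in the focal plane and
    then in the aperture plane, keeping the phase from the previous Fourier
    transform.  start_phase allows a non-zero starting point, mirroring the
    dependence on the initial estimate noted in the text."""
    if start_phase is None:
        start_phase = np.zeros_like(aperture_mag)
    field = aperture_mag * np.exp(1j * start_phase)
    for _ in range(n_iter):
        F = np.fft.fft2(field)
        F = focal_mag * np.exp(1j * np.angle(F))            # focal-plane magnitude constraint
        field = np.fft.ifft2(F)
        field = aperture_mag * np.exp(1j * np.angle(field)) # aperture magnitude constraint
    return np.angle(field)                                  # recovered aperture-plane phase
```

When the two magnitude constraints are mutually consistent, a consistent starting phase is a fixed point of the iteration.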
Results using DWFS on data captured at the Observatoire de Lyon show that this new technique can provide wavefront estimates of similar quality to the conventional grid calibration technique, but without the need for a separate calibration laser.
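The Strehl ratio optimised in the alternative calibration can be computed from a pupil model as the ratio of the peak of the aberrated point-spread function to the peak of the diffraction-limited one. This is a hedged numpy sketch; the FFT-based PSF model and the names are illustrative assumptions, not the thesis's implementation:

```python
import numpy as np

def strehl_ratio(pupil_phase, pupil_mask):
    """Strehl ratio of the PSF formed from a pupil with the given phase
    aberration, relative to the unaberrated (zero-phase) pupil.  Uses a
    simple FFT model of image formation; equals 1 for a flat wavefront
    and is at most 1 for any aberration."""
    def psf_peak(phase):
        field = pupil_mask * np.exp(1j * phase)   # complex field in the pupil
        psf = np.abs(np.fft.fft2(field)) ** 2     # focal-plane intensity
        return psf.max()
    return psf_peak(pupil_phase) / psf_peak(np.zeros_like(pupil_phase))
```

In the calibration scheme described above, the Shack-Hartmann reference positions would be adjusted to maximise a quantity of this kind for a single bright star.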