Digital imaging sensor technology has continued to outpace developments in optical technology in modern imaging systems. As sensor resolution increases, the quality loss attributable to lateral chromatic aberration becomes increasingly significant: other classes of aberration can be mitigated by classical image enhancement (e.g. sharpening), whereas lateral chromatic aberration cannot, and so grows in relative importance. The drive towards higher-performance, lighter lens systems creates a recent need for new ways to overcome the resulting image quality limitations.
This work demonstrates the robust, automatic minimisation of lateral chromatic aberration, recovering the lost image quality in both artificial and real-world images. A series of test images is used to validate the correct functioning of the algorithm, and changes across a series of real-world images are used to evaluate the performance of the approach.
The primary contribution of this work is a novel algorithm that robustly minimises lateral chromatic aberration in both calibration and real-world images; it is broken down into discrete steps and detailed. The basis of the algorithm is the use of chromatic correspondences to iteratively converge on a set of distortion coefficients. Further contributions follow: a second algorithm is developed to correlate correction information with the lens model and its parameters. This information is then stored in a database to allow offline correction of unseen images. Finally, an algorithm is developed to measure image fidelity via spatial frequency analysis, in a way more relevant to how the Human Visual System processes information.
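The core idea of converging distortion coefficients from chromatic correspondences can be illustrated with a minimal sketch. Assuming a standard radial polynomial distortion model in which each colour channel's radius is rescaled to match the green reference channel, the coefficients can be recovered by least-squares fitting over the correspondence radii. The function name, the polynomial order, and the linear least-squares formulation here are illustrative assumptions, not the thesis algorithm itself.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_radial_coefficients(r_green, r_channel, n_coeffs=3):
    """Fit radial distortion coefficients k so that the corrected radius
    r * (1 + k1*r^2 + k2*r^4 + ...) of a colour channel matches the
    green-channel radius at each chromatic correspondence point.
    (Illustrative model, assumed radial-polynomial form.)"""
    def residuals(k):
        # Even powers of the channel radius: r^2, r^4, r^6, ...
        powers = np.stack([r_channel ** (2 * (i + 1)) for i in range(n_coeffs)])
        corrected = r_channel * (1.0 + k @ powers)
        return corrected - r_green

    result = least_squares(residuals, x0=np.zeros(n_coeffs))
    return result.x
```

Because the model is linear in the coefficients, the fit converges reliably from a zero initialisation; in practice the correspondences would come from matched edge locations in the separate colour planes.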
Algorithm validation is conducted through a series of steps to ensure correctness; artificial and real-world images are then analysed, and the quantification algorithm is applied to measure the improvement. Lastly, the performance of the system is compared against prevailing methods and analysed.
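A fidelity measure of the kind described above can be sketched as a weighted comparison of two images' magnitude spectra, where the weight stands in for a contrast sensitivity function that emphasises the mid spatial frequencies the Human Visual System is most sensitive to. The weighting function and metric form here are hypothetical placeholders, not the quantification algorithm developed in this work.

```python
import numpy as np

def spectral_fidelity(reference, test):
    """Compare two greyscale images in the spatial-frequency domain.
    Returns the weighted RMS difference of their magnitude spectra;
    a simple band-pass weight stands in for a contrast sensitivity
    function (hypothetical choice for illustration)."""
    ref_mag = np.abs(np.fft.fftshift(np.fft.fft2(reference)))
    tst_mag = np.abs(np.fft.fftshift(np.fft.fft2(test)))
    h, w = reference.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    f = np.hypot(fx, fy)                    # radial spatial frequency
    weight = f * np.exp(-8.0 * f)           # peaks at mid frequencies
    diff = (ref_mag - tst_mag) * weight
    return np.sqrt(np.mean(diff ** 2))
```

A score of zero indicates identical spectra; larger values indicate greater perceptually weighted spectral divergence, so improvement after correction would show as a drop in the score against a reference image.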