Spectroscopic time domain OCT

Researcher: Alexander Meadway, PhD project, 2007-2010

Funding: New York Eye and Ear Infirmary and University of Kent

Supervisors: A. Podoleanu and R. Rosen

Investigations were carried out into spectral channelling of time domain optical coherence tomography (TD-OCT). Spectroscopic TD-OCT is possible because OCT uses a broadband source. It can be exploited in several ways: for noise reduction, for spectroscopic assessment of a sample, and as a route to functional imaging.

One aspect of the project was the development of a method that can instantaneously measure dispersion mismatch in an OCT system [1]. Dispersion causes the OCT signal to decrease in amplitude and broaden in width, degrading the image. The method is based on spectral TD-OCT: the detected signal is processed in multiple spectral windows. To achieve this, the single detector used in a TD-OCT system is replaced with a linear diode array, with each element detecting a discrete, narrow band of light. Each photodetector drives a low-coherence interferometry (LCI) channel, with all channels operating in parallel.
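The parallel-channel detection can be sketched numerically. The snippet below is a minimal, illustrative model (not the actual system): a Gaussian broadband source is divided into narrow bands, one per diode-array element, and each channel produces a narrowband interferogram whose coherence envelope is much wider than that of the full source. All numerical values (centre wavelength, bandwidth, channel count) are assumptions for illustration.

```python
import numpy as np

# Illustrative model of spectral TD-OCT detection: the source spectrum is
# split into n_ch narrow bands, one per element of the linear diode array.
# Each channel i records a fringe at its centre wavenumber k_i under a
# coherence envelope whose width scales inversely with channel bandwidth.

z = np.linspace(-60e-6, 60e-6, 4001)       # OPD scan range (m), assumed
k0 = 2 * np.pi / 840e-9                    # centre wavenumber (840 nm source, assumed)
dk_src = 0.04 * k0                         # source half-bandwidth (illustrative)
n_ch = 8                                   # number of array elements (illustrative)
k_ch = k0 + np.linspace(-1, 1, n_ch) * dk_src   # channel centre wavenumbers
dk_ch = 2 * dk_src / n_ch                  # per-channel bandwidth

def channel_signal(k_i, z_peak=0.0):
    """Narrowband LCI fringe for one spectral channel.

    With no dispersion mismatch every channel's envelope peaks at the
    same OPD; z_peak allows a channel's envelope to be displaced.
    """
    envelope = np.exp(-((z - z_peak) * dk_ch) ** 2 / 2)  # coherence envelope
    return envelope * np.cos(k_i * z)                    # fringe carrier

signals = np.array([channel_signal(k) for k in k_ch])
print(signals.shape)   # one interferogram per array element
```

With a perfectly balanced interferometer, all eight envelopes peak at the same OPD; the next step of the method exploits the fact that dispersion mismatch breaks this alignment.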

The dispersion mismatch in the interferometer can be measured by analyzing the signals obtained in each spectral channel. Dispersion arises because the refractive index depends on wavelength: if the two arms of the interferometer are mismatched, each wavelength in the source spectrum encounters a different optical path difference (OPD). The mismatch can be evaluated by locating the peak of each channel's correlation function in time. Such a method can find application in locating dispersive elements within the sample and could be extended to determine oxygenation in the retina.
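The peak-locating step can be sketched as follows. This is a hedged toy model of the idea, not the published processing chain: a group-delay mismatch shifts each channel's envelope peak by an amount that grows linearly with the channel's wavenumber offset from the source centre, so fitting a line through the measured peak positions recovers the mismatch. The dispersion coefficient `gdd_slope` and all optical parameters are assumed values.

```python
import numpy as np

# Toy demonstration: with a group-delay-dispersion mismatch, channel i's
# coherence envelope peaks at a different OPD, displaced (to first order)
# linearly in (k_i - k0). Locating each channel's correlation peak and
# fitting a line through (k_i - k0, z_peak_i) estimates the mismatch.

z = np.linspace(-60e-6, 60e-6, 4001)       # OPD scan (m), assumed
k0 = 2 * np.pi / 840e-9                    # centre wavenumber, assumed
dk_src = 0.04 * k0
n_ch = 8
k_ch = k0 + np.linspace(-1, 1, n_ch) * dk_src
dk_ch = 2 * dk_src / n_ch

gdd_slope = 5e-11   # assumed peak shift per unit wavenumber offset (m^2)

# Simulate: each channel's envelope is displaced by gdd_slope * (k_i - k0).
peaks_true = gdd_slope * (k_ch - k0)
signals = [np.exp(-((z - zp) * dk_ch) ** 2 / 2) * np.cos(k * z)
           for k, zp in zip(k_ch, peaks_true)]

# Measure: rectify each channel and take the position of its maximum as
# the correlation-peak location, then fit a line to estimate the mismatch.
peaks_meas = np.array([z[np.argmax(np.abs(s))] for s in signals])
slope, intercept = np.polyfit(k_ch - k0, peaks_meas, 1)
print(f"estimated dispersion slope: {slope:.3e} m^2")
```

In practice the envelope would be extracted more robustly (e.g. via the analytic signal) before peak finding, but the linear dependence of peak OPD on wavenumber is the essence of the measurement.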