Abstract
A differential evolution (DE) algorithm is applied to a recently developed spectroscopic objective function to select wavelengths that optimize the temperature precision of water absorption thermometry. DE reliably finds optima even when many-wavelength sets are chosen from large populations of wavelengths (here, 120 000 wavelengths from a spectrum at 0.002 cm−1 resolution calculated from 16 856 transitions). Here, we study sets of fixed wavelengths in the 7280–7520 cm−1 range. When optimizing the thermometer for performance within a narrow temperature range, the results confirm that the best temperature precision is obtained if all the available measurement time is split judiciously between the two most temperature-sensitive wavelengths. In the wide-temperature-range case (the thermometer must perform throughout 280–2800 K), we find that (1) the best four-wavelength set outperforms the best two-wavelength set by an average factor of 2, and (2) a complete spectrum (all 120 000 wavelengths from 16 856 transitions) is 4.3 times worse than the best two-wavelength set. Key implications for sensor designers include: (1) from the perspective of spectroscopic temperature sensitivity, it is usually sufficient to monitor two or three wavelengths, depending on the sensor's anticipated operating temperature range; and (2) although there is a temperature precision penalty to monitoring a complete spectrum, that penalty may be small enough, particularly at elevated pressure, to justify the complete-spectrum approach in many applications.
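To illustrate the optimization strategy described above, the following is a minimal pure-Python sketch of differential evolution selecting a two-wavelength set within the 7280–7520 cm−1 band. The objective function here is a hypothetical smooth surrogate with two illustrative absorption-like peaks; it is not the spectroscopic temperature-precision objective from the article, and all function names and parameter values are assumptions for demonstration only.

```python
# Minimal DE sketch (hypothetical, not the authors' code): select two
# wavelengths in the 7280-7520 cm^-1 band that maximize a toy objective.
import math
import random

LO, HI = 7280.0, 7520.0  # wavenumber range from the abstract, cm^-1

def toy_sensitivity(wavelengths):
    # Hypothetical stand-in for the temperature-precision objective:
    # two Gaussian "peaks" at invented centers 7340 and 7455 cm^-1.
    return sum(math.exp(-((w - c) / 15.0) ** 2)
               for w in wavelengths for c in (7340.0, 7455.0))

def de_optimize(objective, dim=2, pop_size=20, gens=200, F=0.7, CR=0.9, seed=1):
    rng = random.Random(seed)
    # Random initial population of candidate wavelength sets.
    pop = [[rng.uniform(LO, HI) for _ in range(dim)] for _ in range(pop_size)]
    fit = [objective(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # DE/rand/1 mutation: three distinct members other than i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # ensure at least one mutated component
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    trial.append(min(max(v, LO), HI))  # clip to the band
                else:
                    trial.append(pop[i][j])
            f = objective(trial)
            if f >= fit[i]:  # greedy selection (maximization)
                pop[i], fit[i] = trial, f
    best = max(range(pop_size), key=lambda k: fit[k])
    return pop[best], fit[best]

best_wl, best_val = de_optimize(toy_sensitivity)
```

In practice, each candidate in the population would be scored by evaluating the measured or simulated spectrum at the candidate wavelengths; the DE loop itself is unchanged.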