Open Access
Diffraction-based hyperspectral snapshot imager
27 January 2022
Robin Hahn, Tobias Haist, Kristina Michel, Wolfgang Osten
Abstract

We present a method of hyperspectral snapshot imaging that is based on diffraction in an intermediate image plane. In this intermediate image plane, diffracting microstructures are used to deflect light toward an iris that performs spectral filtering. The method is related to imaging spectrometry with a reduced spectral resolution. However, the spectral shifts of the signal are detected very accurately at a potentially low cost. Our prototype system is capable of detecting a spectral shift of 0.5 nm, and the spectral operating range reaches from 510 to 700 nm. Application-specific arrays of microgratings are lithographically manufactured and lead to custom spatial and spectral sampling of an (intermediate) image. Compared with filter-based hyperspectral snapshot imaging, this approach avoids the difficult manufacturing of the mosaic filter array, but the spatial and spectral resolutions become coupled. This leads to an uncertainty product for spatial and spectral resolutions. We explain the basic principle and show an experimental verification based on a first laboratory prototype.

1. Introduction

Multi- and hyperspectral imaging (HSI) are important technologies for a variety of applications. The first setups were realized as early as 1869 by Janssen1 for the observation of the solar corona. However, the technology was largely ignored until the 1980s due to a lack of detectors. Since then, the number of applications and technical approaches grew steadily.2,3

In addition to the well-established scanning line spectrometers, the so-called pushbroom systems, different single-shot or snapshot sensors have been proposed. In these sensors, a hyperspectral image of a two-dimensional (2D) scene is captured using a single exposure of an image sensor. A good review was given by Hagen and Kudenov.2

A commercially available approach is the spectral resolving filter array (SRFA). However, these detectors are rather expensive due to their complex fabrication.4,5 They are also particularly susceptible to fabrication tolerances.6,7 Another approach, proposed by Bacon8 and Courtes et al.,9 relies on spatial sampling using a microlens array in an intermediate image plane. This allows one to generate and capture a spectrum for each aperture on a conventional image sensor. Computed tomographic imaging spectrometry (CTIS) uses diffractive elements to generate multiple, overlapping, and spectrally separated copies of a scene. Numerical reconstruction, e.g., using expectation maximization algorithms, then leads to the final hyperspectral image.10

In this contribution, we propose a new method for snapshot HSI. The approach also relies on diffraction, but we use microgratings located in an intermediate image plane to achieve HSI. This leads to a relatively easy application-specific implementation. One of the main advantages of the proposed sensing principle is a quite large freedom of choice concerning spatial and spectral sampling in arbitrary image patches. Furthermore, compared with the IMEC SRFA sensor investigated by Hahn et al.,6,7 the method is superior in the detection of spectral shifts, is less sensitive to manufacturing tolerances, does not necessarily need a pixelwise calibration, is more robust to varying ray angles, and does not require any further spectral filters for proper operation. However, the proposed method leads to broader spectral channels at a given spatial resolution.

The method could be of interest in food quality control,11 irrigation control in agriculture,12 monitoring of vital parameters in medicine,13 determination of human tissue,14 or optical surface metrology.15

In Sec. 2, we present the basic principle and derive a signal model that leads to an uncertainty relation for the spatial and spectral resolutions in Sec. 3. In Sec. 4, we introduce our laboratory setup for a first experimental demonstration of the sensor principle before showing measurements in Sec. 5. A conclusion is given in Sec. 6.

2. Sensor Concept

The working principle of the method is shown in Fig. 1. The scene to be analyzed is imaged onto an intermediate image plane using an image-sided telecentric imaging system in which an application-specific pattern of microgratings is located. This pattern is realized as a lithographically fabricated diffractive optical element (DOE). The local micrograting period d then determines the deflection angle of the ray bundles for a given wavelength λ. The chief ray is deflected according to the grating equation sinϕ=λ/d. The intermediate image is imaged onto the image sensor by the HSI unit, which is realized by a double-sided telecentric system. In principle, object-sided telecentricity is sufficient, but for most image sensors, it is advantageous if the angle of incidence of the light is small. The aperture stop D2 is used for spectral filtering.

Fig. 1

Principle of the hyperspectral detection. Microgratings in an intermediate image plane deflect the light toward a spectral filtering iris. Each micrograting is optimized for an individual application-dependent wavelength.


As shown in Fig. 1, only parts of the spectrum can pass the aperture (green solid line). The blocked wavelength (red dashed line) can pass the aperture from other field positions, if the microgratings at these positions result in the suitable diffraction angle for the wavelength (red solid line). All microgratings with identical grating periods lead to the same angular deviation of the ray bundles and, therefore, the same spectral band being able to pass the aperture. On the monochrome image sensor, only the areas of the image with a wavelength that matches the design wavelength of the grating appear bright.

Of course, the width of the aperture stop determines the bandwidth for the spectral filtering. On the other hand, it also determines the lateral resolution of the imaging of the intermediate image onto the image sensor. In addition, the exit pupil diameter of the first imaging (object to intermediate image plane) should exactly fit the entrance pupil diameter of the HSI system. In other words, the aperture stop D1 of the first imaging system has to be imaged onto the aperture stop of the HSI system.
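
To make this filtering geometry concrete, the following minimal Python sketch evaluates the grating equation and the resulting chief-ray shift in the aperture plane. The grating period, focal length, and aperture diameter are illustrative values only, not the exact prototype design.

```python
import numpy as np

d = 2.5e-6   # local grating period of one micrograting [m] (illustrative)
f1 = 0.2     # focal length of the lens behind the DOE [m] (illustrative)
D2 = 2.0e-3  # diameter of the filtering aperture stop [m] (illustrative)

def deflection_angle(wavelength, period):
    """First-order deflection angle from the grating equation sin(phi) = lambda / d."""
    return np.arcsin(wavelength / period)

lam0 = 600e-9                     # wavelength steered to the aperture centre by this grating
phi0 = deflection_angle(lam0, d)

for lam in (590e-9, 600e-9, 610e-9, 640e-9):
    # lateral position of the chief ray in the aperture plane relative to the design wavelength
    dx = f1 * (np.tan(deflection_angle(lam, d)) - np.tan(phi0))
    status = "passes" if abs(dx) <= D2 / 2 else "blocked"
    print(f"{lam*1e9:3.0f} nm: shift {dx*1e6:+8.1f} um -> {status}")
```

Wavelengths whose chief ray lands outside the aperture radius are suppressed, which is exactly the spectral filtering performed by the stop D2.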

3. Signal Model

In the following, we derive a one-dimensional (1D) estimation of the resulting spectral response for imaging a point (X,Y) in the object plane. We assume that, at a certain wavelength λ0, the light will pass the iris of the second imaging system (see Fig. 1).

A shift of the wavelength Δλ leads to a shift Δx of the ray bundle on the filter plane. Δx is approximated as

Eq. (1)

\Delta x = f_1 \tan\Delta\phi \approx f_1 \sin\Delta\phi \approx \frac{f_1}{d}\,\Delta\lambda,
where f1 denotes the focal length of the lens behind the DOE, Δϕ the change of the diffraction angle, and d the grating period. Note that the approximation used in Eq. (1) is only valid for small angles.

For simplicity, we first describe the system in a 1D model. In this model, we can describe the filtering aperture with diameter D2 by a rect function with width D2. The amplitude distribution due to a certain object point also has a width of D2 for one wavelength λ0.

At a certain wavelength λ=λ0+Δλ, the corresponding rectangular amplitude distribution is shifted (dispersion at the grating) by Δx. If we are interested in the spectral full-width half-maximum (FWHM), the allowed shift is given as

Eq. (2)

\frac{D_2}{2} = \Delta x = \frac{f_1}{d}\,\Delta\lambda.

The FWHM corresponds to ±Δλ; therefore,

Eq. (3)

\mathrm{FWHM} = 2\cdot\Delta\lambda = \frac{D_2\, d}{f_1}.

For an arbitrary Δλ, the amplitude distribution Φ(Δλ) after the filter is

Eq. (4)

\Phi(\Delta\lambda, d) = \int \mathrm{rect}\!\left(\frac{x}{D_2}\right)\cdot \mathrm{rect}\!\left(\frac{x-\Delta x}{D_2}\right)\mathrm{d}x

Eq. (5)

= \int \mathrm{rect}\!\left(\frac{x}{D_2}\right)\cdot \mathrm{rect}\!\left(\frac{x}{D_2} - \frac{f_1}{D_2\, d}\,\Delta\lambda\right)\mathrm{d}x.

With x′ := x/D2, we obtain

Eq. (6)

\Phi(\Delta\lambda, d) = \int \mathrm{rect}(x')\cdot \mathrm{rect}\!\left(x' - \frac{f_1}{D_2\, d}\,\Delta\lambda\right)\mathrm{d}x'.

By considering the y-axis symmetry of the rect function, rect(x) = rect(−x), it follows that

Eq. (7)

\Phi(\Delta\lambda, d) = \int \mathrm{rect}(x')\cdot \mathrm{rect}\!\left(\frac{f_1}{D_2\, d}\,\Delta\lambda - x'\right)\mathrm{d}x'

Eq. (8)

= \mathrm{rect}\!\left(\frac{f_1}{D_2\, d}\,\Delta\lambda\right) * \mathrm{rect}\!\left(\frac{f_1}{D_2\, d}\,\Delta\lambda\right)

Eq. (9)

= \mathrm{triang}\!\left(\frac{f_1}{D_2\, d}\,\Delta\lambda\right),
where we denote convolution as

Eq. (10)

f(x) = g(x) * h(x) := \int g(x')\, h(x - x')\,\mathrm{d}x',
and again the FWHM is given by Eq. (3).
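
The derivation can also be checked numerically. The following sketch evaluates the overlap integral of Eq. (4) directly and compares the resulting half-width with Eq. (3); the parameter values are illustrative.

```python
import numpy as np

# Illustrative parameters (not the prototype design values).
f1 = 0.2          # focal length behind the DOE [m]
D2 = 2.0e-3       # filtering aperture width [m]
d = 2.5e-6        # grating period [m]

x = np.linspace(-2 * D2, 2 * D2, 20001)
dx_step = x[1] - x[0]
rect = lambda u: (np.abs(u) <= 0.5).astype(float)

def phi(dlam):
    """Overlap integral of the aperture rect with the dispersion-shifted rect, Eq. (4)."""
    shift = f1 / d * dlam                       # Eq. (1), small-angle approximation
    return np.sum(rect(x / D2) * rect((x - shift) / D2)) * dx_step

dlams = np.linspace(-40e-9, 40e-9, 401)
response = np.array([phi(dl) for dl in dlams])
response /= response.max()

# FWHM read off the sampled (triangular) response vs. the analytic value of Eq. (3).
fwhm_numeric = dlams[response >= 0.5][-1] - dlams[response >= 0.5][0]
print(f"numeric FWHM : {fwhm_numeric*1e9:5.1f} nm")
print(f"Eq. (3) FWHM : {D2*d/f1*1e9:5.1f} nm")
```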

With the 1D lateral resolution, corresponding to the well-known resolution limit 0.61·λ/NA for circular apertures, the resolution in the object plane is

Eq. (11)

\Delta X = 1.0 \cdot \frac{\lambda\, f_1}{D_2}.

We then find the uncertainty relation for spectral and lateral resolution

Eq. (12)

\Delta X \cdot \mathrm{FWHM} = \lambda \cdot d.

For a lateral resolution of ΔX=50μm and a grating period d of 2.5μm, we obtain an FWHM of 30 nm at λ=600 nm.
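
As a quick cross-check of this example, Eq. (12) can be evaluated directly; a minimal sketch with the values just quoted:

```python
# FWHM = lambda * d / delta_X, from the uncertainty relation of Eq. (12).
lam = 600e-9       # wavelength [m]
d = 2.5e-6         # grating period [m]
delta_X = 50e-6    # targeted lateral resolution [m]

fwhm = lam * d / delta_X
print(f"FWHM = {fwhm*1e9:.0f} nm")   # -> 30 nm, as stated in the text
```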

One has to be careful about the interpretation of ΔX. This is the physical resolution defined by the optical system. However, if we want to use, e.g., nine (3×3) spectral channels, the spatial resolution would be effectively—as with the typical Bayer-like mosaics—reduced, in this example by a factor of at least three.

For a 2D imaging model, one has to replace the rect functions by circ functions. The triang result of Eq. (9) then becomes an arccos function, similar to the optical transfer function for aberration-free incoherent imaging for a circular pupil.16 The difference, however, is negligible, so in the following we continue using Eq. (9). The 1D model also stays valid if slit apertures perpendicular to the dispersion direction are used in the setup. This will result in having the resolution as described in Eq. (12) along one sensor axis, while the other axis stays unaffected by the filtering.

The result of Eq. (9) corresponds to the case for an extended grating. However, if we use a hologram consisting of a pixelated pattern of microgratings, additional complications arise at the transition from one grating to the next. Due to the limited spatial resolution of the imaging, a point in the image plane receives light from an extended patch in the object plane. As shown in Fig. 2, this patch might fall onto the transition between two microgratings. The point receives light from two different gratings and, therefore [Eq. (9)], with two different spectral distributions.

Fig. 2

Imaging a point close to the transition between two microgratings. Due to the limited spatial resolution, a spectral and spatial blur occurs.


As a result, we have the weighted sum Γ of two spectral distributions. The weight is proportional to the corresponding (incoherent) point spread function of the imaging:

Eq. (13)

\Gamma(\Delta\lambda) = \int_{x_0}^{x_1} \mathrm{PSF}(x)\cdot\Phi(\Delta\lambda, d_1)\,\mathrm{d}x + \int_{x_1}^{x_2} \mathrm{PSF}(x)\cdot\Phi(\Delta\lambda, d_2)\,\mathrm{d}x

Eq. (14)

= \Phi(\Delta\lambda, d_1)\cdot\int_{x_0}^{x_1} \mathrm{PSF}(x)\,\mathrm{d}x + \Phi(\Delta\lambda, d_2)\cdot\int_{x_1}^{x_2} \mathrm{PSF}(x)\,\mathrm{d}x,
where x0 and x2 correspond to the overall extension of the PSF and x1 denotes the position of the transition between the gratings d1 and d2. For aberration-free imaging, one can use the incoherent PSF, i.e., the Airy function.

This leads to a noteworthy broadening of the spectral response only if the grating periods are considerably different and the gratings are small. For larger gratings (compared with the PSF extension in object space), the broadening is only visible at the position of the transitions.
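
The following sketch illustrates Eq. (14) for a point whose PSF straddles a grating transition. For simplicity it uses a Gaussian stand-in for the Airy PSF and assumed grating periods and aperture geometry; only the PSF-energy weighting of the two triangular responses is taken from the model above.

```python
import numpy as np

f1, D2 = 0.2, 2.0e-3                   # focal length and aperture width [m] (assumed)
sin_phi_c = 0.24                       # deflection that hits the aperture centre (assumed)
d1, d2 = 2.45e-6, 2.55e-6              # periods of the two neighbouring microgratings [m]

def response(lam, d):
    """Triangular single-grating response of Eq. (9), centred at lam0 = sin_phi_c * d."""
    lam0 = sin_phi_c * d
    return np.clip(1.0 - np.abs(f1 / (D2 * d) * (lam - lam0)), 0.0, None)

def gamma(lam, x1, psf_sigma=20e-6):
    """Eq. (14): PSF-energy-weighted sum of the two grating responses.
    x1 is the transition position relative to the PSF centre [m]."""
    x = np.linspace(-5 * psf_sigma, 5 * psf_sigma, 4001)
    psf = np.exp(-x**2 / (2 * psf_sigma**2))
    psf /= psf.sum()
    w1 = psf[x < x1].sum()             # fraction of the PSF energy falling on grating d1
    return w1 * response(lam, d1) + (1 - w1) * response(lam, d2)

lams = np.linspace(560e-9, 660e-9, 501)
on_transition = np.array([gamma(l, x1=0.0) for l in lams])    # point on the transition
inside_d1 = np.array([gamma(l, x1=80e-6) for l in lams])      # point well inside grating d1
print("FWHM on transition  :", np.ptp(lams[on_transition >= on_transition.max()/2])*1e9, "nm")
print("FWHM inside grating :", np.ptp(lams[inside_d1 >= inside_d1.max()/2])*1e9, "nm")
```

The response on the transition is roughly twice as broad as the single-grating response, which is the broadening discussed above.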

4. Experimental Verification

We realized an exemplary setup with nine spectral channels designed for a wavelength range between 510 and 700 nm. The DOE consists of a repeating stripe pattern (microgratings) as shown in Fig. 3. A block of nine microgratings forms a so-called macrograting.

Fig. 3

Schematic drawing of the DOE. A so-called macrograting, which consists of nine microgratings, is repeated along one spatial axis. Each micrograting defines the spectral range to be detected by the pixel corresponding to that micrograting.


An optics simulation was performed using Zemax®. The numerical aperture was set to 0.005. For the simulation, a grating with a constant frequency of 0.425 lines/μm was used, leading to a diffraction angle of 14.79 deg for a wavelength of 600 nm. For collimation, an achromatic lens with a focal length of 200 mm (Thorlabs AC254-200-A) was inserted, leading to a beam diameter of 2.00 mm, which defines the size of the spectral filtering aperture D2. The last lens, a 100-mm achromatic lens (Thorlabs AC254-100-A), is used to focus the light onto the image sensor. Based on the simulation, a spectral resolution of ~21 nm FWHM for the system is expected.
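
The quoted design numbers follow from the simple grating and lens relations used above; the sketch below reproduces them (it does not reproduce the Zemax simulation itself, and the analytic FWHM of Eq. (3) is only a small-angle estimate).

```python
import numpy as np

nu = 0.425e6            # grating frequency [lines/m] -> period d = 1/nu
d = 1 / nu
lam = 600e-9            # central wavelength [m]
f_collim = 0.200        # collimating achromat, AC254-200-A [m]
NA = 0.005              # numerical aperture of the first imaging stage

phi = np.degrees(np.arcsin(lam / d))                    # grating equation
print(f"diffraction angle   : {phi:.2f} deg")           # ~14.8 deg, as stated

D2 = 2 * f_collim * NA                                  # beam diameter at the aperture
print(f"aperture diameter D2: {D2*1e3:.2f} mm")         # ~2.00 mm

fwhm = D2 * d / f_collim                                # Eq. (3), small-angle estimate
print(f"analytic FWHM       : {fwhm*1e9:.1f} nm")       # ~24 nm vs. ~21 nm from Zemax
```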

According to the Rayleigh criterion

Eq. (15)

\Delta X = 0.61 \cdot \frac{\lambda}{\mathrm{NA}},
the spatial resolution in the intermediate image plane is ΔX = 73.2 μm for λ = 600 nm.

The width of the microgratings is set to 70 μm in our laboratory setup, which is close to the spatial resolution limit for the central wavelength. The DOE size of 10 mm × 10 mm leads to 142 microgratings and 15 macrogratings.
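
These layout figures follow from Eq. (15) and simple arithmetic, e.g.:

```python
lam = 600e-9
NA = 0.005
delta_X = 0.61 * lam / NA                       # Eq. (15), Rayleigh criterion
print(f"spatial resolution : {delta_X*1e6:.1f} um")     # -> 73.2 um

doe_size = 10e-3                                # DOE side length [m]
micro_width = 70e-6                             # width of one micrograting [m]
n_micro = int(doe_size // micro_width)          # -> 142 microgratings
n_macro = n_micro // 9                          # 9 channels per macrograting -> 15
print(f"microgratings      : {n_micro}")
print(f"macrogratings      : {n_macro}")
```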

A monochrome image sensor (Ximea MQ013MG-E2) with an active area of 6.9 mm × 5.5 mm and a pixel size of 5.3 μm was used. The response of the used sensor is shown in Fig. 4.

Fig. 4

Response of the used Ximea camera.17


The imaging system generating the intermediate image consists of a 40-mm lens and a 100-mm lens. The aperture is placed in the object-sided focal plane of the second lens to generate image-sided telecentric imaging. The diameter of the aperture D1 is adjusted to fit the targeted NA of the HSI system such that the filtering aperture fits the beam diameter.

5. Sensor Validation

To characterize the system, a USAF target (Thorlabs R1DS1P) was illuminated by a fiber-coupled broadband halogen lamp (Helmut Hund GmbH, FLQ 150) in combination with different bandpass filters. Figure 5(a) shows a measurement without any spectral filter. This and all other measurements shown in this paper consist of 50 averaged individual measurements to reduce noise. Due to the broadband illumination, the complete scene can be seen. The illumination spectrum is shown in Fig. 5(b) by the solid red line.

Fig. 5

(a) HSI measurement taken with a broadband illumination and (b) signal along the marked lines in (a) depicted together with the reference spectra.


The visible fringes in the measurement [Fig. 5(a)] originate from the spectral intensity distribution of the light source, which decreases significantly with increasing wavelength. Figure 5(b) shows the signals along the marked lines in Fig. 5(a) as well as the spectrum of the light source captured by a spectrometer. All signals were normalized and show a similar behavior. The differences between the spectrum and the measurement are, on the one hand, caused by manufacturing irregularities of the gratings, resulting in different diffraction efficiencies for the gratings. On the other hand, the spectral resolution of the system is not high enough to resolve the small details of the illumination spectrum.

The smallest resolvable element of the USAF target has a spatial frequency of 28.50 lp/mm. Taking into account the magnification of the imaging system, the resolution is 87 μm. This is close to the theoretical resolution of 85 μm for 700 nm [Eq. (15)].

Figure 6 shows the same scene as shown in Fig. 5(a) but with a spectrally filtered illumination. The employed bandpass filter has a central wavelength of 620 nm and a bandwidth of 10 nm. In Fig. 6, only the areas where the microgratings match the wavelength of the illumination appear bright.

Fig. 6

(a) HSI measurement captured with an inserted bandpass filter with a central wavelength of 620 nm and a bandwidth of 10 nm. (b) The illumination spectrum.


For the evaluation of the wavelength sensitivity, the bandpass filter was successively tilted, such that the center of mass (COM) of the spectrum changed from 623.0 to 622.5, 621.4, and 619.0 nm. The shift was observed with a spectrometer. The illumination spectra are shown in Fig. 7(a).

Fig. 7

(a) Illumination spectra for the different signals shown in (b) and (c). The example measurements in (b) and (c) were performed along the y axis at a single position inside the (b) green and (c) red marked areas in Fig. 6, respectively. The different curves were obtained by bandpass filtering of a halogen light source controlled by a commercial spectrometer.


Figures 7(b) and 7(c) show typical signals acquired along the y axis of the sensor on an example position in the green and red marked areas given in Fig. 6(a), respectively. As can be seen, the peak location changes for the different wavelengths.

Figure 8 shows the variation of the COM for the different illumination spectra with respect to the spectrum with a COM of 623.0 nm. The threshold for the values considered for the calculation of the COM was set to 50% of the maximum of each evaluated column. In Fig. 6, the evaluated areas for Figs. 8(a) and 8(b) are marked in green and red, respectively. Each area consists of 200 columns along the x axis. As can be seen, the amount of displacement in both the red and green highlighted areas is overlaid by a periodic structure. The amplitude of this structure increases with stronger COM displacement. The origin of this effect is unclear and needs to be further researched. In addition, a different trend can be observed in the two plots. This could be caused by possible aberrations of the optical system or by the DOE itself. Both the periodic pattern and the different trend should be easily corrected by calibration.
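
The column-wise evaluation can be summarized by the following sketch, which computes the thresholded COM per column for two measurements; the data here are synthetic stand-ins, not the recorded images.

```python
import numpy as np

def column_com(image, threshold=0.5):
    """COM of each column along the y axis, using only values >= threshold * column maximum."""
    coms = np.empty(image.shape[1])
    rows = np.arange(image.shape[0])
    for j in range(image.shape[1]):
        col = image[:, j].astype(float)
        mask = col >= threshold * col.max()
        coms[j] = np.sum(rows[mask] * col[mask]) / np.sum(col[mask])
    return coms

# Hypothetical data: two noisy measurements whose peak along y is shifted by ~0.8 px.
rng = np.random.default_rng(0)
rows = np.arange(200)[:, None]
image_a = np.exp(-(rows - 100.0)**2 / 50.0) + 0.02 * rng.random((200, 50))
image_b = np.exp(-(rows - 100.8)**2 / 50.0) + 0.02 * rng.random((200, 50))

shift = column_com(image_b) - column_com(image_a)
print(f"mean COM shift: {shift.mean():.2f} px, std: {shift.std():.2f} px")
```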

Fig. 8

Change of the COM for 200 pixel columns inside the (a) green and (b) red marked areas in Fig. 6(a).


Table 1 shows the average change of the COM as well as the standard deviation for the change inside the red and green areas, respectively.

Table 1

Average shift and standard deviation for the COM shift for different wavelength shifts inside the red and green marked areas in Fig. 6(a).

Wavelength shift (nm)   Green area                      Red area
                        Average (px)  Std. dev. (px)    Average (px)  Std. dev. (px)
623.0 to 622.5          0.80          0.06              0.89          0.10
623.0 to 621.4          1.80          0.14              2.03          0.19
623.0 to 619.0          3.20          0.22              3.56          0.28
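
From the tabulated values, a rough sensitivity in pixels per nanometer of spectral shift can be estimated; the least-squares fit below is our own illustration and not part of the original evaluation.

```python
import numpy as np

dlam = np.array([0.5, 1.6, 4.0])            # applied COM shifts of the spectrum [nm]
green = np.array([0.80, 1.80, 3.20])        # measured COM shifts, green area [px]
red = np.array([0.89, 2.03, 3.56])          # measured COM shifts, red area [px]

for name, y in (("green", green), ("red", red)):
    slope = np.sum(dlam * y) / np.sum(dlam**2)   # least-squares slope through the origin
    print(f"{name:5s}: ~{slope:.2f} px per nm of spectral shift")
```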

For the evaluation of the spectral resolution and for the validation of Eq. (12), the system was equipped with a grating that has a period of 2 μm over the whole field. The spectral resolution, defined as the minimal distinguishable distance between two wavelengths, is given by the half-width of the signal under monochromatic illumination. The spectral resolution should not be confused with the detection of a spectral shift studied in the previous section. For the illumination of the system, a monochromator that allows spectrally tunable illumination from 450 to 550 nm was used. The half-width of the illumination spectrum is about 10 nm. The illumination was systematically tuned in 1-nm steps.

As shown in Fig. 9, the signal varies with illumination as expected. The half-width is 18 nm, which is our spectral resolution Δλ. Using Eq. (12), a theoretical lateral resolution ΔX of 55.5 μm in the intermediate image plane can be expected for a wavelength λ = 500 nm. By looking at a USAF target with an illumination of 500 nm, a structure with 40.3 lp/mm can be resolved. This corresponds to a resolution of 62 μm in the intermediate image plane.
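
The numbers quoted here can be reproduced as follows; the magnification of 2.5 used for the conversion of the USAF result is an assumption based on the 40-mm and 100-mm lenses of the first imaging stage.

```python
lam = 500e-9                                    # illumination wavelength [m]
d = 2e-6                                        # grating period [m]
fwhm = 18e-9                                    # measured spectral half-width [m]

delta_X = lam * d / fwhm                        # Eq. (12) rearranged
print(f"predicted lateral resolution : {delta_X*1e6:.1f} um")    # ~55.5 um

lp_per_mm = 40.3                                # smallest resolved USAF element
magnification = 100 / 40                        # assumed, from the 40-mm / 100-mm lenses
resolved = 1e-3 / lp_per_mm * magnification     # line-pair width scaled to the intermediate image
print(f"measured lateral resolution  : {resolved*1e6:.1f} um")   # ~62 um
```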

Fig. 9

Sensor response of a single pixel on the image sensor as a function of the illumination wavelength. A grating with a constant period over the whole field was used.


6. Conclusion

We have shown a new approach for hyperspectral snapshot imaging based on microgratings in an intermediate image plane. An individually manufactured DOE in combination with an aperture performs the spectral filtering. Compared with conventional Fabry–Perot-based mosaic filters, fabrication is simpler, and a stable filter response is automatically achieved. Most important is the possibility of choosing arbitrary spatiospectral patterns for a given application.

These advantages are achieved at the cost of a coupling of the spatial and spectral resolutions along the direction of the spectral separation. However, it should be mentioned that the spectral resolution is not always the relevant parameter. For some applications, e.g., chromatic confocal microscopy, the exact determination of the center of gravity of a broad spectral distribution is the relevant specification. For such applications, we demonstrated measurements of spectral shifts as small as 0.5 nm with our proof-of-principle system covering a wavelength range from 510 to 700 nm with nine channels. This is analogous to human color vision, in which quite broad spectral channels nevertheless allow excellent color discrimination in the yellow-green spectral range. Therefore, the proposed method is especially suited for applications in which broad and overlapping spectral channels can be employed or are advantageous.

We also note that for many applications some restoration of spatial resolution might be achieved by postprocessing with common demosaicing algorithms. However, since the limit of spatial resolution in the proposed method remains coupled to the spectral resolution, the achievable resolution enhancement will probably be strongly limited.

Acknowledgments

We thank the German Federal Ministry for Economic Affairs and Energy (BMWi) for financial support within the ZIM project “Mobimik” (Grant No. 16KN075722). All of the authors’ institutions received funding from the Federal Ministry for Economic Affairs and Energy (BMWi). Apart from these grants, this submission was prepared free of other conflicts of interest.

References

1. P. Janssen, “Sur la méthode qui permet de constater la matière protubérantielle sur tout le contour du disque solaire,” C. R. Acad. Sci. 68, 713–715 (1869).
2. N. Hagen and M. W. Kudenov, “Review of snapshot spectral imaging technologies,” Opt. Eng. 52(9), 090901 (2013). https://doi.org/10.1117/1.OE.52.9.090901
3. N. Hagen, “Snapshot advantage: a review of the light collection improvement for parallel high-dimensional measurement systems,” Opt. Eng. 51(11), 111702 (2012). https://doi.org/10.1117/1.OE.51.11.111702
4. B. Geelen, N. Tack, and A. Lambrechts, “A compact snapshot multispectral imager with a monolithically integrated per-pixel filter mosaic,” Proc. SPIE 8974, 89740L (2014). https://doi.org/10.1117/12.2037607
5. P. Agrawal et al., “Characterization of VNIR hyperspectral sensors with monolithically integrated optical filters,” in IS&T Int. Symp. Electron. Imaging Sci. and Technol., 1–7 (2016).
6. R. Hahn et al., “Detailed characterization of a hyperspectral snapshot imager for full-field chromatic confocal microscopy,” Proc. SPIE 11352, 213–226 (2020). https://doi.org/10.1117/12.2556797
7. R. Hahn et al., “Detailed characterization of a mosaic based hyperspectral snapshot imager,” Opt. Eng. 59(12), 125102 (2020). https://doi.org/10.1117/1.OE.59.12.125102
8. R. Bacon, “The integral field spectrograph TIGER: results and prospects,” Tridimens. Opt. Spectrosc. Methods Astrophys., ASP Conf. Ser. 71, 239–249 (1994). https://doi.org/10.1017/S0252921100023058
9. G. Courtes et al., “A new device for faint objects high resolution imagery and bidimensional spectrography,” in Instrumentation for Ground-Based Optical Astronomy, 266–274, Springer, New York (1988).
10. T. Okamoto and I. Yamaguchi, “Simultaneous acquisition of spectral image information,” Opt. Lett. 16(16), 1277–1279 (1991). https://doi.org/10.1364/OL.16.001277
11. C. Platias et al., “Snapshot multispectral and hyperspectral data processing for estimating food quality parameters,” in 9th Workshop on Hyperspectral Image and Signal Process.: Evol. in Remote Sens. (WHISPERS), 1–4 (2018). https://doi.org/10.1109/WHISPERS.2018.8747009
12. L. M. Dale et al., “Hyperspectral imaging applications in agriculture and agro-food product quality and safety control: a review,” Appl. Spectrosc. Rev. 48(2), 142–159 (2013). https://doi.org/10.1080/05704928.2012.705800
13. T. Haist et al., “Kamerabasierte Erfassung von Vitalparametern” [Camera-based acquisition of vital parameters], Tech. Mess. 86(7-8), 354–361 (2019). https://doi.org/10.1515/teme-2019-0019
14. E. L. Wisotzky et al., “Validation of two techniques for intraoperative hyperspectral human tissue determination,” Proc. SPIE 10951, 109511Z (2019). https://doi.org/10.1117/12.2512811
15. M. Taphanel and J. Beyerer, “Fast 3D in-line sensor for specular and diffuse surfaces combining the chromatic confocal and triangulation principle,” in IEEE Int. Instrum. and Meas. Technol. Conf. Proc., 1072–1077 (2012). https://doi.org/10.1109/I2MTC.2012.6229116
16. J. Goodman, Introduction to Fourier Optics, 3rd ed., McGraw-Hill (2016).

Biography

Robin Hahn is a PhD student at the Institut für Technische Optik at the University of Stuttgart. He received his master’s degree in 2016 from the University of Stuttgart in photonic engineering. His main research interests are in the field of optical 3D metrology, in particular interferometry and confocal microscopy. He is the former vice president of the SPIE Student Chapter of the University of Stuttgart.

Tobias Haist studied physics and received his PhD in engineering from the University of Stuttgart. Currently, he is leading the group 3D Surface Metrology at the Institut für Technische Optik, where he is working on new applications for spatial light modulators and 3-D measurement systems. His main research interests include optical and digital image processing, computer generated holography, and optical measurement systems.

Kristina Michel: Biography is not available.

Wolfgang Osten is a retired full professor at the University of Stuttgart and former director of the Institut für Technische Optik. His research work is focused on new concepts for machine vision that combine modern principles of optical metrology, sensor technology, and digital image processing. He is a fellow of OSA, SPIE, EOS, and SEM and a senior member of IEEE. He is a recipient of the Gabor Award of SPIE, the Kingslake Medal, the Vikram Award, and the Leith Medal of OSA.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Robin Hahn, Tobias Haist, Kristina Michel, and Wolfgang Osten "Diffraction-based hyperspectral snapshot imager," Optical Engineering 61(1), 015106 (27 January 2022). https://doi.org/10.1117/1.OE.61.1.015106
Received: 16 July 2021; Accepted: 5 January 2022; Published: 27 January 2022
KEYWORDS
Hyperspectral imaging

Imaging systems

Spectral resolution

Optical filters

Image sensors

Sensors

Image filtering
