Recently, market demand for miniaturized 3D optical imaging modules has increased remarkably, as smart devices, wearable devices, and multifunctional imaging devices have attracted the interest of both customers and developers. However, despite the high technological maturity of the 3D surface imaging techniques mentioned above, miniaturized optical key elements for 3D surface imaging must still be packaged into compact imaging systems, such as multifunctional smartphone cameras and 3D endoscopic catheters. This section introduces previous work on optical MEMS devices for compact 3D surface imaging systems based on stereoscopic vision, structured light, and ToF. Recent research on MEMS-enabled 3D stereoscopic imaging systems has focused on using a single image sensor rather than two identical cameras to reduce the overall size of the optical system [27,28,29,30]. A hexagonal array of voltage-driven liquid crystal (LC) lenses enabled a focus-tunable 3D endoscopic system with a single image sensor [27]. Seven hole-patterned ITO electrodes on the upper layer produced a smooth, parabolic-like gradient electric field distribution that shaped the phase profile of each LC lens. The hexagonal LC lens array captured object images from different viewpoints on a single image sensor, and these images were used to reconstruct 3D images (Fig. 2a). The same group also reported a 2D/3D tunable endoscopic imaging system using a dual-layer-electrode LC lens [28]. The multi-functional LC lens (MFLC-lens) based endoscope was 2D/3D switchable and focus-tunable in both modes by controlling the applied voltage (Fig. 2b). Another single-imager stereoscopic camera used a parallel-plate-rotating MEMS device that shifts the beam path through a transparent parallel plate [29].
An anti-reflective optical plate was placed directly on a microfabricated electrothermal bimorph actuator, which rotated the plate by up to 37° in front of an endoscopic camera module to generate binocular disparities between subsequent images in temporal division; the resulting disparity was comparable to that of binocular cameras with a 100 μm baseline (Fig. 2c). The authors also demonstrated an anaglyph image and calculated disparity maps for 3D imaging by capturing two optical images at the two plate positions. Another MEMS-enabled stereoscopic imaging system was a microprism array (MPA) based stereo endoscopic camera [30]. The MPA, with a 24° apex angle and a symmetric arrangement, was microfabricated using conventional photolithography, thermal reflow, and polydimethylsiloxane (PDMS) replication; placed in front of a single camera module, it splits light rays from an object into two stereo images (Fig. 2d). Object distances were estimated from the two stereo images refracted by the symmetric MPA and compared with the actual distances.
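All of the single-sensor systems above ultimately recover depth by triangulation from binocular disparity, whether the two viewpoints come from an LC lens array, a rotating plate, or a microprism pair. A minimal sketch of that relation (with an assumed pinhole model; `f_px`, `baseline_mm`, and `disparity_px` are illustrative parameters, not values from the cited papers):

```python
# Hypothetical stereo triangulation sketch (pinhole camera assumption):
# depth Z = f * b / d, where f is the focal length in pixels, b the
# baseline between the two viewpoints, and d the pixel disparity.
def depth_from_disparity(f_px, baseline_mm, disparity_px):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_mm / disparity_px  # depth in mm

# Example: f = 500 px, 0.1 mm (100 um) baseline, 2 px disparity
depth_mm = depth_from_disparity(500, 0.1, 2)
```

This also shows why a tiny baseline such as 100 μm demands sub-pixel disparity estimation: the measurable disparity shrinks linearly with the baseline.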

Fig. 2 Single image sensor based optical systems for 3D stereoscopic imaging; a hexagonal LC lens arrays for 3D endoscopy and 3D reconstruction result [27]. b Dual layer electrode LC lens arrays for 2D/3D tunable endoscopy and their 2D/3D mode imaging results [28]. c Electrothermal MEMS parallel plate rotation device and anaglyph image, calculated disparity map of the slanted object with textures [29]. d Microprism arrays based stereo endoscopic camera and stereoscopic imaging result [30]

The structured light method with the digital micromirror device (DMD), which selectively reflects incoming light rays to generate structured light patterns, has enabled various high-speed 3D imaging studies. However, the overall DMD system is too large to be assembled into miniaturized optical devices, so recent research on structured light generation for 3D surface imaging has turned to optical MEMS devices for compact configurations. Previous work on structured-light 3D surface imaging with optical MEMS devices divides mainly into actuated reflective MEMS mirrors [31,32,33,34] and diffraction generated by laser transmission through grating micro-/nanostructures [35,36,37,38]. A liquid-immersed MEMS mirror was demonstrated to enlarge the scanning FOV for 3D surface imaging from 90° to 150° via the "Snell's window" effect (Fig. 3a) [31]. The fabricated 1D scanning MEMS mirror generated a structured light pattern in combination with a cylindrical lens that converted the laser spot into a line stripe. The authors reconstructed depth maps by illuminating structured light from the designed projector toward objects positioned at 64° to 128°. The projector module could only capture stationary scenes, however, because high-speed operation of the liquid-immersed MEMS actuator caused heat transfer and turbulence in the liquid. In addition, a line array projector combining a single-axis torsional MEMS mirror with a diffractive microstructure was demonstrated (Fig. 3b) [32, 33]. The deformation of the projected line array pattern, generated by scanning diffractive dot array patterns at a 25-kHz frequency, was captured by a CMOS camera and used to estimate the depth profile of the object, which agreed with the geometrical size of the target.
Besides, a variable structured illumination projector using a laser-modulated 2D Lissajous scanning MEMS mirror was reported (Fig. 3c) [34]. The density of the projected structured light pattern was controlled by modulating the laser beam at the least common multiple of the scanning frequencies, while the MEMS mirror was scanned at frequencies whose greatest common divisor (GCD) was greater than 1. Variable structured illumination was achieved by changing the GCD of the scanning frequencies and the phase of the operating signals.
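The frequency relation described above can be sketched as follows. This is an illustrative parametrization, not the authors' code: with biaxial scanning at `fx` and `fy`, the Lissajous trajectory repeats with period 1/gcd(fx, fy), so pulsing the laser at fixed phases within that period places a stable grid of illuminated spots along the trajectory.

```python
# Sketch of laser-pulsed Lissajous structured illumination (assumed
# sinusoidal scan model x = sin(2*pi*fx*t), y = sin(2*pi*fy*t)).
from math import gcd, sin, pi

def lissajous_spots(fx_hz, fy_hz, pulses_per_frame):
    # The full pattern repeats once every 1/gcd(fx, fy) seconds.
    frame_period = 1.0 / gcd(fx_hz, fy_hz)
    spots = []
    for k in range(pulses_per_frame):
        t = k * frame_period / pulses_per_frame  # uniform pulse timing
        spots.append((sin(2 * pi * fx_hz * t), sin(2 * pi * fy_hz * t)))
    return spots
```

A larger GCD shortens the repeat period (higher frame rate) at the cost of a sparser trajectory, which is the density trade-off the projector exploits.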

Fig. 3 Structured light pattern generation system by scanning the MEMS mirror for 3D surface imaging; a wide-angle structured light generation with 1D MEMS mirror immersed in liquid and its 3D imaging results with the pattern generation FOV over 90° [31]. b Line array projector consisted of a 1D scanning MEMS mirror and a diffractive microstructure and the estimation of the depth profile of the object by calculating the line deformation [32, 33]. c Variable structured illumination using Lissajous scanning MEMS mirror and optical patterns from the projector module with different GCD and phase [34]

Other studies have used transmissive diffraction gratings for structured light pattern generation because of their compact optical configurations, which require no MEMS mirror or driving circuitry. A binocular 3D imaging system combined a conventional stereoscopic camera with a 64 × 64 Dammann grating for laser spot array generation [35, 36]. A Dammann array projector with a simple configuration, consisting of a laser diode (LD), collimating lens, Dammann grating, and objective lens, was placed between the binocular cameras to provide laser spot arrays for stereo matching of the two cameras (Fig. 4a). The overall system measured less than 14 cm and weighed less than 170 g. Another structured light projector generated dot array patterns by combining a designed transmission diffractive optical element (DOE) with two types of light sources: an edge-emitting laser (EEL) and a patterned vertical-cavity surface-emitting laser (VCSEL) array (Fig. 4b) [37]. The DOE phase distribution was designed with the Gerchberg–Saxton algorithm, and the element was fabricated by e-beam lithography and nano-imprint lithography. Placed in front of the collimated EEL or the patterned VCSEL array, the fabricated DOE produced irregular random or regular structured light patterns, respectively. Another structured light projector using a multifunctional binary DOE generated line pattern arrays with high contrast and uniformity [38]. Multiple-stripe patterns were generated with high diffraction efficiency by designing a binary surface relief that combines the functions of a diffractive lens, a Gaussian-to-tophat beam shaper, and a Dammann grating (Fig. 4c). The multifunctional DOE, fabricated by e-beam lithography, showed diffraction efficiencies up to 88% with 20° fanout angles.
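The Gerchberg–Saxton algorithm mentioned above is an iterative phase-retrieval method: it alternates between the DOE plane and the far field, keeping the known amplitude in each plane and retaining only the phase from the Fourier transform. A minimal sketch, assuming a unit-amplitude illumination and a scalar far-field model (the actual design in [37] will differ in constraints and scale):

```python
# Minimal Gerchberg-Saxton iteration for a DOE phase map (illustrative).
import numpy as np

def gerchberg_saxton(target_amp, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amp.shape)  # random start
    for _ in range(n_iter):
        far = np.fft.fft2(np.exp(1j * phase))            # DOE plane -> far field
        far = target_amp * np.exp(1j * np.angle(far))    # impose target amplitude
        near = np.fft.ifft2(far)                         # far field -> DOE plane
        phase = np.angle(near)                           # keep phase only
    return phase                                         # DOE phase profile
```

The returned continuous phase map would then be quantized (e.g. to two levels for a binary DOE) before fabrication, which is where diffraction efficiency is traded away.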

Fig. 4 Structured light system by diffraction generation from laser transmission through grating structures; a binocular 3D imaging system using a structured light projector with a Dammann grating and captured diffraction patterns by a designed Dammann grating (inset) [35, 36]. b Structured light projector with a DOE designed by the Gerchberg–Saxton algorithm and patterned VCSEL arrays. Their projected dot array pattern is shown at the bottom line with fabricated DOE (inset) [37]. c Multifunctional binary DOE in combination with diffractive lens, Gaussian-to-tophat beam shaper, and Dammann grating. Their projected tophat line array pattern is shown at the bottom line with fabricated DOE (inset) [38]

MEMS fabrication techniques have also enabled miniaturized, low-cost ToF-based 3D imaging systems [39,40,41]. A LIDAR system with an optical 256 × 64-pixel ToF sensor and a MEMS laser scanning device was introduced [39]. Pulsed signals emitted from three LDs traveled through collimating lenses and were reflected by a two-axis MEMS scanner toward the target scene, with the FOV divided into three scanning regions (Fig. 5a). Pulsed light reflected from the target objects was then received by a custom single-photon CMOS image sensor with 256 × 64 pixels to calculate the depth profile. The authors precisely measured distances up to 20 m with a maximum error of 13.5 cm. Another MEMS-enabled ToF approach used a micromachined electro-absorptive optical modulator [18, 40, 41]. The modulator was designed as a multi-layer stack of diffractive mirrors and electro-absorptive layers to maximize the magnitude of optical modulation. The fabricated device modulates the IR image reflected from the target object to extract the phase delay of the traveled IR light. The transmittance difference obtained by applying voltages to the device was 51.8%, large enough to provide sufficient IR intensity and a good signal-to-noise ratio. After characterization, the optical modulator was placed between the beam splitter and the CMOS image sensors to identify the phase delay of the incoming IR light at each pixel for depth calculation and RGB image matching (Fig. 5b).
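The phase-delay extraction underlying the modulator-based ToF camera can be illustrated with the standard indirect (phase-shift) ToF relation. This is a generic sketch assuming a common four-tap demodulation scheme, not the specific readout of [40, 41]: the phase of the returning modulated light maps linearly to distance within one ambiguity range.

```python
# Sketch of indirect ToF depth recovery from four phase-shifted samples
# (assumed four-tap scheme at 0, 90, 180, 270 degrees; illustrative only).
from math import atan2, pi

C = 299_792_458.0  # speed of light, m/s

def tof_depth(q1, q2, q3, q4, f_mod_hz):
    phase = atan2(q3 - q4, q1 - q2) % (2 * pi)  # wrap phase to [0, 2*pi)
    # Round trip: distance = c * phase / (4 * pi * f_mod)
    return C * phase / (4 * pi * f_mod_hz)
```

The ambiguity range is c / (2 f_mod), e.g. about 7.5 m at a 20 MHz modulation frequency, which is why indirect ToF systems trade modulation frequency against depth precision.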