Open Access
21 October 2021 Flying a helicopter with the HoloLens as head-mounted display
Abstract

We describe the integration and flight testing of the Microsoft HoloLens 2 as a head-mounted display (HMD) in DLR's research helicopter. In previous work, the HoloLens was integrated into a helicopter simulator. When migrating the HoloLens to a real helicopter, the main challenge was head tracking, because the device is not designed to operate on moving vehicles. Therefore, the internal head tracking is operated in a limited rotation-only mode, and the resulting drift errors are compensated for with an external tracker, several of which were tested in advance. The fusion is done with a Kalman filter that contains a non-linear weighting. Internal tracking errors of the HoloLens caused by vehicle accelerations are mitigated with a system identification approach. For calibration, the virtual world is manually aligned using the helicopter's noseboom. The external head tracker (EHT) is largely automatically calibrated using an optimization approach and therefore works for all trackers regardless of their mounting positions on vehicle and head. Most of the pretests were carried out in a car, which indicates the flexibility in terms of vehicle type. The flight tests have shown that the overall quality of this HMD solution is very good. The conformal holograms are almost jitter-free, there is no perceptible latency, and errors of lower frequencies are identical to the performance that the EHT can provide, which in combination greatly improves immersion. Profiting from almost all features of the HoloLens 2 is a major advantage, especially for rapid research and development.

1.

Introduction

Optical see-through head-mounted displays (HMDs) can provide information in a way that no other display can.1 This unique capability makes it possible to create artificial environments,2 visualize sensor data and sensor images,3 and reduce the size, weight, and power of avionic systems.4 It has completely changed avionics display applications in military airborne vehicles,5–8 and this is likely to happen in large parts of civil aviation as well. If correctly designed, HMDs perfectly align with the avionics goal of increasing situational awareness9 and preventing important cues from being missed due to informational overload. Aircraft simulators and training7,10–13 can also be enhanced with HMDs. Especially in combination with a head tracker, world-fixed symbols can significantly improve the performance of pilots.14–19

Various HMD systems have been developed over the past few decades. These systems must meet many safety criteria to be used in helicopters, which requires a high development and certification effort. Due to the high costs, such systems are currently mainly in use by military operators. However, their high adaptability offers a much broader field of application than military use. In particular, use in commercial offshore helicopters during oversea operations can increase flight safety and expand the environmental conditions in which helicopter operations are possible. During routine offshore operations, helicopter pilots are often faced with harsh environmental conditions such as sea fog, precipitation, and turbulence. Flights in such degraded visual environments (DVE) can cause a degradation in handling qualities as well as a significant increase in workload. These factors can lead to spatial disorientation, a loss of situational awareness, and in the worst case to controlled flight into terrain. It has been shown in many previous works that HMDs can evidently reduce pilots' workload and increase their situational awareness.20–26 Especially in environments with few outside visual references, HMDs provide additional visual cues.

To apply these safety-relevant features to commercial operation, HMD systems need to be affordable for non-military helicopter operators, and they need to be retrofittable in typical offshore helicopters. Whereas military HMDs were mainly developed for a specific aircraft type, modern commercial off-the-shelf (COTS) augmented reality (AR) systems could be used in a wider range of vehicle types. In combination with suitable support systems, such as head tracking and vehicle state data sensor units, they could be used as low-cost HMDs. The latest AR glasses have multiple features integrated, for example, on-board computation units, various connection interfaces, and reliable power supplies. This is beneficial for integration in an aircraft environment, where each modification creates much effort.

Although COTS AR glasses have the advantage of a supplied software environment, their functionality is limited to designated use cases. In the case of the Microsoft HoloLens 2, the provided operating system and software have been developed for use in static remote-work environments. To use the system in a three-dimensional, vibrating, dynamic environment, an advanced understanding of both the environment and the AR glasses is necessary. This knowledge is withheld by the manufacturer. In addition, support systems are required to provide data on the current position and orientation of the AR glasses in the world.

Although the first vehicles in the automotive sector are equipped with AR head-up displays as standard, there is also great industrial interest in research on the application of future AR systems of this kind.27–32 This includes rapid development and testing of such systems in moving vehicles in order to provide a basis for larger decisions. In this way, the advantages compared with other display types or different symbologies can be examined.33–35 In addition, there is the forecast that AR glasses will replace smartphones, according to which smartphone manufacturers align their corporate strategy.36 Then practically everyone would always have AR glasses like the HoloLens with them. This means that HMD assistance systems could be used very widely in vehicles such as cars, bicycles, ships,37 planes, helicopters, or urban air taxis.

After the HoloLens proved to be an excellent basis for HMD research in the simulator, our questions regarding an integration into a real helicopter were whether it is possible to use such COTS AR glasses as an HMD, what quality can be achieved, and what the possibilities and limitations are. These questions are answered here, and in addition, the validated methods of that integration are provided.

Note that the term hologram is used here to describe the artificial light objects that the HoloLens user ultimately perceives.

2.

Previous Work

The Institute of Flight Systems at the German Aerospace Center (DLR) is researching the use of HMDs as part of the projects Helicopter Flight Safety in Maritime Environments (HELMA) and Helicopter Deck Landing Assistance (HEDELA). Both projects aim to increase flight safety and operational availability in offshore environments through the use and evaluation of visual and guidance assistance systems. Whereas the HELMA project focuses on operations in offshore wind farms, the HEDELA project focuses on helicopter ship deck landing. The projects are carried out in cooperation with the Flight Service of the German Federal Police, which operates in a broad field of offshore activities such as rescue, reconnaissance, and border patrol.

After extensive investigation and testing, the Microsoft HoloLens was selected as the basis for further development in the projects. As a first step, the HoloLens was integrated into DLR's Air Vehicle Simulator (AVES). The AVES is a versatile research flight simulation facility providing a six-degree-of-freedom hexapod motion system, which enables the use of multiple cockpit layouts through a roll-on/roll-off system.38 The Airbus EC135 cockpit was used in the development work with the HoloLens. This cockpit has the same dimensions and similar instrumentation as DLR's research helicopter Active Control Technology/Flying Helicopter Simulation (ACT/FHS), which is described in Sec. 3.

The simulator uses the real-time helicopter flight model HeliWorX.39,40 All simulator applications are based on the in-house developed real-time framework 2Simulate.41,42 The highly customizable, self-supported, distributed simulator architecture enables uncomplicated integration of new systems into the simulator.

The HoloLens is connected to the simulator network via Wi-Fi and a firewall in order to receive state and other data from the helicopter. Two steps were needed to align the holograms with the simulator's world. First, the virtual 3D world of all holograms was projected onto a sphere, analogous to the dome of the simulator. Second, this virtual dome of the holograms had to be placed and aligned exactly on the dome of the simulator in a calibration process. To remove rainbow effects, jittering, and further artifacts during head movements, various filters were integrated to stabilize the holograms. A symbology concept and a LIDAR visualization were developed for the HMD.11,43 The basic symbology is shown in Figs. 1 and 2 through the HoloLens in the AVES.

Fig. 1

Basic symbology.


Fig. 2

Photo through the HoloLens in the AVES.


The HMD described above was tested and evaluated by 10 pilots during two simulator campaigns.21,44 The piloted simulator campaigns were conducted in the Alpha Ventus wind farm as part of the maritime environment in the AVES.45 As the results from the piloted simulator campaigns in November 2018 and December 2019 suggest, the goals of the HELMA and HEDELA projects can be achieved with an HMD. Analyses of subjective and objective data from the studies show that workload can be reduced and situational awareness increased using an HMD in offshore environments. Pilots commented that they felt an increase in their perceived safety and that such a system is a comfort benefit in good visual environments and a safety benefit in DVE. Most of the advantages were determined in the subjective range. The objective analysis of flight data recordings shows no significant differences in handling behavior or mission performance measures such as total mission time, average speed, or variation of flight paths.

3.

System Setup

The generalized setup of such a system is shown in Fig. 3. A user wearing the HoloLens is on the vehicle. In general, external trackers consist of two parts: a target that is tracked and a base that locates the target. The target is attached to the HoloLens and the base to the vehicle. A global navigation satellite system and inertial navigation system (GNSS/INS) provides state data of the vehicle, which are mainly position, orientation, and speed. To read and process vehicle and tracker data, a computer is usually needed. A computer is also required to route the Wi-Fi and, if desired, to act as a firewall to isolate the Wi-Fi.

Fig. 3

General system setup.


Fig. 4

DLR’s research helicopter ACT/FHS.


Fig. 5

The experimental system in the ACT/FHS.


TrackIR 5 was used as an external head tracker (EHT) in both the helicopter and the car. The infrared LEDs of the tracker were connected to the HoloLens with 3D-printed attachment pieces and supplied with power via the HoloLens USB socket to keep the setup wireless. The car setup is kept simple and uses a laptop to read and process the TrackIR camera. A smartphone is used as GNSS/INS together with an interface application programmed for it. Although this is sufficient for rapid testing and development in the car, there are some issues with this solution implemented on the iPhone 6s that was used in the car experiments. These are mainly incorrect pitch and roll deflections that occur during vehicle accelerations, the low update rate of the global positioning system (GPS) of 1 Hz, and errors in GPS altitude. In the helicopter, these GNSS/INS data are available in better quality through the existing basic instrumentation, a satellite-based augmentation system, and a sophisticated sensor fusion. Usually, there are also dedicated vehicle-specific data, such as the engine limits for helicopters. Just as a car was used for the pretests, the system works for any other vehicle type, making it a flexible general solution.

The HoloLens 2 from Microsoft was used as the HMD for the helicopter. The HoloLens 1 has a smaller field of view (FOV) and less computing power46 but is also fully functional and was used for early pretests with the car. However, it is difficult to wear a headset comfortably over the HoloLens 1 because of its thick frame over the ear, and the headset is required in the helicopter for intercom and radio communication. A Bluetooth keyboard is used as the input device to control the settings in these system tests of the HoloLens 2 HMD. Unity3D is used as the development environment for the HoloLens 2. The developments of this paper are based on the previous work, in which the HoloLens was integrated as an HMD into a simulator, as stated in Sec. 2.

Tests were carried out to check whether the brightness and the maximum display distance of the HoloLens are sufficient. The holograms are difficult to see in strong sunlight and cannot be seen when looking directly into the Sun. Regardless of an HMD, the pilot (or more generally the driver) would often use the Sun visor of the helmet or sunglasses in these cases. Similarly, car tint film with 20% light transmission was applied to the HoloLens visor. Thus, holograms remain clearly visible even in strong sunlight. This film can be used permanently, especially because it is generally very bright when flying during the day. Alternatively, it can be removed in weather conditions with less brightness. In the dark, on the other hand, the hologram light impedes the dark adaptation of the eye, so that the environment, which would be weakly visible without the HoloLens, can in the worst case no longer be recognized. In addition to light-dark adaptation of the eyes, limitations of the maximum hologram distance can also be problematic with AR glasses. That is not the case with the HoloLens: holograms can be placed at infinity without any accommodation problems or other perceptible inconvenience to the user. With the horizon as an example, there is no noticeable difference between the distance of the artificial and the real line.

Recording the view through the glasses from a user's perspective is very important, especially for testing and developing head-tracking solutions. In this way, errors between the virtual and the real world can be investigated and rectified. The HoloLens 2 offers internal video recording of decent quality when its internal tracking is fully available. This recording is synthetic, because the holograms are rendered over the video afterward; it therefore looks different from the light actually generated by the optics. A bigger problem with the internal recording, however, is that it does not work in the HoloLens' inertial measurement unit (IMU)-only tracking mode, as the holograms are then incorrectly positioned with an offset and also jitter a lot. Note that IMU-only tracking is the default mode for these tests on moving vehicles, as explained in Sec. 5. Therefore, filming with an external camera was chosen as the approach. However, filming is not possible unless at least one eye is recognized by the HoloLens. Probably because the visor can be raised, there is a feature whereby the hologram generation of the optics is only active while an eye is recognized or tracked. A simple solution is therefore to record with a smartphone while using the HoloLens normally; smartphones are thin enough to be positioned between the eye and the visor. As an alternative to the human eye, the optics can also be activated with an artificial eye. Another problem is that if the exposure time of the camera is set too short, the holograms are not visible in the video. This usually occurs when filming in direct sunlight.

The final experiments were conducted with DLR's research helicopter ACT/FHS, which is shown in Fig. 4. It is a highly modified Airbus (formerly Eurocopter) EC135 with a unique fly-by-wire/fly-by-light system fitted instead of the standard control signal transmission.47 With an experimental computer (EC) system (Fig. 5) and a safety concept consisting of a safety pilot (SP) and an experimental pilot (EP), it allows researchers to test new sensor systems, software, and control systems. The system design allows the helicopter to be used as an in-flight or airborne simulator: the system dynamics can be manipulated so that the pilot has the impression of flying a different helicopter. The experimental system consists of multiple computers used as data management, recording, or experimental units. For custom applications, a Linux-based system, the EC, or a Microsoft Windows-based system, the experimental co-computer, can be used. Both feature reliable hardware and multiple standard interfaces such as Ethernet and USB. Two displays in front of the EP and the flight test engineer can be used as visual output.

4.

Head Tracking Overview

For using the HoloLens as an HMD in a helicopter, head tracking was the main problem, as the device is not designed to operate on moving vehicles. Head tracking is required to display holograms at the correct place in the world. Its output is the head pose, which consists of the position and rotation of the head in three-dimensional space. In practice, the rotation of the head is particularly important. The position, on the other hand, can often be neglected, because its error is transferred one to one to the world position of the hologram and the range of head movement in the vehicle is small. But the closer the holograms are to the viewer, the more important the position becomes, which is usually the case with holograms that are attached in the vehicle.

An overview of the head tracking with its signal flows, decisions, problems, and solutions is shown in Fig. 6. The upper part, surrounded by a magenta box, covers the internal head tracking (IHT) of the HoloLens. The lower, green part covers the external tracking. Finally, the fusion of internal and external tracking is illustrated on the right, in blue. All individual blocks are discussed in detail in the following sections. The general idea is to use the IMU-based rotation of the internal HoloLens tracking in combination with an external tracker as a reference against the internal drift. The proposed approach to head tracking on a moving platform is designed around the HoloLens but is not tied to any particular vehicle or vehicle type. The frequently used terms internal and external tracking describe whether a component is part of the HoloLens or not.
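As a minimal illustration of this general idea, the following Python sketch corrects a drifting internal heading with a drift-free external reference. It is only a complementary-filter analogue of the principle; the actual fusion uses a Kalman filter with non-linear weighting, and the function name and gain value here are illustrative assumptions.

```python
def fuse_heading(internal_deg, external_deg, drift_est_deg, gain=0.02):
    """One update step of a minimal drift-correction scheme (illustrative only;
    the paper's fusion is a Kalman filter with non-linear weighting).

    internal_deg: jitter-free but drifting heading from the internal IMU tracking
    external_deg: drift-free but noisy/latent heading from the external tracker
    drift_est_deg: current estimate of the internal heading drift
    Returns (fused_heading_deg, updated_drift_estimate_deg)."""
    # Innovation: distance of the drift-corrected internal heading from the EHT,
    # wrapped to [-180, 180) so that north crossings do not cause jumps.
    err = (external_deg - (internal_deg - drift_est_deg) + 180.0) % 360.0 - 180.0
    # Slowly pull the drift estimate toward the external reference; the low gain
    # keeps the EHT's high-frequency jitter out of the fused output.
    drift_est_deg -= gain * err
    return (internal_deg - drift_est_deg) % 360.0, drift_est_deg

# Usage: the internal heading drifts ~0.33 deg/s (about 1 deg per 3 s, as in
# Sec. 5), while the external tracker holds the true heading of 90 deg.
drift_est = 0.0
fused = 0.0
for k in range(600):  # 10 s at 60 Hz
    internal = 90.0 + 0.333 * (k / 60.0)  # drifting IMU heading
    fused, drift_est = fuse_heading(internal, 90.0, drift_est)
```

With this low gain, the fused heading tracks the external reference with a small lag while the drift estimate converges toward the accumulated drift.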

Fig. 6

Head tracking overview for the HoloLens in moving vehicles.


To apply external or modified head tracking to the HoloLens, a workaround is used. The camera's pose in Unity3D is always set to the IHT measurement and cannot be changed in position or rotation. Therefore, a parent transform is used for the camera, which is always set to the inverse transformation of the camera. This means that the resulting camera is always at the origin and can then be set to the pose of a custom tracker with another parent.
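The effect of this parenting workaround can be checked numerically with homogeneous transforms. The sketch below is a hypothetical Python/NumPy analogue of the Unity3D hierarchy, not actual Unity code: the inner parent cancels the uncontrollable camera pose, and an outer parent then applies the custom tracker pose.

```python
import numpy as np

def pose(yaw_deg, x, y, z):
    """Homogeneous 4x4 pose: rotation about the vertical axis plus translation
    (a simplified stand-in for a full 6-DOF transform)."""
    a = np.radians(yaw_deg)
    m = np.eye(4)
    m[0, 0], m[0, 1] = np.cos(a), -np.sin(a)
    m[1, 0], m[1, 1] = np.sin(a), np.cos(a)
    m[:3, 3] = (x, y, z)
    return m

C = pose(17.0, 0.2, -0.1, 1.6)       # camera pose forced by internal tracking
inner_parent = np.linalg.inv(C)      # cancels C: inner_parent @ C == identity
T = pose(90.0, 5.0, 0.0, 1.5)        # desired custom tracker pose
world_camera = T @ inner_parent @ C  # effective camera pose seen by the scene
```

The composite `T @ inner_parent @ C` equals `T`, so the scene camera follows the custom tracker regardless of what the internal tracking reports.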

5.

Internal Head Tracking

The HoloLens' internal inside-out head tracking uses four environmental cameras and an IMU. This full IHT does not work in moving vehicles, as it is not designed for them. Tests in moving cars and in the flying helicopter showed frequent jumps and drift in the tracking, and in the worst case total failure, visible as rapid chaotic rotations of all holograms.

By covering the environmental cameras, the HoloLens enters a rotation-only tracking state that is based on the IMU. This feature is actually intended to bridge brief, accidental obscurations but works just as well for permanent covering. In this case, without the visual odometry, the tracking is more robust and can be used in moving vehicles, but there are some drawbacks. In addition to the lack of positional tracking, these are long-term errors in the form of drift. Short-term errors, on the other hand, such as hologram jitter as a major immersion factor, remain barely perceptible. In general, apart from drift errors, the orientation tracking is very good in this IMU-only state.

Also note that this IMU-only tracking is done relative to earth instead of the helicopter. The important advantage here is that high-frequency state errors of the vehicle, which inevitably arise due to latency but also due to measurement errors, do not propagate to the world-fixed holograms. The world-fixed holograms are therefore very immersive, stable, and free of jitter.

The mentioned drift errors of the IMU-only mode accumulate over time through the integration of the rotation rates measured by the gyroscopes. Drift is usually addressed with a reference. For pitch and roll, the gravity direction measured by an accelerometer is commonly used as a reference. A magnetometer is often used for heading, but although the HoloLens has one, the heading still drifts, possibly due to the device's electromagnetic emissions. From the viewer's perspective, the north of the hologram world does not stay at real north but drifts to the left or right. A typical drift speed would be 1 deg per 3 s, but of course this varies. To fix this heading drift problem, an EHT is used, as described in Sec. 6. For rapid testing without external dependencies, the user can also compensate the heading drift by rotating the virtual world with manual inputs.

The drift of pitch and roll, on the other hand, is referenced with gravity. If the x, y, z components of the gravity vector are known in body space, then the pitch and roll angles of this body can be determined trigonometrically. However, the accelerometers output pure gravity only in the stationary case; in practice, body motion is usually superimposed on gravity. The separation is usually based on the lower frequency content of gravity. The methods used by Microsoft for the HoloLens are not published, but based on testing and experience, there is no drift in pitch or roll when moving the head. However, this does not work properly with additional vehicle accelerations. The HoloLens incorrectly interprets the accelerations as components of gravity. As a result, the virtual hologram world is slowly rotated away from its correct orientation. This behavior is referred to herein as "gravity swim."
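The trigonometric step mentioned above can be sketched as follows. The forward-right-down axis convention is an assumption for illustration; the (unpublished) HoloLens convention may differ.

```python
import math

def pitch_roll_from_gravity(gx, gy, gz):
    """Pitch and roll (deg) from a body-frame gravity measurement.
    Assumes a forward-right-down body frame in which a level, stationary
    body measures gravity as (0, 0, +g); this convention is an assumption,
    not taken from the HoloLens."""
    pitch = math.degrees(math.atan2(-gx, math.hypot(gy, gz)))
    roll = math.degrees(math.atan2(gy, gz))
    return pitch, roll

g = 9.81
# Level: gravity points straight down the body z axis -> zero pitch and roll.
level = pitch_roll_from_gravity(0.0, 0.0, g)
# Nose up 30 deg: part of gravity appears on the negative x (forward) axis.
nose_up = pitch_roll_from_gravity(-g * math.sin(math.radians(30.0)), 0.0,
                                  g * math.cos(math.radians(30.0)))
```

Any vehicle acceleration added to `gx`, `gy`, `gz` shifts these angles, which is exactly the gravity swim error described above.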

Recorded examples of gravity swim with data from car accelerations are shown in Fig. 7. The causal sequence is (1) the vehicle accelerates, (2) the HoloLens gravity estimation rotates in pitch (gray line), and (3) the HoloLens head tracking slowly follows its gravity estimation (blue line). From the user's perspective, the artificial horizon drops about 5 deg a few seconds after accelerating from 0 to 40 km/h. To acquire the data, videos were recorded while looking through the HoloLens at the artificial horizon. The time sequences of the head tracking errors (blue lines in Fig. 7) were measured from video frames as the difference between the artificial and the real horizon. The vehicle accelerations (source of the gray lines) are known from the iPhone application representing a GNSS/INS.

Fig. 7

Gravity swim and its system identification with four data sets that were acquired with a car. The car was accelerated from a standstill to 40 km/h and then braked back to a standstill, represented by the gray line.


To fix or at least mitigate this gravity swim error, a system identification was carried out. The basic idea is to reverse the incorrect gravity swim rotation with its identified estimate. Results of that estimation are illustrated in Fig. 7 as red lines. The identified discrete-time transfer function with a sample time of 1/60 s is

Eq. (1)

$$\frac{Y(z)}{U(z)} = \frac{1.1055\times 10^{-5}\, z^{-1}}{1 - 0.997168454\, z^{-1} - 0.999995202\, z^{-2} + 0.9971732907\, z^{-3}}.$$
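Expanded as a difference equation, the identified transfer function reads y[n] = 0.997168454 y[n-1] + 0.999995202 y[n-2] - 0.9971732907 y[n-3] + 1.1055e-5 u[n-1]. A minimal Python sketch of running it over an acceleration record, assuming u is the vehicle acceleration sampled at 60 Hz (input units are an assumption here):

```python
def gravity_swim_estimate(accel):
    """Run the identified 60-Hz transfer function as a difference equation:
    input u is the vehicle acceleration, output y is the estimated gravity-swim
    angle to be reversed. Coefficient signs follow the denominator
    1 - a1*z^-1 - a2*z^-2 + a3*z^-3 as printed in Eq. (1)."""
    b1 = 1.1055e-5
    a1, a2, a3 = 0.997168454, 0.999995202, 0.9971732907
    y = [0.0] * len(accel)
    for n in range(1, len(accel)):
        y[n] = (a1 * y[n - 1]
                + (a2 * y[n - 2] if n >= 2 else 0.0)
                - (a3 * y[n - 3] if n >= 3 else 0.0)
                + b1 * accel[n - 1])
    return y

# Example: a 3-s acceleration pulse of 2 m/s^2, then 2 s of coasting (5 s at 60 Hz).
u = [2.0] * 180 + [0.0] * 120
swim = gravity_swim_estimate(u)  # slowly growing swim-angle estimate
```

The small numerator gain and the slow poles reflect the behavior seen in Fig. 7: the swim response builds up over seconds rather than instantly.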

Although this system identification is one-dimensional, gravity swim is a two-dimensional problem and must be implemented accordingly. Testing this system identification approach in the car showed the expected error mitigation for longitudinal vehicle acceleration but not in turns. For 90-deg car turns, the gravity swim error was only about half of its estimate. Accelerations that change the direction of the vehicle, instead of the magnitude, have a weaker effect on the gravity swim error. The cause is not fully understood at this point. A two-dimensional data recording to improve this system identification is still pending. However, a straightforward scalar reduction of the acceleration components that change the direction of the vehicle already improves the result. Therefore, to separate the inertial acceleration into magnitude and direction components, it is formulated in flight-path space (k):

Eq. (2)

$$\left(\frac{\mathrm{d}\underline{V}_K}{\mathrm{d}t}\right)^{g}_{k} = \left(\frac{\mathrm{d}\underline{V}_K}{\mathrm{d}t}\right)^{k}_{k} + \underline{\Omega}^{k}_{kg} \times \underline{V}^{k}_{K} = \begin{bmatrix} \dot{V}_K \\ 0 \\ 0 \end{bmatrix} + \begin{bmatrix} \dot{\chi}\sin\gamma \\ \dot{\gamma} \\ \dot{\chi}\cos\gamma \end{bmatrix} \times \begin{bmatrix} V_K \\ 0 \\ 0 \end{bmatrix} = \begin{bmatrix} \dot{V}_K \\ \dot{\chi} V_K \cos\gamma \\ -\dot{\gamma} V_K \end{bmatrix},$$
where V_K is the speed, χ is the heading, and γ is the pitch angle of the flight path. Here, χ̇ is scaled down to 40% in the car and to 70% in the helicopter; these values come from testing. The resulting acceleration is then transformed into the north-east-down space so that the gravity swim estimate can be applied to the north and east axes as a counter rotation.
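The scaled decomposition can be sketched as follows. The particular k-to-NED rotation order used here, Rz(χ) after Ry(γ), is an assumption for illustration, as the paper does not spell it out.

```python
import math

def swim_input_ned(vk, vk_dot, chi, chi_dot, gamma, gamma_dot, chi_scale=0.7):
    """Inertial acceleration per Eq. (2) with the turn term scaled down
    (0.4 in the car, 0.7 in the helicopter), rotated into north-east-down.
    The rotation Rz(chi) @ Ry(gamma) is an assumed axis convention.
    Angles in radians, speeds in m/s."""
    # Acceleration in flight-path axes, with the chi_dot component weakened.
    ax = vk_dot
    ay = chi_scale * chi_dot * vk * math.cos(gamma)
    az = -gamma_dot * vk
    # Rotate by gamma about y, then by chi about z (k -> NED).
    cg, sg = math.cos(gamma), math.sin(gamma)
    cx, sx = math.cos(chi), math.sin(chi)
    xp = cg * ax + sg * az   # intermediate after the pitch rotation
    n = cx * xp - sx * ay
    e = sx * xp + cx * ay
    d = -sg * ax + cg * az
    return n, e, d

# Straight, level acceleration heading north: all of it lands on the north axis.
n, e, d = swim_input_ned(vk=15.0, vk_dot=2.0, chi=0.0, chi_dot=0.0,
                         gamma=0.0, gamma_dot=0.0)
```

The north and east components of this vector then drive the gravity swim estimate as a counter rotation, as described above.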

Finally, on the subject of gravity swim, it should be noted that the error is not completely eliminated with the system identification approach presented here but only reduced. In practice, the reduction of head tracking errors in pitch and roll is around 80% for longitudinal vehicle acceleration and worse for turns. The remaining error is dealt with by an EHT as described in Sec. 6.

A different but important HoloLens-internal behavior is the postrendering image warp.48 After rendering a scene for a particular viewpoint, the associated head pose is already outdated due to the required computing time. Therefore, the head pose is measured again after rendering, and the image is warped according to the latest viewpoint change. Additionally, instead of using head poses directly, predictions of them are often used.

The HoloLens' postrendering image warp is one of the main reasons why the holograms are so stable in space without jittering; it thus makes a great contribution to immersion. As a side note, it does not work for head-fixed holograms, which experience jitter caused by this warping. The warping is always based on the internal tracking and is always active; no user interaction is possible. Therefore, it has a huge impact on the structure of head-tracking solutions for the HoloLens on a moving platform. More specifically, it excludes all options that focus on an external tracker while ignoring the internal tracking. Even an ideal EHT without any errors or latency would produce a lot of jitter due to the warping, which, as pointed out, is always based on the internal tracker.

6.

External Head Tracking

The IHT of the HoloLens is incorrect on moving platforms, as explained in Sec. 5. To correct these drift errors, an EHT is used in fusion with the internal tracking. Various external outside-in head-tracking systems were tested for their usability.

The alternative is to use the EHT alone and discard the IHT. This EHT-only mode was implemented and investigated. The main problem with this tracking mode is the postrendering image warp generating a lot of jitter, as explained at the end of Sec. 5. This warp-based jitter is superimposed by measurement jitter and latency-based jitter. Hence, some improvements can be made, such as using a better EHT and minimizing latency. Sources of latency are the measuring pipeline, data processing, Wi-Fi, and the update interval of the HoloLens. For the latter, it is important to operate the HoloLens at 60 frames per second. To fix the jitter of the remaining latency, predictive filters were tested, including a Kalman filter and a long short-term memory (LSTM) network. The LSTM network in particular was very promising because it was able to learn typical head movements. However, these filters were not adopted because they only improve the jitter caused by latency but not the jitter of the postrendering image warp, which was the bigger problem.
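The simplest form of such latency compensation is a constant-rate extrapolation of the newest measurement. The sketch below only illustrates this principle behind the predictive filters that were tested (Kalman filter, LSTM); it is not the authors' implementation, and the sample values are illustrative.

```python
def predict_heading(samples, dt, latency):
    """Constant-rate head-pose prediction: extrapolate the newest heading by
    the rate estimated from the last two samples.

    samples: recent heading measurements in degrees, oldest first
    dt: sample interval in seconds
    latency: pipeline delay in seconds to compensate for."""
    rate = (samples[-1] - samples[-2]) / dt  # deg/s from the last two samples
    return samples[-1] + rate * latency

# Head turning steadily at 30 deg/s, measured at 120 Hz, 25 ms total latency:
meas = [30.0 * k / 120.0 for k in range(10)]  # 0.00, 0.25, 0.50, ... deg
predicted = predict_heading(meas, dt=1.0 / 120.0, latency=0.025)
```

For a steady turn, the prediction lands where the head will actually be when the image is displayed; measurement noise, however, is amplified by the rate estimate, which is why filtered predictors such as a Kalman filter are preferable in practice.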

6.1.

Tracking Technologies

There are mainly two technologies for outside-in EHTs: optical and electromagnetic trackers. Most optical trackers consist of two to four system components: (1) one or multiple cameras, (2) an image processing unit, (3) optional markers to be mounted on the tracking target, and (4) an optional IR light source. These systems use image-processing algorithms for feature detection and tracking. The features can be active targets such as LED arrays or passive marker arrays with reflective materials, which additionally need a light source. In more recent approaches, the features are learned directly from the IR images using computer vision methods such as deep learning on large databases.49,50 Both the LED arrays and the passive markers use IR light with a typical wavelength of 850 nm. Manufacturers of such systems are, for example, Advanced Realtime Tracking (ART), OptiTrack, or NaturalPoint. Some systems, such as Smart Eye, do not need additional markers and use facial features for their tracking algorithms. Another method of optical tracking is used by the HTC Lighthouse tracking system, which uses fast-rotating IR light sources and a device mounted on the tracking target with multiple photosensors to catch the IR rays.

The advantages of optical EHTs are a lightweight and simple system setup, popular use in the VR/AR business, and a reasonable price. The minimum system setup consists of one camera with an IR light source, a reflective marker on the target, and a computer as the image processing unit. Since the technology is widely used in consumer VR products, many open-source applications are available, even for the use of a standard PC webcam. The angular tracking range is limited by the horizontal and vertical FOV of the cameras. The distance can be up to 20 m with active markers. Since passive markers reflect the light coming from an additional IR light source, the tracking distance is much shorter due to the intensity loss over the doubled light path. The accuracy of the tracking solution depends on the hardware and software used, and the minimal tracking resolution depends on the resolution of the photosensors in the cameras. Multiple cameras can create a wider FOV. With active markers and bright LEDs, the tracking distance can be increased. Optical filters, such as IR filters, are used to reduce disturbances from unwanted light sources. Applications using artificial intelligence approaches and predictive pose estimation can improve the tracking solution. Disadvantages of optical trackers are the need for a clear line of sight (LOS) between camera and target and disturbances through unwanted additional light sources. Brightness coming from the Sun can blind the cameras or confuse the tracking algorithm, even if it is indirect light. The tracking algorithm can also be confused by unwanted reflections, for example, on glasses or mirrors. Another disadvantage might be the size of the target markers, which can be a bulky array to enable precise recognition from multiple points of view.

Electromagnetic EHTs consist of three main components. A source unit creates an electromagnetic field, in which a sensor mounted on the tracking target can determine its pose by measuring the direction and intensity of the magnetic field lines. The calculation of the pose is done by an additional processing unit. The advantages of electromagnetic trackers lie in their independence from external light sources and the small size of the target-mounted sensor. Furthermore, no direct LOS between the source and the sensor is necessary, and tracking through non-transparent materials is possible. Disadvantages are the sensitivity of the electromagnetic field and the sensor to ferrite materials and current-carrying wires. These disturb the tracker's electromagnetic field, and the tracking result becomes unreliable. There are approaches, such as the Polhemus fly true technology, to identify and eliminate the disturbances, which however creates additional latency.51

6.2.

Tested External Head Trackers

For the integration into the ACT/FHS, various EHTs with optical and electromagnetic technology were tested. The systems are described briefly, and a comparison and a decision regarding their application in the trials are given. A summary of their properties is listed in Table 1.

Table 1

Comparison of the tested EHTs.

Name | Latency | Update rate | Weight | Advantages (+) / disadvantages (−)
ART SMARTTRACK3 (optical)52 | 5 to 8 ms | 150 to 240 Hz | 1.5 kg | + Out-of-the-box functionality; + SDK with many high-level features; + No camera or room setup calibration necessary; − High weight of the unit
HTC Vive (optical)53 | 47 ms | n.a. | 0.5 kg | + Low cost and easy availability; + Open-source applications; + Many forums for troubleshooting; − Base stations pause tracking when they sense motion
Polhemus VIPER (electromagnetic)51 | 1 to 3 ms | 240 to 960 Hz | 1 kg | + Independent of light disturbances; + Very high update frequency; + Very high precision; − Sensitive to ferrite and current-carrying wires
NaturalPoint TrackIR 5 (optical)54 | 9 ms | 120 Hz | 0.1 kg | + Low cost and easy availability; + Very lightweight and straightforward system; + No camera or room setup calibration necessary; − Small FOV and lower accuracy toward the edges; − Inconveniences in the SDK; − Highly susceptible to interference from sunlight

The SMARTTRACK3, as shown in Fig. 8(a), is a complete tracking unit consisting of two cameras, an 850-nm IR light source, and a processing unit built into a single housing. It can be used with active or passive tracking targets. According to the published data sheet,52 the unit can track targets with an image sensor of 1280×1024 pixels at an update rate of 150 Hz within a vertical FOV of 121 deg, a horizontal FOV of 93 deg, and a distance of 3 m. With reduced sensor resolution, an update rate of 240 Hz is possible. The unit is powered via Power over Ethernet and weighs about 1.5 kg. Software named DTrack and a software development kit (SDK) are provided.

Fig. 8

Tested EHTs with their (1) bases and (2) targets.

OE_60_10_103103_f008.png

The HTC Vive (formerly Lighthouse) setup, shown in Fig. 8(b), consists of (1) two SteamVR base stations and (2) one tracker unit to be mounted on the tracking target. Within the base stations, two quickly rotating units emit IR light, which is captured by 16 photosensors built into the tracker unit. A calibration of the room setup of the base stations is necessary, which requires either Vive VR glasses or a Vive controller. The system requires the gaming software Steam with the additional application SteamVR installed. An SDK named OpenVR is available that enables access to the tracking data from custom applications. The latency has been tested to average 47 ms.53

The Polhemus VIPER, as shown in Fig. 8(c), is an electromagnetic tracker consisting of (1) an electromagnetic field generator unit, (2) one or multiple sensors, and (3) a processing unit. The sensor measures the magnetic flux within the magnetic field created by the field generator unit. From the sensor measurements, the processing unit can calculate the position of the sensor within the known magnetic field. According to the data sheet,51 the static accuracy is 0.38 mm translational and 0.1 deg rotational, with a maximum update rate of 960 Hz and a latency of 1 to 3 ms.

The NaturalPoint TrackIR, as shown in Fig. 8(d), is an optical tracking system consisting of (1) a camera unit with an integrated IR light source and (2) a marker unit, which can be active with three LEDs or passive with reflective tape. According to the specifications,54 the camera can track a single target at a resolution of 640×480 pixels, an update rate of 120 Hz, and a latency of 9 ms within a horizontal and vertical FOV of 51.7 deg. On request, NaturalPoint provides an SDK to access the raw and filtered tracking data in custom applications.

6.3.

Requirements and Selection

The above presented EHTs were tested with the main requirement to be used in the ACT/FHS helicopter. Derived from that are the requirements that the device should be: (a) able to operate in a moving environment with changing environmental (e.g., light) conditions; (b) compatible with the HoloLens hardware and its electromagnetic emissions; (c) airworthiness certifiable; (d) as light and small as possible; and (e) compatible with existing systems in the helicopter.

The HTC Vive did not comply with req. (a). As soon as the base stations sense motion, the rotating IR emitter inside stops and the tracking immediately cancels. While having the overall best, fastest, and most current tracking results, the Polhemus VIPER could not be used with the relatively strong electromagnetic field of the HoloLens and therefore did not comply with req. (b). With the sensor mounted directly on the HoloLens, tracking errors of up to 20 deg resulting from electromagnetic disturbances could be observed. A mapping of the electromagnetic field or mounting the sensor on a boom outside the disturbances was considered but discarded due to the increased effort. The ART SMARTTRACK3 showed the best tracking results of the tested optical trackers. Even with disturbances coming from sunlight, a precise tracking solution could be calculated. Since the cameras must be positioned to observe the pilot's head, a position near the helicopter's instrument panel is unavoidable. It was not possible to find a place for the SMARTTRACK3 where the system neither obstructs the pilot's sight nor conflicts with controls, which is a mandatory requirement for airworthiness certification. Furthermore, structural modifications would have been necessary to mount the 1.5-kg system on the instrument panel. As a result, this system did not comply with req. (c). The TrackIR complies with all requirements listed above. As its tracking result is not as precise as that of the other trackers, it cannot be used as a permanent solution; for this proof-of-concept demonstration, the precision was sufficient. A big advantage of the system is its compact size and lightweight design, which decreased the airworthiness certification effort.

6.4.

Software

To access and prepare the tracking data from various EHTs and send it to the HoloLens, an application named head tracking server (HTS) was developed. The HTS is a cross-platform application currently supporting Windows and Linux, with the goal of being as lightweight as possible to run on a PC within the ACT/FHS experimental system. It is written in C++ and uses an object-oriented multithreading approach with an update rate of up to 1000 Hz. A user interface was created using ncurses (Linux) and the corresponding Windows version pdcurses. The SDKs provided by the tracker manufacturers described above were included and can be called at runtime using classes derived from a generic tracker class. When new data from a tracker are available, preprocessing transforms the pose into the axis directions of the HoloLens interface and then into a defined center. This center is manually placed at the head center position in the vehicle during initialization. A logger running periodically is included to debug the tracker data, as is a function to measure the current update rate of the tracker SDK interface.
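The tracker abstraction and preprocessing described above can be sketched as follows. This is a Python sketch for brevity (the real HTS is written in C++), and all names, the axis map, and the component-wise center correction are illustrative assumptions, not the actual API:

```python
from abc import ABC, abstractmethod

class Tracker(ABC):
    """Generic tracker interface of the HTS (illustrative sketch).
    Concrete wrappers around the vendor SDKs derive from this class."""

    @abstractmethod
    def poll(self):
        """Return (x, y, z, roll, pitch, yaw) in the tracker's native
        axes, or None if no new sample is available."""

class DummyTracker(Tracker):
    """Stand-in for a vendor SDK wrapper (e.g., a TrackIR class)."""
    def poll(self):
        return (1.0, 2.0, 3.0, 0.0, 0.0, 12.0)

def preprocess(sample, axis_map, center):
    """Remap a pose into the HoloLens axis convention, then express it
    relative to the manually set head center. axis_map gives, for each
    output component, the source index and sign; center is subtracted
    component-wise (a simplification: the real preprocessing composes
    full poses)."""
    remapped = [sample[i] * sign for i, sign in axis_map]
    return tuple(r - c for r, c in zip(remapped, center))
```

For example, swapping the y and z axes and removing a 10-deg heading offset of the center: `preprocess(DummyTracker().poll(), [(0, 1), (2, -1), (1, 1), (3, 1), (4, 1), (5, 1)], (0, 0, 0, 0, 0, 10.0))` yields `(1.0, -3.0, 2.0, 0.0, 0.0, 2.0)`.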

6.5.

Sunlight Interference with TrackIR

TrackIR 5 was not designed to operate outdoors, and it does not work at all in sunlight. The problem can be divided into two categories: (a) direct sunlight on the camera and (b) objects in the camera view that are in direct sunlight. Problem (a) was fixed with a 3D-printed glareshield for the TrackIR camera. To fix (b), tests with IR and IR-cut filters were performed. As a result, 24 layers of car tint film with 20% light transmission filter out the sunlight reflections, while the IR radiation of the LEDs is still detected. Only reflections from areas at least as reflective as white remain strong enough to penetrate. It is critical to join the film layers cleanly, without irregularities on the surface; otherwise, measurement noise arises. The shape should follow the surface of the camera; otherwise, distortions due to refraction of light occur in the tint films. Note that this worked as a proof of concept, but there is a large number of external trackers that solve the sunlight problem internally. If the selected tracker has problems with sunlight, fewer layers with lower light transmission should be preferred.

Tests with these adjustments show that sunlight disturbances usually no longer occur, or occur very rarely. If one happens anyway, the head tracking jumps very noticeably from the user's point of view. A similar error occurs when the target is outside the tracker's FOV. A so-called garbage filter has been implemented to identify incorrect tracking data. It is not specific to TrackIR and works the same for other trackers. The filter also uses the internal tracking as a reference to check the validity of the external one. If the signal is classified as false, it is not used by the Kalman filter in the respective time step. Short interruptions are no problem because the external tracker is used as a long-term reference anyway.
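The validity check against the internal tracking can be sketched for the heading axis as follows; the threshold and names are illustrative assumptions, since the paper does not state its exact criterion:

```python
def wrap(angle):
    """Wrap an angle difference into [-180, 180) deg."""
    return (angle + 180.0) % 360.0 - 180.0

def garbage_filter(ext_yaw, int_yaw, drift_est, limit=5.0):
    """Accept an external heading sample only if it stays close to the
    drift-corrected internal tracking. The 5-deg limit is illustrative;
    the paper does not state its threshold."""
    predicted = int_yaw - drift_est  # what the external tracker should read
    return abs(wrap(ext_yaw - predicted)) <= limit
```

A rejected sample is simply skipped by the fusion in that time step, which is unproblematic because the external tracker only serves as a long-term reference.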

As a side note, HoloLens’ depth camera also emits IR radiation, which disturbs TrackIR. Therefore, the camera was covered with tape.

7.

Kalman Fusion

The fusion of internal and external head tracking data is done with a Kalman filter. The internal tracker is free of jitter but drifts. In contrast, the external tracker jitters considerably for various reasons and is delayed by its latency. Therefore, the internal data should be used in the short term and the external data in the long term. The general idea of the fusion is: "primarily use the internal tracker but drift to the external."

Different fusion methods were implemented for testing and comparison: a complementary filter as a straightforward approach and, most importantly, two Kalman filters, one that fuses only the heading and one that fuses all three Euler angles. The heading-only Kalman filter is simple and uses the world-space heading of the external tracker. In pitch and roll, gravity swim and its estimated inversion are present. Apart from the residual error of the gravity swim estimation, this is a very robust tracking mode that was often used as a fallback or for testing. It was also the primary mode for the car pretests because, due to the data acquisition via iPhone, there are large pitch and roll errors (∼10 deg) in the car's attitude when it accelerates. The full Kalman filter follows these errors, so the resulting tracking is wrong and useless. These problems do not occur in the helicopter because of its better navigation systems (GNSS/INS). The full Kalman filter is intended as the final tracking mode in the helicopter. The state vector of the filter contains the Euler angles, the rotation rates, the drift error, and the change rates of the drift error. The internal tracking signal is handled by the measurement matrix as a combination of Euler angles and drift error. For the implementation, the relationship between Euler angle rates and body axis rates must be considered. In addition, the internal and external Euler angles are not given in the same space because the internal center has drifted.
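The heading channel of this fusion can be illustrated with a minimal two-state sketch: the state holds heading and drift error, the internal tracker measures heading plus drift, and the external tracker measures heading. This is a strong simplification of the paper's filter (which also carries rotation rates and drift rates), and all noise values are illustrative:

```python
class HeadingFusion:
    """Two-state Kalman sketch of the heading-only fusion:
    state x = [heading, drift error] (deg), random-walk process model."""

    def __init__(self, q_psi=1e-3, q_drift=1e-4, r_int=1e-4, r_ext=1.0):
        self.x = [0.0, 0.0]
        self.P = [[100.0, 0.0], [0.0, 100.0]]    # state covariance
        self.Q = [[q_psi, 0.0], [0.0, q_drift]]  # process noise
        self.R = [[r_int, 0.0], [0.0, r_ext]]    # internal jitter-free, external noisy
        self.H = [[1.0, 1.0], [1.0, 0.0]]        # z = [psi + drift, psi]

    def update(self, z_int, z_ext):
        # predict: random-walk model, so only the covariance grows
        P = [[self.P[i][j] + self.Q[i][j] for j in range(2)] for i in range(2)]
        H = self.H
        PHt = [[sum(P[i][k] * H[j][k] for k in range(2)) for j in range(2)]
               for i in range(2)]
        S = [[sum(H[i][k] * PHt[k][j] for k in range(2)) + self.R[i][j]
              for j in range(2)] for i in range(2)]
        det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
        Si = [[S[1][1] / det, -S[0][1] / det], [-S[1][0] / det, S[0][0] / det]]
        K = [[sum(PHt[i][k] * Si[k][j] for k in range(2)) for j in range(2)]
             for i in range(2)]
        y = [z_int - (self.x[0] + self.x[1]), z_ext - self.x[0]]  # innovations
        self.x = [self.x[i] + K[i][0] * y[0] + K[i][1] * y[1] for i in range(2)]
        KH = [[sum(K[i][k] * H[k][j] for k in range(2)) for j in range(2)]
              for i in range(2)]
        IKH = [[(1.0 if i == j else 0.0) - KH[i][j] for j in range(2)]
               for i in range(2)]
        self.P = [[sum(IKH[i][k] * P[k][j] for k in range(2)) for j in range(2)]
                  for i in range(2)]
        return self.x[0]
```

With a drifted internal reading and a correct external one, the heading estimate converges toward the external value while the drift state absorbs the offset, matching the idea of "primarily use the internal tracker but drift to the external."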

In the event of signal dropouts of the external tracker, which are identified by the garbage filter, the drift behavior is improved by pseudomeasurements of the drift. This is particularly useful for TrackIR with its small FOV and frequent dropouts. For the heading, it is good to keep the drift rate. For pitch and roll, it is not really drift but gravity swim, so it is better to keep the current drift error and zero its rate. Without these adjustments, the drift behaves somewhat arbitrarily because the filter lacks information about it in this situation.

If the Kalman fusion is implemented this way, the world-fixed holograms move slightly relative to the real world during faster head movements. Although the errors are small, they are still very noticeable and therefore affect the immersion. Apart from that, the Kalman fusion drifts too slowly toward the external tracker during a strong gravity swim. Primarily to fix these errors, a dynamic weighting of the external and internal tracker signals was introduced. With a feedforward neural network, as shown in Fig. 9, the standard deviations of the signals are adjusted based on head movement and gravity swim. For fast head movements, the internal tracker is trusted more, and for fast gravity swim movements, the external tracker is given greater weight.

Fig. 9

Dynamic weighting of the Kalman fusion filter. (a) The exponents to base 10 of the standard deviations of the IHT and (b) accordingly for the EHT. The standard deviations are used for the covariance matrix of the observation noise.

OE_60_10_103103_f009.png
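The effect of this weighting can be sketched as a simple scheduling function. The paper uses a trained feedforward network (Fig. 9); the stand-in below is a hand-tuned mapping whose breakpoints and exponents are purely illustrative:

```python
def observation_stds(head_rate, swim_rate):
    """Hypothetical stand-in for the feedforward network of Fig. 9:
    maps head angular rate (deg/s) and gravity swim rate to the
    standard deviations of the internal and external signals, returned
    as powers of ten for the observation noise covariance."""
    def clamp(v, lo, hi):
        return max(lo, min(hi, v))
    # with strong gravity swim, trust the internal tracker less
    exp_int = -3.0 + clamp(swim_rate / 10.0, 0.0, 2.0)
    # with fast head movements, trust the external tracker less
    exp_ext = -1.0 + clamp(head_rate / 100.0, 0.0, 2.0)
    return 10.0 ** exp_int, 10.0 ** exp_ext
```

The returned standard deviations would be squared onto the diagonal of the observation noise covariance matrix R in each filter step.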

8.

Calibration

What EHTs should ultimately deliver is the pose of the head relative to the vehicle. However, the tracker only measures its target relative to its base or relative to a manually defined center. Therefore, a calibration is required to transform the tracker outputs into the desired spaces.

For TrackIR, the LEDs are the target, and the IR camera represents the base. The calibration is explained here using the example of TrackIR; however, it works for any external tracker, for any mounting position of target or base, and for any set center position. This was particularly useful because of the numerous trackers tested in combination with the various application locations, such as office, car, simulator, or helicopter. TrackIR's center can be set to the current LED pose via the tracker software or the HTS. It is, therefore, possible to manually position this center at a defined point in the vehicle in order to calibrate it manually. However, this is an inaccurate and unnecessary effort, as it can be automated.

The calibration is carried out during initialization while the vehicle is still stationary. Therefore, the default, fully functioning internal tracking of the HoloLens with active environmental cameras can be used at this point. The basic idea of the calibration presented here is to use this full internal tracking as a reference to determine the required transformations for the external tracker. The internal tracking also requires its own calibration to overlay the virtual world exactly on the real world. This is done manually, with existing real objects additionally being displayed as holograms and thus serving as visual references. For the helicopter, a holographic line on the nose boom was used, as shown in Fig. 10. To calibrate the internal tracking, the virtual world is shifted manually in x, y, z, and heading. With the HoloLens, pitch and roll are automatically aligned according to gravity. This calibration of the internal tracking is only carried out in order to align the calibration of the external tracker with it.

Fig. 10

Manual calibration of the full internal tracking in order to align it with the real world using the noseboom as a reference. This is then used as a reference to calibrate the external tracker with it.

OE_60_10_103103_f010.png

The derivation of the calibration for an external tracker is explained below with reference to Fig. 11. TrackIR's system with its IR camera is shown on the left. It outputs the pose of the LEDs relative to its center, represented as transformation matrix D. The transformation matrices used here (D, C, Z, and H) are homogeneous 4D matrices, each containing the position and rotation of its object. The LEDs are attached to the HoloLens, represented by C. If the HoloLens is positioned so that the attached LEDs are in its center (TrackIR output D = identity matrix), the pose of the HoloLens relative to its internal center is defined as Z. This internal tracking center is identical to the head center in the vehicle due to the calibration of the internal tracker described above. The pose of HoloLens' internal tracking is represented by H. In this mathematical description, H and D are given, and C and Z are sought. In addition, the external pose must be identical to the internal pose:

Eq. (3)

Z C^{-1} D C \overset{!}{=} H,

Eq. (4)

D C - C Z^{-1} H \overset{!}{=} 0.

Fig. 11

Derivation of the calibration for EHTs, as a 2D top-down illustration of the 3D problem. Note that external pose and internal pose represent two different measurements of the head pose. They differ due to measurement errors (in D and H) and calibration errors (in C and Z). The calibration minimizes the difference between the two poses.

OE_60_10_103103_f011.png

This equation has no unique solution. Furthermore, H and D can change over time, whereas C and Z are assumed to be constant. As a side note: if C changes because the attachment is altered, or Z changes because the center is set again, then a recalibration is needed. The idea is now to record several samples (index k) of D and H and to solve for C and Z as an optimization problem. The error matrix E = (e_{ijk}) \in \mathbb{R}^{4 \times 4 \times N} of one sample is

Eq. (5)

E_k = D_k C - C Z^{-1} H_k \overset{!}{=} 0,
and the optimization problem is

Eq. (6)

\underset{x \in \mathbb{R}^{12}}{\text{minimize}} \; f(x) = \sum_{k=1}^{N} \sum_{j=1}^{3} \sum_{i=1}^{4} e_{ijk}^2 + p \sum_{k=1}^{N} \sum_{i=1}^{4} e_{i4k}^2.

This is the squared sum of all elements in E, where the positional error entries, given in meters, are scaled down by p = 0.1. With p, it can be set how rotation and position errors are weighted against each other by the optimizer: p determines which angle error in degrees has the same effect on f(x) as a given position error; here, 0.1 deg corresponds to 3.5 cm. This follows from how a rotation matrix changes with small angles; the sum of the absolute entries in the linearized derivative is 2π/180 ≈ 0.0349. N is the number of recorded samples, and x contains the positions and rotations as Euler angles to create the transformation matrices C(x) and Z(x):

Eq. (7)

x = [[x_C, y_C, z_C], [\phi_C, \theta_C, \psi_C], [x_Z, y_Z, z_Z], [\phi_Z, \theta_Z, \psi_Z]].

To solve this optimization problem numerically, the limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) algorithm is used.55
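Under the stated conventions, the cost of Eq. (6) can be written down directly. The sketch below (pure Python; a roll-pitch-yaw rotation convention is assumed, since the paper does not specify one) builds the homogeneous matrices C(x) and Z(x) from the 12 parameters and evaluates f(x), which is then handed to an L-BFGS optimizer:

```python
import math

def rot(phi, theta, psi):
    """3x3 rotation matrix from Euler angles (roll, pitch, yaw)."""
    cf, sf = math.cos(phi), math.sin(phi)
    ct, st = math.cos(theta), math.sin(theta)
    cp, sp = math.cos(psi), math.sin(psi)
    return [[cp * ct, cp * st * sf - sp * cf, cp * st * cf + sp * sf],
            [sp * ct, sp * st * sf + cp * cf, sp * st * cf - cp * sf],
            [-st, ct * sf, ct * cf]]

def hom(pos, eul):
    """Homogeneous 4x4 transform from position and Euler angles."""
    R = rot(*eul)
    return [R[0] + [pos[0]], R[1] + [pos[1]], R[2] + [pos[2]],
            [0.0, 0.0, 0.0, 1.0]]

def mul(A, B):
    """Product of two 4x4 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def inv(T):
    """Inverse of a homogeneous transform: [R^T | -R^T t]."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(R[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [R[0] + [t[0]], R[1] + [t[1]], R[2] + [t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def cost(x, samples, p=0.1):
    """Eq. (6): squared entries of E_k = D_k C - C Z^{-1} H_k, with the
    position column (j = 4) scaled down by p."""
    C = hom(x[0:3], x[3:6])
    Zi = inv(hom(x[6:9], x[9:12]))
    f = 0.0
    for D, H in samples:
        E = [[a - b for a, b in zip(ra, rb)]
             for ra, rb in zip(mul(D, C), mul(mul(C, Zi), H))]
        for i in range(4):
            f += sum(E[i][j] ** 2 for j in range(3))  # rotation columns
            f += p * E[i][3] ** 2                     # position column (meters)
    return f
```

With synthetic samples generated from known C and Z (so that D_k = C Z^{-1} H_k C^{-1}), f vanishes at the true parameters and is positive elsewhere, which is a useful sanity check before plugging the function into an optimizer.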

Note that when recording samples of D and H, there is usually latency in acquiring the external tracker data D. Therefore, the recording is stopped during dynamic head movements and switched on when the head is briefly stationary. The samples should be distributed roughly evenly over the FOV of the external tracker.
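This gating can be sketched as a small helper that only records a sample once the head has been quiet for a short dwell; the rate threshold and dwell length are illustrative assumptions:

```python
from collections import deque

def make_recorder(rate_limit=5.0, dwell=10):
    """Return a function that appends a (D, H) calibration sample to
    `out` only after the head rate (deg/s) has stayed below rate_limit
    for `dwell` consecutive readings. Thresholds are illustrative."""
    recent = deque(maxlen=dwell)

    def maybe_record(head_rate, sample, out):
        recent.append(abs(head_rate) < rate_limit)
        if len(recent) == dwell and all(recent):
            out.append(sample)

    return maybe_record
```

Waiting for a short quiet dwell sidesteps the external tracker's latency, since both poses are then measured on an unmoving head.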

With this calibration, or a slightly expanded variant of it, static measurement errors of the external tracker can also be largely reduced. Examples of such errors are residual camera distortions or distortions that occur, for example, due to the stacked car tint films against the Sun.

9.

Certification

In order to test the described system in a flying helicopter, all equipment and changes to the helicopter needed to be proven airworthy for experimental purposes. The helicopter used, the ACT/FHS, is built and operated solely for experimental purposes and therefore provides an airworthy safety concept with an EP and an SP. This concept allows the use of development software and control command models with a failure probability of 1 (meaning they can fail at any time). When a failure occurs during an experiment, the EP can switch off the experiment via a switch on the controls, or the SP takes over the controls and simultaneously cancels the experiment. For the described experiments, this means that the developed software did not have to follow standards such as the aviation norm DO-178C56 of the Radio Technical Commission for Aeronautics (RTCA). For the installed hardware components such as the HoloLens, EHT, Raspberry Pi, and power converters, various mechanical and electrical developments as well as compliance testing and documentation were necessary. Proofs of compliance with airworthiness standards were necessary mainly for the following mechanical issues: (1) crash-, vibration-, and acceleration-proof mounting and (2) stowage of the HoloLens and keyboard during critical flight phases. Proofs for electrical issues were (3) no influence on helicopter power systems, (4) no influence of radiation (Wi-Fi, Bluetooth, and other electromagnetic emissions) on aviation equipment, (5) safety of lithium-ion batteries, and (6) procedures in case of short circuits and fire.

An example of a proof of compliance is the test for electromagnetic compatibility (EMC). This is necessary to prove that the emissions of the device do not disturb any aircraft equipment. The tests were performed according to RTCA DO-160G57 in DLR's EMC chamber with properly calibrated measurement equipment. The results for the complete modification setup (HoloLens, Raspberry Pi, wiring, and power converter), as shown in Fig. 12, show significant peaks at 2.4 GHz (Bluetooth and Wi-Fi) and 5 GHz (802.11ac Wi-Fi). Those emissions and a small peak around 1.5 GHz coming from the HoloLens exceed the limits for category M equipment defined in RTCA DO-160G, visualized as a red solid line in Fig. 12. As a result of the limit violations, a test in the running helicopter on the ground had to be performed. The test verified that no avionic equipment was affected by the electromagnetic emissions of the HoloLens.

Fig. 12

Results of the EMC measurement of the complete modification setup.

OE_60_10_103103_f012.png

10.

Flight Tests

The flight campaign was intended as a system test with a focus on the proposed head tracking solution in a proof-of-concept demonstration. Figure 13 shows an example of the holograms that the pilot sees. The quality of the head tracking is determined by the error between real and virtual world.

Fig. 13

View through the HoloLens during in-flight system tests.

OE_60_10_103103_f013.png

Some of the holograms are world-fixed, like the horizon circle with pitch ladder, heading tape, obstacle highlights, or a tunnel-in-the-sky. If the real counterpart is also visible, such as the horizon or an obstacle, the errors between real and virtual world can be perceived and measured. These errors define the quality of the head tracking and can be split into positional and rotational errors. As stated previously, position errors are transferred one-to-one to the position of the hologram. The head's freedom of movement in a helicopter or car is relatively small (∼20 cm). In addition, the relative error increases the closer the hologram is positioned to the viewer. The head tracking solution presented here does not utilize positional tracking and therefore always includes the full position error of the head relative to its center. However, since the distances to the holograms are large enough, this error does not matter here. For holograms that are pseudoattached in the vehicle, positional tracking would have to be added. For this purpose, data from the external tracker could be used as a straightforward but qualitatively moderate option.

Rotational errors, on the other hand, are very critical. As an example in Fig. 13, the error between the red and blue line is less than half a degree but still clearly perceptible in the picture, and even more so when using the HoloLens. In addition, human perception is specialized in detecting movements in the world, with rapid movements being even more evident. Therefore, minimal jitter is important for immersion and thus for good head tracking quality.

The general method of these first system tests was to find all possible head tracking errors. This was done by looking directly through the HoloLens or via recordings. Using the horizon line, pitch and roll errors can always be seen if the real horizon is also visible. In addition, some selected wind turbines with their associated obstacle holograms were used. Long straight sections of railroad tracks, canals, or roads were used to check the heading in isolation. The real heading of these objects and the GPS coordinates of the wind turbines were acquired from Google Earth. Head and helicopter movements were also always considered to find or maximize errors, in an escalating order of combinations. Furthermore, various head tracking states, fusion methods, and gravity swim were examined in the flight test program, such as IHT, EHT, complementary filter, full Kalman filter, heading-only Kalman filter, and the gravity swim filter.

In addition to the holograms that assist the pilot, four debug holograms were used to analyze the head tracking in this type of system test. These are shown in Fig. 14. The first and most important shows the external tracker and its error relative to the tracking currently in use (green square and white cross). Since the green square represents only the LOS, additional clock-hand lines indicate the roll error. This error also includes the jitter and latency of the external tracking; the latency becomes visible when the head is moved. The low-frequency portion of this error shows where the fusion will drift. The second debug hologram consists of additional horizon lines that visualize either the horizon of HoloLens' internal tracking (blue line in Fig. 14) or hypothetical horizons of further gravity swim filters to compare which one works best. Third is the nose boom line; it is the only hologram in this project that is attached directly in the vehicle. It supports the assessment of the tracking error but also always includes the full position error of the head. The fourth and final debug hologram is the enlarged pitch ladder with a small step size of 1 deg. All ladder steps can be drawn in a full circle analogous to the horizon. Either way, the tracking error can be quantified with a certain accuracy, similar to a ruler.

Fig. 14

Debug holograms to visualize errors and intermediate steps of the head tracking.

OE_60_10_103103_f014.png

Errors in head tracking can be divided into short-term behavior, such as jitter or latency, and long-term behavior, such as offsets or drift. The jitter and latency of world-fixed holograms are similar to, but slightly worse than, HoloLens' quality in normal operation on the ground, which means there is almost no jitter and no latency. At least, both were so small that they could not be measured with the recording method in the vibrating helicopter. These vibrations, excessive head shaking, and rotational movements of the helicopter also had no noticeable effect on jitter or latency. The long-term behavior was not as good but still decent. The main reasons were the use of TrackIR as external tracker with its small FOV, slight distortions toward the edges due to the stacked car tint films on TrackIR's camera, and other minor errors. However, this can only be confirmed with certainty in future tests when TrackIR is replaced by a better external tracker.

Note that even when head tracking errors occur, the entire virtual world merely misaligns with the real world as a whole; the virtual world in itself remains correct. Head-fixed displays are unaffected, but they are inherently jittered in the HoloLens because of the postrendering image warp, so they were filtered, just like in the simulator.11 Regardless of this, one pilot feedback was that the lower edge of the visor is exactly above the PFD when the head is in a central position, so that the head instead of just the eyes has to be moved.

The EHT-only mode has also been examined and is very useful for testing and debugging. However, due to strong jitter and high latency, its quality is far worse than that of the fusion modes. The FULL-IHT mode was also tested: in the flying helicopter, the covers of the environmental cameras of the HoloLens were removed. The weather was gray and foggy with occasional weak cloud contours. As in the car test, it did not work, and it was even more faulty in that it drifted away quickly around the roll axis, even though the helicopter was in stationary flight. The tests for gravity swim produced no surprises and were very similar to the preliminary tests in the car. Only different maneuvers could be flown, such as larger turns with bank angle. The identified filter worked well for simple speed changes and, just as in the car, was worse in turns. Even if it does not completely fix the error, every mitigation helps to improve the overall result.

In addition to the head tracking tests, some symbology studies were also carried out. This involved a simple test of the visibility of different line colors. As can be seen in Fig. 15, pure blue colors are very poorly visible or not at all because they are very dark. Further tests included approaching and flying through a generic LIDAR point cloud, as shown in Fig. 16. The colored representation is significantly better than the monochrome alternative and has potential when used correctly. Nevertheless, there is a lot of clutter that obscures large parts of the view. Another test was flying various tunnels, as can also be seen in Fig. 16. The tunnels were very immersive, stable, and easy to fly. There was hardly any noticeable difference from the simulator.

Fig. 15

Testing visibility of line colors.

OE_60_10_103103_f015.png

Fig. 16

Tests carried out for (a), (c) visualization of generic LIDAR data and (b), (d) various tunnel representations.

OE_60_10_103103_f016.png

11.

Conclusion and Outlook

As demonstrated on a helicopter, the Microsoft HoloLens 2 can be used on moving vehicles as a head-worn display. Therefore, such a HoloLens-HMD can be used as a basis in rapid development pipelines to test all kinds of new head-worn AR symbology. In addition, there are many more general questions that can only be investigated with such or a similar HMD system.

The proposed head tracking solution uses the IMU-only tracking of the HoloLens and compensates the resulting drift errors with an external tracker. The HoloLens goes into IMU-only tracking when its cameras are covered. With a system identification approach, the error that arises from vehicle accelerations could be reduced by 80% for straight vehicle accelerations and by about 50% in turns. The remaining error is handled by the external tracking and the fusion.

An alternative head tracking structure, which was also implemented and examined in the flight test, is to ignore the internal tracking entirely and use only the external tracker. However, this solution is much poorer in quality because of strong jitter. The jitter mainly comes from the HoloLens' internal postrendering image warp and from latencies.

Various EHTs were examined, tested, and compared. An application was developed to read out all external trackers and to prepare and generalize the data for the HoloLens. TrackIR 5 was selected as the external tracker for the flight tests, mainly because it was easy to integrate into the helicopter. It is a very simple tracker designed for indoor use that does not work in sunlight. As a workaround, a glareshield was 3D-printed for the camera and, in addition, more than 20 layers of car tint film were applied to the camera. With these measures, TrackIR 5 could be used in the Sun; the HoloLens itself also works in sunlight.

The fusion is done with a Kalman filter. The internal IMU is free of jitter but drifts. In contrast, the external signal jitters a lot but does not drift. Therefore, the internal data are used in the short term and the external data in the long term. In order to further minimize the head tracking errors, a dynamic weighting of the observation noise was added. With fast head movements, the internal tracker is trusted more and with strong errors due to vehicle accelerations the external tracker is given more weight. Furthermore, a complementary filter and a heading-only filter have been implemented. Both were very useful in the development process.

The calibration of the external tracker was derived as an optimization problem and is therefore largely automated. The fully functional HoloLens internal tracking serves as a reference while the vehicle is not yet moving. This HoloLens tracking is calibrated manually with the help of the noseboom.

In this way, very immersive world-fixed holograms can be created in color, without jitter or latency, potentially aiding almost every task that occurs with vehicles. Lower-frequency errors, on the other hand, mainly depend on the selected external tracker. Therefore, its accuracy should be high, its FOV should be large enough (vehicle- and task-dependent), and the tracker should work in direct sunlight. Loss of head tracking, as known from working with the HoloLens, never happened with the proposed solution, so it is very reliable in this regard.

A certification was necessary to use the HoloLens as an HMD in flight tests with DLR’s research helicopter ACT/FHS. An overview of the key points of the certification was given, and results of the EMC measurements were shown. In particular, the proof that the HoloLens’ Wi-Fi does not interfere with the rest of the helicopter electronics caused some problems. However, due to the safety concept of the helicopter, it was not necessary to prove that the holograms do not distract the pilot from the flight task. For other use cases, the certification can become time-consuming and problematic.

The flight campaign was carried out successfully. The presented head tracking solution was examined for errors, and it could be demonstrated that the concept works very well. Various debug holograms were developed and used in this system test. In addition, the pilots tested the HMD symbologies in relatively free use.

The system benefits from all existing and potential future features of the HoloLens, such as hand and eye tracking, the depth camera and other sensors, remote rendering, and spatial mapping. On the other hand, some of the head tracking developments are laborious reverse engineering workarounds because there is no full access to the HoloLens.

Now that this head-worn display pipeline is known to work with very good quality, it will be developed further iteratively in order to use it for its intended purpose—researching AR assistance for operators in their real vehicles, with a focus on helicopters.

The mobile setup of this HMD system that was used in the car should be further developed toward more autonomy, so that it can be used flexibly in any vehicle without any integration effort. To replace the smartphone as a vehicle state sensor, better GNSS/INS systems are planned to be examined. Another weak point is the TrackIR as the external tracker, which worked well in this proof-of-concept demonstration; with a better tracker, the overall quality can be improved significantly. Particularly problematic are the small horizontal FOV of about ±45 deg but also measurement errors, which are partly caused by the stacked layers of car tint film. Therefore, future research will also concentrate on the usability and efficiency of lightweight tracking technologies. A replacement could be the external tracker ARTTRACK6/M, which is similar to SMARTTRACK3 but much more compact. A completely different approach is localization based on the depth camera of the HoloLens using the known cockpit; this would make the external tracker unnecessary.

Furthermore, ergonomic and psychological aspects of using COTS AR glasses in flying vehicles will be evaluated in the future. There is a huge variety of possibilities to assist the pilot with AR displays. On top of that, there is a large variety of tasks that can be improved in performance, safety, workload, situational awareness, and comfort. Therefore, there is a lot of room for further research, whereby we are currently mainly oriented toward rescue helicopters and urban air taxis.

Acknowledgments

This work was mainly sponsored by the Program Coordination Defence & Security Research (PK-S) of the German Aerospace Center with the Project HEDELA. We would like to thank the German Federal Police Aviation Service for their support as an associated partner in the project.

References

1. 

C. Spitzer, U. Ferrell and T. Ferrell, Digital Avionics Handbook, CRC Press, Boca Raton, Florida (2017). Google Scholar

2. 

R. Kalawsky, The Science of Virtual Reality and Virtual Environments, Cambridge (1993). Google Scholar

3. 

C. J. Casey, “Helmet-mounted displays on the modern battlefield,” Proc. SPIE, 3689 270 –277 (1999). https://doi.org/10.1117/12.352839 PSISDG 0277-786X Google Scholar

4. 

M. P. Browne, “Head-mounted workstation displays for airborne reconnaissance applications,” Proc. SPIE, 3363 348 –354 (1998). https://doi.org/10.1117/12.321785 PSISDG 0277-786X Google Scholar

5. 

B. D. Foote, “Design guidelines for advanced air-to-air helmet-mounted display systems,” Proc. SPIE, 3362 94 –102 (1998). https://doi.org/10.1117/12.317422 PSISDG 0277-786X Google Scholar

6. 

R. A. Belt, J. Kelley and R. J. Lewandowski, “Evolution of helmet-mounted display requirements and Honeywell HMD/HMS systems,” Proc. SPIE, 3362 373 –384 (1998). https://doi.org/10.1117/12.317451 PSISDG 0277-786X Google Scholar

7. 

M. M. Bayer, C. E. Rash, J. H. Brindle, “Introduction to helmet-mounted displays,” Helmet-Mounted Displays: Sensation, Perception and Cognition Issues, 47 –108 Army Aeromedical Research Laboratory, Fort Rucker, Alabama (2009). Google Scholar

8. 

J. E. Melzer and C. E. Rash, “The potential of an interactive HMD,” Helmet-Mounted Displays: Sensation, Perception and Cognition Issues, 877 –898 Army Aeromedical Research Laboratory, Fort Rucker, Alabama (2009). Google Scholar

9. 

J. Uhlarik et al., “A review of situation awareness literature relevant to pilot surveillance functions,” (2002). https://rosap.ntl.bts.gov/view/dot/40726 Google Scholar

10. 

A. M. Cook, “The helmet-mounted visual system in flight simulation,” (1988). https://repository.exst.jaxa.jp/dspace/handle/a-is/361994 Google Scholar

11. 

C. Walko and N. Peinecke, “Integration and use of an augmented reality display in a maritime helicopter simulator,” Opt. Eng., 59 (4), 043104 (2020). https://doi.org/10.1117/1.OE.59.4.043104 Google Scholar

12. 

C. J. Casey and J. E. Melzer, “Part-task training with a helmet-integrated display simulator system,” Proc. SPIE, 1456 175 –178 (1991). https://doi.org/10.1117/12.45441 PSISDG 0277-786X Google Scholar

13. 

R. Simons and J. E. Melzer, “HMD-based training for the U.S. Army’s AVCATT-A collective aviation training simulator,” Proc. SPIE, 5079 1 –6 (2003). https://doi.org/10.1117/12.487192 PSISDG 0277-786X Google Scholar

14. 

S. P. Rogers, C. N. Asbury and L. A. Haworth, “Evaluation of earth-fixed HMD symbols using the prisms helicopter flight simulator,” Proc. SPIE, 3689 54 –65 (1999). https://doi.org/10.1117/12.352845 PSISDG 0277-786X Google Scholar

15. 

S. P. Rogers, C. N. Asbury and Z. P. Szoboszlay, “Enhanced flight symbology for wide-field-of-view helmet-mounted displays,” Proc. SPIE, 5079 321 –332 (2003). https://doi.org/10.1117/12.487289 PSISDG 0277-786X Google Scholar

16. 

J. C. Jenkins, “Development of helmet-mounted display symbology for use as a primary flight reference,” Proc. SPIE, 5079 333 –345 (2003). https://doi.org/10.1117/12.487423 PSISDG 0277-786X Google Scholar

17. 

J. C. Jenkins, A. J. Thurling and B. D. Brown, “Ownship status helmet-mounted display symbology for off-boresight tactical applications,” Proc. SPIE, 5079 346 –360 (2003). https://doi.org/10.1117/12.487419 PSISDG 0277-786X Google Scholar

18. 

J. C. Jenkins, D. G. Sheesley and F. C. Bivetto, “Helmet-mounted display symbology for enhanced trend and attitude awareness,” Proc. SPIE, 5442 164 –178 (2004). https://doi.org/10.1117/12.544021 PSISDG 0277-786X Google Scholar

19. 

N. Peinecke et al., “Review of conformal displays: more than a highway in the sky,” Opt. Eng., 56 (5), 051406 (2017). https://doi.org/10.1117/1.OE.56.5.051406 Google Scholar

20. 

W. B. Albery, “Multisensory cueing for enhancing orientation information during flight,” Aviat. Space Environ. Med., 78 (5, Suppl.), B186 –B190 (2007). Google Scholar

21. 

M.-J. Maibach, M. Jones and C. Walko, “Using augmented reality to reduce workload in offshore environments,” in Vertical Flight Soc. 76th Annu. Forum and Technol. Disp., (2020). Google Scholar

22. 

F. Viertler and M. Hajek, “Evaluation of visual augmentation methods for rotorcraft pilots in degraded visual environments,” J. Am. Helicopter Soc., 62 (1), 1 –11 (2017). https://doi.org/10.4050/JAHS.62.012005 JHESAK 0002-8711 Google Scholar

23. 

T. Mehling et al., “Visual augmentation for personal air vehicles during flight control system degradation,” in Vertical Flight Soc. 76th Annu. Forum and Technol. Disp., (2020). Google Scholar

24. 

C. Walko and B. Schuchardt, “Increasing helicopter flight safety in maritime operations with a head mounted display,” in 45th Eur. Rotorcraft Forum, (2019). Google Scholar

25. 

J. M. Ernst, L. Ebrecht and B. Korn, “Virtual cockpit instruments—how head-worn displays can enhance the obstacle awareness of helicopter pilots,” IEEE Aerosp. Electron. Syst. Mag., 36 (4), 18 –34 (2021). https://doi.org/10.1109/MAES.2021.3052304 IESMEA 0885-8985 Google Scholar

26. 

A. H. Morice et al., “Ecological design of augmentation improves helicopter ship landing maneuvers: an approach in augmented virtuality,” PLoS One, 16 (8), e0255779 (2021). https://doi.org/10.1371/journal.pone.0255779 POLNCL 1932-6203 Google Scholar

27. 

A. Riegler, A. Riener and C. Holzmann, “A research agenda for mixed reality in automated vehicles,” in 19th Int. Conf. Mob. and Ubiquitous Multimedia, 119 –131 (2020). Google Scholar

28. 

J. L. Gabbard, G. M. Fitch and H. Kim, “Behind the glass: driver challenges and opportunities for AR automotive applications,” Proc. IEEE, 102 (2), 124 –136 (2014). https://doi.org/10.1109/JPROC.2013.2294642 IEEPAD 0018-9219 Google Scholar

29. 

R. Häuslschmid et al., “Augmenting the driver’s view with peripheral information on a windshield display,” in Proc. 20th Int. Conf. Intell. User Interfaces, 311 –321 (2015). Google Scholar

30. 

S. Tachi, M. Inami and Y. Uema, “Augmented reality helps drivers see around blind spots,” (2014). Google Scholar

31. 

J.-W. Lee et al., “Development of lane-level guidance service in vehicle augmented reality system,” in 17th Int. Conf. Adv. Commun. Technol., 263 –266 (2015). Google Scholar

32. 

C. Yoon et al., “Development of augmented in-vehicle navigation system for head-up display,” in Int. Conf. Inf. and Commun. Technol. Convergence, 601 –602 (2014). https://doi.org/10.1109/ICTC.2014.6983221 Google Scholar

33. 

K. Shelton et al., “Flight test of a head-worn display as an equivalent-HUD for terminal operations,” Proc. SPIE, 9470 94700X (2015). https://doi.org/10.1117/12.2177059 PSISDG 0277-786X Google Scholar

34. 

J. T. J. Arthur III et al., “Simulation test of a head-worn display with ambient vision display for unusual attitude recovery,” Proc. SPIE, 10197 101970B (2017). https://doi.org/10.1117/12.2262705 PSISDG 0277-786X Google Scholar

35. 

N. de Oliveira Faria, “Evaluating automotive augmented reality head-up display effects on driver performance and distraction,” in IEEE Conf. Virtual Reality and 3D User Interfaces Abstracts and Workshops, 553 –554 (2020). https://doi.org/10.1109/VRW50115.2020.00128 Google Scholar

36. 

W. Ma, A. Heath and N. Wingfield, “Apple eyes 2022 release for AR headset, 2023 for glasses,” (2019). Google Scholar

37. 

E. Gernez et al., “A review of augmented reality applications for ship bridges,” (2020). Google Scholar

38. 

H. Duda et al., “Design of the DLR AVES research flight simulator,” in Proc. AIAA Modeling and Simul. Technol. Conf., (2013). Google Scholar

39. 

M. Hamers and W. von Grünhagen, “Nonlinear helicopter model validation applied to realtime simulations,” in Am. Helicopter Soc. 53rd Annu. Forum, (1997). Google Scholar

40. 

A. Strbac et al., “Analysis of rotorcraft wind turbine wake encounters using piloted simulation,” in 45th Eur. Rotorcraft Forum, (2019). Google Scholar

41. 

T. Gerlach, “Visualisation of the brownout phenomenon, integration and test on a helicopter flight simulator,” in Royal Aeronaut. Soc. (RAeS) Flight Simulation Conf., (2009). Google Scholar

42. 

J. Gotschlich, T. Gerlach and U. Durak, “2Simulate: a distributed real-time simulation framework,” in ASIM STS/GMMS Workshop, (2014). Google Scholar

43. 

C. Walko, “Integration of augmented-reality-glasses into a helicopter simulator with front projection,” in Deutscher Luft- und Raumfahrtkongress, (2018). Google Scholar

44. 

C. Walko and B. Schuchardt, “Increasing helicopter flight safety in maritime operations with a head-mounted display,” CEAS Aeronaut. J., 12 (1), 29 –41 (2021). https://doi.org/10.1007/s13272-020-00474-7 Google Scholar

45. 

M.-J. Maibach, M. Jones and A. Strbac, “Development of a simulation environment for maritime rotorcraft research applications,” in Deutscher Luft-und Raumfahrtkongress, (2020). Google Scholar

46. 

B. C. Kress and W. J. Cummings, “Optical architecture of HoloLens mixed reality headset,” Proc. SPIE, 10335 103350K (2017). https://doi.org/10.1117/12.2270017 PSISDG 0277-786X Google Scholar

47. 

J. Kaletka, H. Kurscheid and U. Butter, “FHS, the new research helicopter: ready for service,” Aerosp. Sci. Technol., 9 456 –467 (2005). https://doi.org/10.1016/j.ast.2005.02.003 Google Scholar

48. 

W. R. Mark, L. McMillan and G. Bishop, “Post-rendering 3D warping,” in Proc. Symp. Interactive 3D Graphics, 7 (1997). Google Scholar

49. 

A. Schwarz et al., “Drive ahead-a large-scale driver head pose dataset,” in Proc. IEEE Conf. Comput. Vision and Pattern Recognit. Workshops, 1 –10 (2017). https://doi.org/10.1109/CVPRW.2017.155 Google Scholar

50. 

A. Firintepe et al., “The more, the merrier? A study on in-car IR-based head pose estimation,” in IEEE Intell. Veh. Symp., 1060 –1065 (2020). https://doi.org/10.1109/IV47402.2020.9304545 Google Scholar

51. 

Polhemus, “Viper data sheet,” (2020). Google Scholar

52. 

Advanced Realtime Tracking, “SMARTTRACK3 data sheet,” (2020). Google Scholar

53. 

Y. Yang et al., “An improved method of pose estimation for lighthouse base station extension,” Sensors, 17 (10), 2411 (2017). https://doi.org/10.3390/s17102411 SNSRES 0746-9462 Google Scholar

54. 

NaturalPoint, “TrackIR 5—in depth,” (2021). Google Scholar

55. 

S. Wright and J. Nocedal, Numerical Optimization, pp. 67–68, Springer, New York (1999). Google Scholar

56. 

RTCA, “Software considerations in airborne systems and equipment certification,” DO-178C (2011). Google Scholar

57. 

RTCA, “Environmental conditions and test procedures for airborne equipment,” DO-160 (2014). Google Scholar

Biography

Christian Walko completed his BS and MS degrees in aerospace engineering at the Technical University of Berlin. He has worked as a research engineer at German Aerospace Center since 2016. His research interests include head-mounted displays, augmented reality, human–machine interfaces, active inceptors, flight control systems, artificial intelligence, neural networks, machine learning, and image processing.

Malte-Jörn Maibach received his master’s degree in aerospace engineering from Braunschweig University of Technology in 2016. He works as a research scientist and project manager at the German Aerospace Center (DLR e.V.). His research interests include visual assistance systems for helicopters, head and eye tracking technologies, and maritime flight simulation fidelity metrics.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Christian Walko and Malte-Jörn Maibach "Flying a helicopter with the HoloLens as head-mounted display," Optical Engineering 60(10), 103103 (21 October 2021). https://doi.org/10.1117/1.OE.60.10.103103
Received: 12 April 2021; Accepted: 16 September 2021; Published: 21 October 2021