In our business of flying, “The Perfect” is very much still the enemy of “The Good,” especially when it comes to attaining dependable, economically feasible and broadly available all-weather operations equipage. The “good” in focus here is enhanced vision systems (EVS).
After years of direct involvement in EVS development and application, I can say without qualification that today’s thermal image-based EVSs are very good. Remarkable, even. In helping pilots see through the murk and the dark, they instill confidence and enhance operational safety. For helicopter pilots landing at night at a remote location, an EVS can be the only means of spotting obstacles and ground personnel.
And if you are appropriately trained, and your aircraft is equipped with a certified EV sensor coupled to a head-up display (HUD), you can legally descend an additional 100 ft. or so below published precision approach minimums. This capability is permitted only if, at published decision height, the flying pilot can “see” the “runway” through the EVS image superimposed on the HUD. If, after descending, the runway is not visible to the pilot using normal unaided human sight, a missed approach is required.
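The decision logic described above can be sketched in a few lines. This is a deliberately simplified illustration of the operating rule, not actual avionics or regulatory logic; the function name, parameters and the treatment of decision height as a height above the touchdown zone are all assumptions made for clarity.

```python
# Simplified sketch of the EFVS lower-minimums rule described above:
# between published DH and 100 ft, the EVS/HUD image may substitute for
# natural vision; at or below 100 ft, the runway must be seen unaided.
# Names and parameters are illustrative, not from any real avionics API.

def efvs_decision(height_ft: float,
                  runway_in_evs: bool,
                  runway_unaided: bool,
                  published_dh_ft: float = 200.0) -> str:
    """Return the required action at the current height above touchdown zone."""
    if height_ft > published_dh_ft:
        return "continue"  # still above DH: keep flying the approach
    if height_ft > 100.0:
        # Between DH and 100 ft: the EVS image on the HUD may count as "seeing"
        return "continue" if runway_in_evs else "go_around"
    # At or below 100 ft: normal, unaided human sight is required
    return "continue" if runway_unaided else "go_around"

print(efvs_decision(150, runway_in_evs=True, runway_unaided=False))  # continue
print(efvs_decision(90, runway_in_evs=True, runway_unaided=False))   # go_around
```

Note that the EVS image only buys the crew the segment between DH and 100 ft; if the runway never appears to the naked eye, the approach still ends in a go-around.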
The ultimate goal — perfection, if you will — is the ability to safely conduct air and ground operations in zero-zero conditions as a matter of course. And no, EVS can’t deliver that. Yet.
EVS is currently available in three display configurations, each with a different acronym, price tag and certification. All offer improved situational awareness when visibility is reduced by darkness, weather or both and, as noted, one system provides lower minimums:
EVS: Enhanced Vision System. A stand-alone thermal imaging camera that sees infrared energy emitting, or radiating, from objects and forms a real-time video image that is displayed on an MFD or a dedicated video display screen. The system’s primary benefit is improved situational awareness. At night, an EVS eliminates the visual effects of darkness, turning night into day on the display and enabling the pilot to see and avoid clouds. During the day, the system enables the pilot to see through smoke, haze and smog.
EFVS: Enhanced Flight Vision System. A thermal imaging camera that sees infrared energy radiating from objects, coupled with and displayed on a HUD flight guidance system. The primary benefit of this more sophisticated arrangement is its approval for primary flight guidance in IFR flight and its ability to provide lower minimums. And it, too, enhances the pilot’s situational awareness.
CVS: Combined Vision System. A thermal imaging camera combined with synthetic imagery, whereby the real-time EVS depiction is presented as a translucent overlay on the database-derived synthetic visuals on the PFD. The main benefit of this combined system is the visual addition of transient obstructions in the approach and landing zone. Such a system can be regarded as a preliminary step toward a future “Verified Synthetic Vision System” accurate enough to allow precision landings followed by taxiing to the ramp in zero-zero conditions.
EVS, EFVS and CVS can all include sensors in addition to the IR sensor, such as visible light sensors. Data from each sensor is electronically “fused” into, and contributes to, the displayed image. This “fusion” of data from multiple sensors is a key component of advanced systems.
Today’s EFVSs are expensive — roughly $800,000+ — components in avionics suites but a welcome investment when you need an extra 100-ft. descent on the published DH — particularly if it’s at the end of a transoceanic flight, or there’s a high-value perishable, or chief of state, in back.
A simple EVS image on the MFD or stand-alone display can boost the confidence of an IFR-rated private pilot flying a light single or twin at night or in moderate IMC. On a clear, moonless VFR night, a straight EVS will turn night into day on the display, enabling the pilot to see the surrounding terrain, roads, buildings, etc., thus eliminating any night flying sweats. And it will turn a special VFR night into a special VFR day, and maybe add a little range to the reported SVFR visibility as an added benefit. The price for being able to see the otherwise unseeable ranges from about $25,000 to $130,000.
A New Way of Seeing
To better understand the practical performance expectations of current EVS technology and to appreciate the net operational improvements of future generation systems, it’s helpful to have a basic understanding of how IR imaging works and the limitations that physics imposes on imaging with this energy source.
To begin, thermal imaging works differently than visible light imaging — that is, the way we see.
Take a look at the accompanying illustration. This shows what portion of the electromagnetic (EM) spectrum our eyes register and the relationship of the other frequencies to each other. Yes, light waves are part of the spectrum, which also includes lower frequency radio and television waves, microwaves and radar, infrared (IR), visible and ultraviolet “light,” on up through X-rays, then gamma rays and, finally, cosmic rays.
Think of your eyes as mini-radar systems, tuned to this small “visible” portion of the EM frequency spectrum, registering and resolving into an image the streaming photons (visible light) reflecting off objects in the world. The frequency we call the visible portion of the EM spectrum reflects off most of the stuff on this planet — stuff we need to see to avoid, to survive and to thrive — everything from the Alps and Jessica Alba to airport runways. Accordingly, humans and other animals evolved eyes that “see” that portion of the spectrum.
But unlike radar, human eyes do not generate and project the energy that reflects back to create the image you see — our eyes rely on an outside source, such as the sun, landing lights or even starlight as amplified by night-vision goggles (NVG).
When something like a closing door or a light switch “turns off” the stream of reflected photons, or something like a window shade or fog blocks or scatters it, we can no longer see objects clearly: the light is absent, blocked, or scattered on its way to our eyes by haze, smoke or the fog’s tightly clustered water droplets.
By contrast, infrared energy, which occupies another portion of the EM spectrum, does not reflect off most stuff. Rather, it is absorbed by matter and then slowly radiates out from it. So, when the sun goes down and the infrared energy it continuously generates stops raining down on the now darkened area of earth, that energy absorbed in the rocks, trees, concrete, grass, cars and buildings radiates out, albeit at a different rate, according to the material’s temperature, density and molecular structure.
Of course, if you are looking at something that is generating its own heat — like, say, Jessica Alba — that object will glow like a beacon against the background of stuff radiating absorbed energy.
An infrared camera sees the energy difference of each surface and converts the minute temperature variance from the object’s various surfaces into an image. Different temperatures are usually depicted in shades of grey, representing rates of IR radiation emitted from the substances in the camera’s view.
There are three segments in the IR portion of the EM spectrum that are useful and, curiously, are fairly evenly spaced along the continuum. These segments are defined by their wavelengths in microns (μ): long-wave IR (8.0 to 14.0μ); mid-wave IR (3.0 to 5.0μ); and short-wave IR (1.0 to 1.5+μ). Long-wave IR (LWIR) and, to a slightly lesser degree, mid-wave IR (MWIR) are generally considered the best for imaging the world. Short-wave IR (SWIR) cameras have historically been too expensive, and export of the technology by U.S. manufacturers has been limited under International Traffic in Arms Regulations (ITAR), making them commercially difficult to deploy.
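A quick back-of-the-envelope calculation shows why LWIR suits imaging the everyday world: Wien’s displacement law says a body at temperature T radiates most strongly at a wavelength of roughly 2,898 μm·K divided by T. The sketch below applies that textbook relation; the specific temperatures chosen are illustrative.

```python
# Why LWIR is the band for terrain, buildings and people:
# Wien's displacement law gives the peak blackbody emission wavelength,
# lambda_peak = b / T, with b ~ 2898 micron-kelvins.

WIEN_B_UM_K = 2898.0  # Wien's displacement constant in micron-kelvins

def peak_emission_um(temp_k: float) -> float:
    """Peak blackbody emission wavelength in microns for a temperature in K."""
    return WIEN_B_UM_K / temp_k

# Terrain, runways and people sit near 290 K: the peak lands in LWIR (8-14 microns)
print(round(peak_emission_um(290), 1))
# Hot engine parts around 600 K peak in the MWIR band (3-5 microns)
print(round(peak_emission_um(600), 1))
```

The numbers fall squarely inside the long-wave and mid-wave bands the article defines, which is why those two segments do the heavy lifting in thermal imaging.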
None of the other IR frequencies separating those three can penetrate to the earth’s surface because gases that are common constituents of our atmosphere absorb the energy at their wavelengths. You may encounter the term “near IR” (NIR). This refers to the segment of energy from the eye’s cutoff at about 0.65μ to about 1.0μ. This energy is technically considered “visible light,” but our eyes are not tuned to react to it.
Infrared energy is blocked by glass, so the IR energy generated by a glowing tungsten filament in an incandescent light is blocked by the glass bulb enclosing the filament. Sometimes, if runway lights have been on long enough for their heat to migrate to the light’s assembly, that hot metal base will show up on a long-wave/mid-wave EVS display as a glowing light. The new light-emitting diode (LED) arrays now being installed at many airports emit visible light with almost no “heat,” so they are invisible to LWIR and MWIR sensors.
Neither LWIR nor MWIR will see, register or receive visible light. MWIR cameras can be “stretched” to see a little into the visible band but then become subject to “blooming,” a term describing an incursion into the field of view by a significantly stronger source of energy that causes the image to wash out details. You can liken it to the impact on your vision after a camera flash in your face.
Blooming occurs in a daytime approach through fog, where solar backscatter overwhelms the sensor, or when a bright light suddenly comes into the field of view and the EVS image processing software struggles to balance the exposure. The software is improving steadily, but image detail can still be compromised by blooming on short final, when you need it most.
Meanwhile, SWIR holds promise. Uncooled SWIR sensors are coming down in price. In addition, security schemes are being incorporated into the technology that preclude a sensor’s surviving removal from an EVS assembly. That makes the component viable from a cost/benefit standpoint as well as exportable under ITAR regulations.
Most EFVSs in service today utilize mid-wave sensors that are cryogenically cooled by liquid nitrogen refrigeration to very low temperatures to gain the sensitivity needed to see the minute temperature differences required for thermal imaging. When developed, this award-winning cryo-cooled IR sensing technology was the leading edge and the best-performing means of thermal imaging.
Compared to new, uncooled sensors, the cryo-cooled units are heavy and expensive, require pesky maintenance and need significant cooldown time before they can operate. Mind you, microbolometer (uncooled) sensor technology is on a par with the cooled sensors but is lagging in deployment due to the investment of time and money required to win certification.
Still, it was these cryo-cooled sensors that broke the certification barrier for landing credits — a big technological and regulatory leap forward — and cleared the way for a new generation of uncooled multispectral sensors that will weigh and cost less while delivering lower minimum approaches, and eventually — say, a dozen years or more from now — zero-zero ground and flight operations on a regular, reasonably reliable, and affordable basis.
A drawback of today’s EFVS is that, while it can reduce descent altitudes, it does not eliminate missed approaches entirely. Sometimes (maybe more than sometimes) the fog is just too thick for the EVS to pick up the runway. This performance issue, coupled with the burgeoning switch among airports (especially in Europe) from incandescent to energy-efficient, long-lasting LED lighting, threatens to further degrade the dependability of the single-spectrum MWIR-based EFVS.
For years, some airframe and sensor manufacturers invested in the current technology have lobbied the lighting committee to include a thermal generator in the specifications for new runway LED installations. There’s been resistance to the idea since “stuffing an electric hair dryer” in an LED array is at odds with the new lights’ economic and environmental benefits.
Current and Coming
Enhanced vision systems are clearly a boon to safety by providing pilots with enhanced situational awareness in general and, with EFVS, additional landing credits. The systems help reduce weather-caused delays and diversions.
The technology turns night into day — delivering a black-and-white real-time video stream of the world as you fly over it. It lets pilots see through haze, smog and smoke as well as mist and light fog. What exists is good, but what’s coming is even better as avionics engineers pursue two very important technological strategies.
The first is multispectral EVS sensor assemblies, with multiple cameras each discretely tuned to one of several frequencies on the EM spectrum. The information from each sensor will be electronically fused to deliver a composite image comprising the best data from each.
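The fusion idea above can be sketched with the simplest possible scheme: a confidence-weighted, per-pixel average of co-registered sensor images. Real EVS fusion pipelines are far more sophisticated (and proprietary); the tiny arrays, the weights and the function name below are invented purely for illustration.

```python
# Minimal sketch of multispectral image fusion, assuming the simplest
# approach: a weighted per-pixel average of co-registered grayscale frames.
# All values and weights here are invented for illustration only.

def fuse(images, weights):
    """Weighted per-pixel average of equally sized grayscale images (0-255)."""
    total = sum(weights)
    rows, cols = len(images[0]), len(images[0][0])
    fused = [[0.0] * cols for _ in range(rows)]
    for img, w in zip(images, weights):
        for r in range(rows):
            for c in range(cols):
                fused[r][c] += w * img[r][c] / total
    return [[round(v) for v in row] for row in fused]

# Two hypothetical 2x2 frames: a thermal channel and a short-wave channel
lwir = [[200, 40], [40, 200]]    # thermal: warm light bases glow, cool pavement is dark
swir = [[100, 180], [180, 100]]  # short-wave: different contrast through obscurants
print(fuse([lwir, swir], weights=[7, 3]))  # composite leans on the thermal channel
```

In a real system the weights would vary pixel by pixel and frame by frame, favoring whichever sensor is delivering the most usable signal at that moment, which is exactly the “best data from each” behavior the engineers are after.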
The second strategy is to develop advanced image processing software capable of teasing, in real time, every bit of useful data out of the sensor packs for the highest resolution and obscurant penetration. Early releases are being test flown now and are expected to provide good, or even significant, improvement over first-generation mid-wave cryo-cooled sensors. These systems will likely use the HUD for approaches while switching to the MFD for situational awareness and taxiing in low visibility.
Today’s combined EVS and SVS technology is a marvel but won’t deliver ideal zero-zero operational performance without further refinement. First, GPS, which helps drive the synthetic image, does not have the accuracy necessary for vertical guidance to touchdown. Second, the geographic and topographical databases at the heart of synthetic vision contain errors resulting from atmospheric interference and other impediments encountered during the space shuttle’s detailed scan of the earth’s surface.
Efforts are underway to apply the EVS component of CVS to correcting/eliminating the errors resident in SVS depictions of the world. That would open the door to a possible head-down, or head-up, display option of a synthetic world image that is exactly aligned with the real world, including the depiction of transient obstructions in the aircraft’s path — think Bambi — and enable approach to landing through touchdown, rollout and taxi to the ramp in zero-zero conditions.
Technically zero-zero operational capability exists today but not with economically feasible unclassified hardware. Progress is being made on both fronts and, ultimately, a HUD integrated with a Combined Synthetic Vision System, or a Combined Synthetic Vision System embedded in the windscreen will be key to perfection: landing an FAR Part 25 transport or medical helicopter in zero-zero, as though conditions were clear and a million. B&CA
To see a video on EVS, click here in the digital edition of B&CA or go to bcadigital.com/evs_gulfstream.