BAE Systems is seeking to make the jet fighter cockpit of tomorrow a much simpler place by replacing conventional instruments and controls with a virtual reality system. This "wearable cockpit" would use artificial intelligence and eye-tracking technology to allow pilots to control their aircraft simply by looking and gesturing.

The vision is that instead of a complex array of dials, touchscreens, buttons, and knobs, the sixth-generation fighter's interior will be dominated by wide expanses of blank plastic panels, with only a few of the most vital readouts and controls available.

That is, until the pilot puts on their helmet and turns it on. Then the panels, canopy, and even the pilot's person will be festooned with readouts and controls designed to quickly provide critical information and respond in the most efficient way possible. It will also be a cockpit that can be reconfigured as easily as the home screen on a smartphone.

"In terms of future concepts, we are looking at what we are calling a 'wearable cockpit'," says BAE Lead Technologist Jean Page. "Here, you remove many of the physical elements of the cockpit and replace them with a virtual display, projected through the helmet. Essentially, it's a software-only cockpit that's upgradeable, adaptable and reconfigurable.

"In such a world, we need to think about what controls are critical to the pilot and then make them easier to manage. Eye-tracking gives you the option of looking at something to highlight it and then making a gesture to 'press' a button, rather than having a series of physical buttons on the aircraft."
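The look-then-gesture interaction Page describes can be sketched as a simple per-frame loop: gaze dwelling on a control highlights it, and only a highlighted control responds to the "press" gesture. The following is a minimal illustrative sketch; all names and the dwell threshold are assumptions, not details of BAE's actual system.

```python
# Hypothetical sketch of the gaze-plus-gesture selection loop.
# The class name, field names, and DWELL_SECONDS threshold are
# illustrative assumptions, not BAE implementation details.

DWELL_SECONDS = 0.4  # assumed dwell time before a control highlights

class VirtualControl:
    def __init__(self, name):
        self.name = name
        self.highlighted = False

    def press(self):
        return f"{self.name} activated"

def update(control, gaze_on_control, dwell_time, gesture_detected):
    """One frame of the interaction loop.

    Highlight a control once the pilot's gaze has dwelt on it long
    enough; only a highlighted control responds to the 'press' gesture,
    so a stray hand movement cannot trigger an unlooked-at control.
    """
    control.highlighted = gaze_on_control and dwell_time >= DWELL_SECONDS
    if control.highlighted and gesture_detected:
        return control.press()
    return None
```

Requiring both gaze dwell and a deliberate gesture is one plausible way to get the safety of a physical button without the hardware: neither signal alone fires the control.

```python
radar = VirtualControl("radar-mode")
update(radar, gaze_on_control=True, dwell_time=0.5, gesture_detected=True)
# returns "radar-mode activated"
```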

Developed as part of the Tempest concept fighter, such a cockpit would have a number of advantages. Some of these would be quite obvious – fewer physical readouts and controls mean less material, less weight, and lower costs to build and maintain. In addition, a virtual cockpit could be modified by simply tweaking the software, and could even be altered in flight to match the mission.

Such a cockpit could even learn, allowing engineers to make it more efficient by, for example, ensuring a warning light isn't placed on the pilot's left in a situation where they're more likely to be looking to the right. The result would be cues that are easier to spot and easier to react to.
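One simple way to realize the adaptive cue placement described above is to track where the pilot's gaze has recently dwelt and place new warnings in the most-viewed region. This is an illustrative sketch under that assumption; the region names and the plain frequency count are not from BAE.

```python
# Hypothetical illustration of the 'learning' cockpit idea: place a
# warning cue in the display region the pilot has looked at most, so it
# is likely to be seen quickly. Region names are illustrative.

from collections import Counter

def best_cue_region(gaze_samples):
    """Return the region with the most gaze samples.

    gaze_samples is a list of region labels (e.g. 'left', 'right')
    recorded over a recent time window.
    """
    if not gaze_samples:
        return "center"  # assumed safe default with no gaze history
    return Counter(gaze_samples).most_common(1)[0][0]
```

For example, `best_cue_region(["right", "right", "left", "right"])` returns `"right"`, so a warning raised in that situation would be drawn on the pilot's right rather than the left.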

"The really clever bit will be that, based on where the pilot is looking, we can infer the pilot's goal and use intelligent systems to support task performance and reduce the pilot's workload," says Page. "We want to do it in a way that doesn't always ask for permission, because that would get very annoying very quickly. But equally, it is essential that it is always evident to the pilot what task the intelligent system is performing."

Source: BAE Systems