Uber's self-driving car. A lot of people are talking about it, but the underlying technology actually got its start right here in the aerospace industry.
Planes have been able to "fly" themselves for years with the help of autopilot. Our traffic collision avoidance system (TCAS) and our enhanced ground proximity warning system (EGPWS) have long kept planes from colliding with other aircraft and terrain.
We have been thinking carefully about how humans are going to work with this type of technology. Although most people are on the fence about flying on a pilotless plane, we believe the key is striking the right balance between keeping pilots aware and reducing their workload with automation tools. Relying on artificial intelligence (AI) and machine learning will help reduce the number of aircraft incidents by giving pilots more support — think of it like adding another co-pilot to the cockpit.
So what exactly are these tools and how will they impact you? Well, here are just a few examples.
Speech Recognition: Say Goodbye to Language Barriers
Right now, we are working on a robust speech translation tool that will help pilots overcome communication obstacles by transcribing onto a screen what the pilot is saying to air traffic control, or vice versa. Providing the information on a screen will allow pilots to refer to it when needed and simplify the takeoff and landing process, because pilots will have visual confirmation of the instructions given. In the long run, such technology will also help prevent hazardous runway incursions, like when actor Harrison Ford mistakenly landed on a taxiway instead of his assigned runway in February of this year.
This tool will also recognize different dialects and accents. Although English is the standard language of aviation, a speaker's dialect or accent can make verbal instructions difficult to understand. Our speech recognition technology will be able to distinguish what people are saying, despite their dialect or accent, and transcribe it. This will save pilots time and help eliminate confusion.
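One way to picture the final step of such a transcriber is fuzzy-matching a noisy transcription against a library of standard phraseology, so only confident matches reach the pilot's display. Here is a minimal Python sketch of that idea; the phrase list, function name, and similarity cutoff are all invented for illustration and are not details of the actual product.

```python
from difflib import get_close_matches
from typing import Optional

# Hypothetical library of standard ATC phrases a transcriber might
# match against; the real system's vocabulary is far larger.
STANDARD_PHRASES = [
    "cleared for takeoff runway two seven",
    "cleared to land runway two seven",
    "hold short of runway one eight",
    "taxi to runway one eight via alpha",
]

def normalize_instruction(transcribed: str) -> Optional[str]:
    """Map a possibly garbled transcription to the closest standard phrase.

    Returns None when no candidate is similar enough, so the display
    shows only confident matches.
    """
    matches = get_close_matches(transcribed.lower(), STANDARD_PHRASES,
                                n=1, cutoff=0.6)
    return matches[0] if matches else None

# A transcription distorted by accent or radio noise still resolves:
print(normalize_instruction("cleered for takeof runway too seven"))
# prints "cleared for takeoff runway two seven"
```

In a real avionics context the matching would be far more sophisticated, but the principle — snapping uncertain audio to known phraseology — is the same.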
Mic'd Up: Mapping Out Noises in the Cockpit
Some other exciting exploratory work we're doing with AI has to do with cockpit sounds. It involves installing a microphone in the cockpit and learning the distinct sound every switch or button makes. This information is incredibly useful to investigators as they determine what happened in a crash.
After an aircraft incident, regulatory partners, airlines and others listen to the flight recorder, or black box, to try to piece together the timeline of events.
However, these recordings can be incredibly difficult to understand because of background noise. With the help of AI and machine learning, we could play that recording back and identify every button pressed and when it happened.
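A toy sketch of the idea: if each switch has a known acoustic signature, cross-correlating a recording against those signatures reveals where each sound occurs, even under noise. Everything here — the sample rate, the synthetic "clicks", the switch names, the detection threshold — is invented for illustration, not how our system actually works.

```python
import numpy as np

SAMPLE_RATE = 8000  # Hz; illustrative value only

def make_click(freq, duration=0.02):
    """Synthesize a short tone standing in for one switch's acoustic signature."""
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    return np.sin(2 * np.pi * freq * t)

# Hypothetical signature library: each switch makes a distinct sound.
SIGNATURES = {"gear_lever": make_click(500), "flap_switch": make_click(1500)}

def identify_events(recording, threshold=5.0):
    """Return (switch_name, sample_offset) pairs where a signature's
    cross-correlation with the recording peaks above the threshold."""
    events = []
    for name, sig in SIGNATURES.items():
        corr = np.correlate(recording, sig, mode="valid")
        peak = int(np.argmax(corr))
        if corr[peak] > threshold:
            events.append((name, peak))
    return sorted(events, key=lambda e: e[1])

# Simulate a noisy cockpit recording with a gear-lever click at sample 2000.
rng = np.random.default_rng(0)
recording = 0.1 * rng.standard_normal(SAMPLE_RATE)
recording[2000:2000 + 160] += make_click(500)
print(identify_events(recording))
```

Running this flags only the gear lever, at roughly the offset where its click was inserted; the flap switch's signature never correlates strongly enough to fire.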
Light Detection and Ranging: Scanning the Air
Finally, my team is also using a technology called light detection and ranging (LiDAR), which autonomous car companies are also using to detect and avoid surrounding vehicles and other obstructions. LiDAR uses a laser to scan and map out approaching shapes to determine their location. Why is this interesting? Well, we are using this laser to measure the condition of the air ahead of an aircraft. It essentially sees upcoming external environments, and then calculates how that situation could impact various aircraft systems.
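At its core, LiDAR ranging converts a laser pulse's round-trip travel time into a distance: the pulse goes out and comes back, so the one-way range is half the round-trip path. A minimal sketch of that arithmetic (the function name and timing value are illustrative, not from our system):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def range_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a LiDAR return in metres: the pulse travels out and
    back, so the one-way range is half the round-trip path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A return arriving 20 microseconds after the pulse left corresponds to
# a target roughly 3 km ahead of the aircraft.
print(round(range_from_round_trip(20e-6)))  # ~2998 m
```

Real airborne LiDAR adds beam steering, many returns per pulse and heavy signal processing, but every range it reports starts from this time-of-flight relationship.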
For example, if a plane were to fly by a volcano, the engine could ingest ash. Once ingested, the ash could transform into glass and cause significant damage.
The laser is able to study the approaching air quality, so a pilot can tell if he or she is flying into ash, ice or other detrimental conditions.
Also, blending this LiDAR technology with our existing weather solutions, such as Connected Weather Radar, can help pilots make the safest decisions based on surrounding weather patterns and steer clear of dangerous or severe weather.