…tor well versed in facilitation. "What will it take to get into the 90s?" This is the positive way to change behavior. Using big data and AI helps you motivate pilots to do better. They are not competing with other pilots; they are competing with themselves to improve the way they fly.

There are five different buckets, or channels, of information that paint the picture. While each one is important, the actionable insights come when they are aggregated to tell you what is going on. These channels include flight data analysis, Line Operation Safety Audit (LOSA), air safety reports, instructor reports, and simulator telemetry. They are triggered by an undesired aircraft state (UAS) and help determine what happened and why it happened while identifying positive crew behaviors. Factored into this are the sample frequency and the opportunity for bias. We think we are moving to a continuous sample frequency, as some airlines are already pioneering this. Bias is always there, but some media carry greater risk for bias than others. However, by examining all the information in an aggregate way, bias can be mitigated to provide a better picture and a complementary perspective of what the information from all the different sources is telling us.
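As a rough illustration of how these channels might be pulled together around a single undesired aircraft state, the sketch below groups records from each source by event identifier. The channel labels, record fields, and the idea of a shared event identifier are assumptions made for illustration only; they do not describe the article's, any airline's, or CAE's actual data model.

from collections import defaultdict

# The five information channels named in the article. The record fields and
# the shared "uas_event_id" key are hypothetical, for illustration only.
CHANNELS = [
    "flight_data_analysis",
    "losa",
    "air_safety_reports",
    "instructor_reports",
    "simulator_telemetry",
]

def aggregate_by_uas_event(records):
    """Group records from all channels by the UAS event they describe."""
    picture = defaultdict(lambda: {channel: [] for channel in CHANNELS})
    for record in records:
        picture[record["uas_event_id"]][record["channel"]].append(record)
    return picture

def channels_reporting(event_view):
    """List the channels that contributed anything to one aggregated event."""
    return [channel for channel, records in event_view.items() if records]

An event seen by several independent channels carries more weight than one reported by a single source, which is the aggregation argument made above.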
Another value proposition is the fact that analyzing data helps operators compare and benchmark their training programs with others. Many airlines have been audited, so it is possible to compare information. Operators can compare the strengths and weaknesses of each area of data. As with putting a learner's performance into context, this system puts one operator into context with the rest of the industry, providing operators motivation to improve.

One fundamental of connecting the inner and outer loops is looking at the UAS, which comes from the simulator telemetry (CAE Rise), voluntary safety reports, ISMS, flight data monitoring, or an accident or incident. Detecting the UAS is the beginning of the process and must be accompanied by a process to find an effective mitigation. This process includes the creation of a safety action group, including the client and training department or third-party training providers, operations safety, and training program development. The mission of this nominal group technique is to develop a consensus on what happened, why it happened, and what to do about it. Was it a threat or error? Was knowledge or skill in play?

Data from other sources, especially the training device or simulator from the inner loop, is then correlated to determine the competency in question, the severity of the problem, and the probability it will happen again. Using a task, threat, and error-and-gap analysis, the safety action group must then determine the mitigating training deliverable: how does the operator train pilots going forward so the risk is minimized as much as possible?
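The article does not say how severity and probability are combined once they are determined. One common approach in safety management, shown below purely as an assumption, is a risk-matrix index in which the two are scored on small ordinal scales and multiplied; the scales, category names, and threshold here are generic placeholders, not CAE's or any operator's actual method.

# Hedged sketch of a generic severity-by-probability risk index.
# Scales, category names, and the mitigation threshold are assumptions.
SEVERITY = {
    "negligible": 1, "minor": 2, "major": 3, "hazardous": 4, "catastrophic": 5,
}
PROBABILITY = {
    "extremely_improbable": 1, "improbable": 2, "remote": 3,
    "occasional": 4, "frequent": 5,
}

def risk_index(severity, probability):
    """Return a 1-25 index; higher values argue for stronger training mitigations."""
    return SEVERITY[severity] * PROBABILITY[probability]

def needs_training_mitigation(severity, probability, threshold=10):
    """Illustrative cut-off for escalating an issue to the safety action group."""
    return risk_index(severity, probability) >= threshold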
Finally, the group establishes measures of effectiveness for the training changes and then delivers stakeholder reports so that effectiveness can be constantly monitored.

The most important factor in this process, and the reason for having all the stakeholders at the table, is thought diversity: the ability to look at the issue from different perspectives to establish a mitigation that takes all factors into consideration.

Data can also identify issues with human evaluators by comparing what the telemetry tells us with human analysis of what happened, why it happened, and what to do about it. Let us take a TCAS event as an example. Here we see the instructor graded the pilot with a less-than-satisfactory response 1.6% of the time, but the telemetry data, primarily from the simulator, graded the performance less than satisfactory 20.2% of the time. The big question here is why. Why is one data source so significantly different from another? By comparing the data between independent sources, we can increase confidence in grading quality driven by data rather than human judgment.
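A comparison like the 1.6% versus 20.2% figures above can be reduced to a simple rate check between the two grading sources. The sketch below is only an illustration: the grade labels, record counts, and the discrepancy threshold are assumptions, not values or methods taken from the article.

# Hedged sketch: how often do two independent sources grade the same
# manoeuvre as less than satisfactory, and do they disagree strongly?
def less_than_sat_rate(grades):
    """Fraction of graded events marked less than satisfactory."""
    if not grades:
        return 0.0
    return sum(1 for grade in grades if grade == "unsatisfactory") / len(grades)

def compare_sources(instructor_grades, telemetry_grades, ratio_threshold=5.0):
    """Flag a competency for review when the two sources diverge sharply."""
    instructor_rate = less_than_sat_rate(instructor_grades)
    telemetry_rate = less_than_sat_rate(telemetry_grades)
    diverges = (
        telemetry_rate > 0
        and telemetry_rate >= ratio_threshold * instructor_rate
    )
    return instructor_rate, telemetry_rate, diverges

# Rates similar to the TCAS example: 16 of 1,000 vs. 202 of 1,000 events.
instructor = ["unsatisfactory"] * 16 + ["satisfactory"] * 984
telemetry = ["unsatisfactory"] * 202 + ["satisfactory"] * 798
print(compare_sources(instructor, telemetry))  # (0.016, 0.202, True)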
Drilling down a little further, we found that the instructor missed critical information the machine caught. The instructor should have seen the autopilot off, the flight director off, and the flight guidance defaulted to speed mode when the flight director was selected off. The flight crew should have pitched the airplane up or down into the green arc.

It is very likely the evaluator saw the autopilot off and the pitching activity by the crew into the green arc on the TCAS escape guidance or vertical speed indicator. Investigating more closely, we found that the crew failed to turn the flight director off and failed to verify that the flight guidance went into speed mode. This particular aircraft is an Airbus, and with the autothrottle engaged with the flight director, the airplane did not go into speed mode, and the autothrottle system fought with crew input into the side stick controller during the TCAS resolution advisory escape maneuver.
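A machine-scored check of the kind described above might look something like the sketch below, which tests simulator telemetry samples recorded during a resolution advisory against the expected crew actions. The field names and pass criteria are assumptions made for illustration; they are not CAE Rise's actual telemetry schema or grading rules.

# Hedged sketch of an automated grading check for a TCAS resolution advisory.
# Each sample is one telemetry snapshot; all field names are hypothetical.
def grade_tcas_response(samples):
    """Return True when the recorded crew response meets the expected criteria."""
    autopilot_off = all(not s["autopilot_engaged"] for s in samples)
    flight_director_off = all(not s["flight_director_on"] for s in samples)
    speed_mode_selected = any(s["guidance_speed_mode"] for s in samples)
    pitched_into_green_arc = any(s["pitch_in_green_arc"] for s in samples)
    return (autopilot_off and flight_director_off
            and speed_mode_selected and pitched_into_green_arc)

In the case above, such a check would fail on the flight-director and speed-mode conditions even though an evaluator watching the pitch into the green arc might grade the response as satisfactory.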
(Continued on page 30)