As automated systems evolve, they are becoming increasingly capable of making substantial decisions. Michael Dorneich says giving these systems more responsibility and authority can improve safety and performance because their actions are tailored to the particulars of the current situation.
Because of that enormous potential, Dorneich, associate professor of industrial and manufacturing systems engineering, studies how these sophisticated systems and humans interact and how that relationship affects human performance.
“I look at what we call adaptive systems,” he says. “With the ability to assess their own situation and take initiative, these systems can do a lot of good in routine circumstances.”
But the systems have major limitations to address. Dorneich explains that while humans are much better at situational awareness than automated systems, operators are sometimes inundated with so much data that they cannot make the best decisions.
“Information automation—a relatively new field that in theory delivers the information an operator would need to know given a certain situation—is adding a new dimension to human factors engineering,” he says. “We have to figure out how to study and measure things that are a bit more complex. For instance, we might have to determine how compelling a system is—does the way it looks or shares information foster too much trust?”
Add to that the possibility that these systems may have capabilities that a person might not realize, and Dorneich has an interesting research opportunity on his hands.
He’s exploring information automation with the Federal Aviation Administration project “Characterization of Flight Deck Information Automation Issues.”
The project will examine platforms that are saturated with data, helping the FAA develop an understanding of information automation systems and identify the major issues operators could experience.
“Anything in avionics has to be certified very rigorously, but there are other technologies, like flight bags and laptops, that also contain information that pilots use,” Dorneich says. “With so many new systems that have different levels of reliability and accuracy, the FAA needs to know the right questions to ask when evaluating a new technology for certification.”
He hopes to help develop a negotiated set of metrics that will give the FAA a means of compliance for new technologies and a way to measure that compliance.
The work requires an understanding of trust—both the risk of over-trusting a system and becoming complacent, and of under-trusting it and not fully utilizing the technology—as well as the workload demands that information automation places on humans.
“We have to look at the impact of automating all the easy, routine tasks and ask ourselves if humans will be able to manage the complex tasks that remain,” Dorneich says, adding that automation can sometimes make things more complicated.
“When you are dealing with automated systems, operators might not be fully aware of what the system is doing, and a lot of times that work is being done silently, behind the scenes. If the automation reaches its limit and kicks control back to the operator, the person might not have enough understanding of the situation to immediately recover.”
Dorneich will conduct evaluations of information automation this year and is considering building a flight simulator for teaching and research, in collaboration with the Department of Aerospace Engineering, to provide a venue for further assessing his work.
Long term, he wants to take his ideas from theoretical to practical: once it is possible to characterize and measure human awareness of different types of automation, designers can work those insights into product development.
Insight from this type of research could also make products safer by building in mechanisms that prevent the kinds of miscommunication between systems and humans that can cause accidents.
“What we’re really trying to do is create automation for computer systems or similar technologies that are tailored to individual needs based on a situation rather than designing something that will fit everyone at all times,” he explains. “We want information to be tailored, reactive, and adaptive to increase efficiency and safety. To get there, we need to understand the human factors involved.”