Autonomous systems are widely seen as the way forward in defence, but while these capabilities are advancing in leaps and bounds, the human controllers of these systems remain irreplaceable for now.
With many viewing autonomy as the future of land, aerospace and maritime warfare, the way humans interact with these systems bears not only on their effectiveness, but also acts as a safeguard for their ethical use.
However, these controllers are "only" human, and artificial intelligence (AI) is also being used to help operators make the right decision in situations that may be split-second and a matter of life or death.
Recently, Defence Connect was joined by Alexander Robinson and Patrick Nolan, representatives from Seeing Machines, a world leader in driver-machine interaction.
The company uses human factors science to create artificial intelligence technology that monitors where the driver's attention is focused, and intervenes when necessary.
In this edition of On Point, Nolan, general manager of Seeing Machines, and Robinson, the company's flight simulation and training lead, discuss the capabilities that their work enables.
The impression of the way autonomous systems are advancing is that everyone's trying to replace the pilot or the driver and get machines to do stuff. Is that the way it works, or is it very different?
Nolan - Oh look, I think that's certainly a path that's occurring. But I guess our focus is ... If you look at things like a lot of the autonomous cars and semi-autonomous cars, one of the things that still has to happen, is that if the car has to give control back to the driver, the car needs to know firstly whether the driver's in the seat.
Are they alert? Are they aware? Are they looking at the right thing? So it's our technology, certainly in the automotive space, that picks up on that stuff, and really then ... I guess sets up a far safer process to ensure there's an interaction between the human and the car. And we're doing very similar things in trucks, and to a lesser extent in aeroplanes, where it's more around supporting pilot training and those sorts of things.
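As a loose illustration of the kind of handover check Nolan describes, a semi-autonomous vehicle might gate the return of control on a few driver-state signals. The class, fields and logic below are a hypothetical sketch, not Seeing Machines' actual interface.

```python
# Hypothetical sketch of a handover readiness check: before a semi-autonomous
# car returns control, it confirms the driver is present, alert and looking
# at the road. Field names are illustrative only.
from dataclasses import dataclass

@dataclass
class DriverState:
    in_seat: bool
    eyes_open: bool
    gaze_on_road: bool

def ready_for_handover(state: DriverState) -> bool:
    # Control is only handed back when every condition is satisfied.
    return state.in_seat and state.eyes_open and state.gaze_on_road

print(ready_for_handover(DriverState(in_seat=True, eyes_open=True, gaze_on_road=False)))  # False
```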
How do the Seeing Machines utilities work? If something I'm doing is incorrect, or if I'm falling asleep, does it give signals?
Nolan - Basically there's a real time component and a data capture component as well. So if you do have a microsleep in a truck, we'll set off an alarm, we'll shake the seat, and then we'll send real time information back to a dispatch. So they understand the state of the driver or of their multiple crews. But then there's also that ability to actually capture the data so that, I guess, the safety officers can start to see, what are the times where people are more fatigued? How can they support and optimise safety through those mechanisms?
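A minimal sketch of how such a two-branch pipeline could be structured, using invented names and thresholds rather than anything from Seeing Machines' real system: every detection is logged for later fatigue analysis, and detections that cross a microsleep threshold also trigger the in-cab alarm, seat shaker and dispatch alert.

```python
# Hypothetical driver-monitoring event pipeline: a real-time branch (alarm,
# seat shaker, dispatch alert) plus a data-capture branch for later fatigue
# analysis. Names, thresholds and endpoints are illustrative only.
from dataclasses import dataclass
from datetime import datetime
from typing import List

MICROSLEEP_THRESHOLD_S = 1.5  # assumed eye-closure duration treated as a microsleep

@dataclass
class DriverEvent:
    driver_id: str
    vehicle_id: str
    eye_closure_s: float
    timestamp: datetime

event_log: List[DriverEvent] = []  # captured data for safety officers to review later

def sound_cabin_alarm() -> None:
    print("ALARM: audible warning in cabin")

def shake_seat() -> None:
    print("ACTUATOR: seat vibration triggered")

def notify_dispatch(event: DriverEvent) -> None:
    # In a real system this would transmit to a dispatch/telemetry endpoint.
    print(f"DISPATCH: driver {event.driver_id} microsleep at {event.timestamp:%H:%M:%S}")

def handle_frame(event: DriverEvent) -> None:
    event_log.append(event)  # data-capture branch: always record
    if event.eye_closure_s >= MICROSLEEP_THRESHOLD_S:  # real-time branch
        sound_cabin_alarm()
        shake_seat()
        notify_dispatch(event)

handle_frame(DriverEvent("D-042", "TRUCK-7", 2.1, datetime.now()))
```

The split between the real-time branch and the logging branch mirrors the two components Nolan describes: immediate intervention for the driver, and aggregated data for safety officers afterwards.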
How did this utility cross over to aviation? Was it always planned, or did it happen organically?
Robinson - I joined Seeing Machines about two years ago and I joined into the fleet side of the business. So, I knew that there was an aviation side of the business. I knew Patrick. And the goal in joining Seeing Machines was always to join into the aviation area, which existed when I first learned about Seeing Machines four or five years ago. But it was very early stage. It was R&D, it was prototyping, but it was in an area that I'm very interested in and passionate about, being aviation.
Those initial 18 months in fleet were spent on what Patrick was just mentioning: working on this fleet solution and how we sell into customers to help their drivers be safer, help their fleets be safer, help the drivers not fall asleep, help their drivers not be distracted.
When the opportunity came up in aviation about six months ago, we'd been talking about it for a while, but it was one that I'd been across. I'd stayed abreast of what's happening in aviation. I had a good network of former air force colleagues as well as the aviation world, aviation industry and defence industry in general. So it was a great opportunity to take this technology that was proven in other industries and try to bring that across to aviation, where the technology was fairly mature but the problem-solution fit hadn't yet been completely defined.
Nolan - One of the things that we feel like we're adding a lot of value in is giving the instructors some new data around what the pilot is looking at from the very start. When we first put our kit in, doing work with the likes of Emirates, they saw what we could do with scan behaviour and their comment was, "OK, that's great. We now know where a pilot looks." But the most important thing for them was context. What they wanted to know is, where are they looking when it really matters? Because all of the data that they need is available to them. Are they actually using and engaging with it?
I think firstly to support the next generation of both aircraft and pilots, we have to support the instructors so that they get a much better understanding of the human. And it's really interesting watching scan behaviour change when the workload or the pressure goes up. And certainly in some of the new aircraft it'll be a different type of pressure or a different type of workload than it may have been in the past.
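To make the idea of "context" concrete, here is a rough sketch of how gaze samples might be summarised by flight phase so an instructor can see where attention went when it mattered. The phases, instrument names and data format are invented for illustration and are not Seeing Machines' actual output.

```python
# Hypothetical sketch: summarising pilot scan behaviour by flight phase,
# so an instructor can see where a pilot looked "when it really matters".
from collections import defaultdict

# Each sample: (flight_phase, instrument_looked_at, dwell_time_seconds)
gaze_samples = [
    ("approach", "airspeed", 0.8),
    ("approach", "outside", 2.4),
    ("approach", "altimeter", 0.5),
    ("cruise", "navigation_display", 1.9),
]

def dwell_by_phase(samples):
    # Accumulate dwell time per instrument, grouped by flight phase.
    summary = defaultdict(lambda: defaultdict(float))
    for phase, instrument, dwell in samples:
        summary[phase][instrument] += dwell
    return summary

for phase, instruments in dwell_by_phase(gaze_samples).items():
    total = sum(instruments.values())
    # Print each instrument's share of attention within the phase.
    print(phase, {k: round(v / total, 2) for k, v in instruments.items()})
```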
How does it all work within a simulator and a cockpit?
Robinson - For context, there are other companies out there that are doing eye tracking, face tracking.
But the difference and the value differentiator between Seeing Machines and those other companies is, we've had 18 years of doing this in real world environments, and the real world's challenging.
A laboratory, a controlled environment, it's quite easy to do this. But in the real world you've got dust, you've got lighting conditions, you've got head movements, you've got different operators, different pilots. Our human factors team helps address that, our engineers help address that, and what that translates to is a sensor in the cockpit about the size of a 20 cent piece, with two infrared emitters each about the size of a pen lid.
They capture the data. They send it to an onboard processor, and then we send that to a server. Now, the server can be the client's server, it can be a defence server, it can be our server. That's up to the architecture the client would like, and we're pretty flexible with that to address whether it's security, whether it's privacy, whatever they need.
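A simplified sketch of that data path is below, with hypothetical endpoints and function names standing in for whatever architecture a client actually chooses; nothing here reflects Seeing Machines' real interfaces.

```python
# Hypothetical sketch of the configurable data path described above:
# sensor frames -> onboard processing -> a server chosen by the client
# (their own, a defence-controlled one, or the vendor's).
from dataclasses import dataclass

@dataclass
class GazeRecord:
    pilot_id: str
    target: str          # e.g. "primary_flight_display"
    dwell_s: float

# Destination is a deployment decision, not hard-coded into the sensor.
SERVER_ENDPOINTS = {
    "client": "https://training.example-airline.local/gaze",
    "defence": "https://gaze.defence.internal/ingest",
    "vendor": "https://example-vendor-cloud.local/ingest",
}

def process_onboard(raw_frame: dict) -> GazeRecord:
    # Stand-in for the onboard processor that turns camera frames into gaze data.
    return GazeRecord(raw_frame["pilot_id"], raw_frame["target"], raw_frame["dwell_s"])

def forward(record: GazeRecord, deployment: str) -> None:
    endpoint = SERVER_ENDPOINTS[deployment]
    # A real system would transmit over an authenticated, encrypted channel.
    print(f"Sending {record} to {endpoint}")

forward(process_onboard({"pilot_id": "P-01", "target": "airspeed", "dwell_s": 0.7}), "defence")
```

Keeping the destination configurable is what allows the same onboard hardware to satisfy different security and privacy arrangements, as Robinson notes.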
The full Defence Connect Podcast with Seeing Machines general manager Patrick Nolan and flight simulation and training lead Alexander Robinson can be found here.