Promoted by Penten
There are two sides to meeting the AI threat. The first is recognising we need to grow Australia’s own AI capability. The second is in ongoing development of AI-specific countermeasures.
The AI Advantage
The AI threat, discussed for decades in science fiction, is finally closer to reality. A demonstration of this future AI threat took place in August this year, in the final of the AlphaDogfight Trials.
An F-16 pilot, a graduate of the prestigious US Air Force weapons instructor course, known only by his call sign "Banger", competed in a series of simulated dogfights against an AI. Despite Banger's years of experience, the AI won all five encounters.
The AI advantage will be critical for Defence. The same processing speed that allows an AI pilot to dominate a dogfight can also help war fighters navigate the overload of data, access crucial information and make more precise and timely decisions.
We know that machines will soon be able to capture and process information much faster than humans, and can compute tirelessly and without distraction. Working together, they enhance other machines to deliver very precise effects, at altitudes, speeds and depths beyond human reach, and with smaller footprints and lighter logistics demands.
But having an AI advantage is about more than simply over-matching a counterpart's decision timeliness. Machines are also capable of making mistakes and of being deceived. As AI becomes a key enabler of decision-making and targeting, gaining an advantage also means developing the knowledge to target the systemic weaknesses inherent in machine thinking, and the capability to actively degrade the adversary's AI.
AI Weaknesses
AI does have some known weaknesses.
Commercially available vision products, for example, are central to many autonomous systems and one of the primary means by which machines orientate, detect obstacles and navigate through complex environments.
Modern deep-learning-based vision systems have proven far more accurate than earlier, feature-based approaches, but have been shown to be vulnerable to subtle manipulation.
Small alterations to traffic signs can fool AI systems into misinterpreting them. Researchers demonstrated this by placing stickers on a speed limit sign, causing a driverless car to misinterpret it as a stop sign and halt in the middle of the road.
Real systems do, of course, use multiple sensors, but some of these can also be fooled. Researchers at the University of Michigan, for instance, tricked a Light Detection and Ranging (LiDAR) autonomous-driving sensor system into perceiving a non-existent object in front of it, triggering its collision avoidance.
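Attacks like these are usually formalised as adversarial perturbations: small, deliberate shifts of an input in the direction that most changes the model's output. A minimal sketch, using a toy linear classifier with made-up weights (not any real vision system), shows how little movement in the input is needed to flip a decision:

```python
import numpy as np

# Toy linear "classifier": predict class 1 if w.x > 0.
# All weights and inputs here are illustrative made-up values.
w = np.array([0.5, -1.0, 0.8, 0.3])
x = np.array([1.0, 0.2, 0.5, -0.4])  # clean input

def predict(x):
    return int(w @ x > 0)

clean = predict(x)  # score = 0.58, so class 1

# Gradient-sign perturbation: nudge every feature by a small epsilon
# against the sign of the gradient of the score with respect to the
# input (for a linear model, that gradient is simply w).
epsilon = 0.3
x_adv = x - epsilon * np.sign(w)  # lowers the score by epsilon * sum(|w|) = 0.78

adv = predict(x_adv)  # score = -0.20, so class 0: the decision flips
```

The perturbation here moves no feature by more than 0.3, yet the classification changes; in image terms, this corresponds to pixel changes far too small for a human observer to notice.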
Camouflage and decoys
By orientating our camouflage priorities towards the AI threat, we can take advantage of these known weaknesses to reduce the impact of adversary AI.
Consider the sniper. The sniper appears all but invisible to the naked eye, yet has little protection against other threats, such as the noses of sniffer dogs or infrared technology.
The potential AI threat should rank high amongst design and utility considerations in preparing our security forces. Perhaps, to highlight the point, we may see tanks painted bright orange in an attempt to fool AI sensors, as human targeting may no longer be perceived to be the most critical threat in near-peer conflict.
Decoys will also be critical to countering the threat. They can provide advantages by amplifying an adversary's confusion, overwhelming its decision-making engines, or appearing more alluring to an AI-based sensor than the real target.
Decoys can also exploit the fact that even a highly capable AI-led sensor is limited by physical constraints, such as ammunition, and can be used to exhaust the strike capacity of the adversary.
These types of limitations and weaknesses are the subject of adversarial machine learning research. Adversarial manipulation can be mitigated with interventions such as including adversarial examples in training datasets. Systems can also be designed with multiple complementary sensor subsystems that are robust to inconsistencies in perception.
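The first mitigation mentioned above, training on adversarial examples, can be sketched in a few lines. The sketch below is illustrative only: it trains a simple logistic-regression model on synthetic data, and at each step augments the batch with gradient-sign-perturbed copies of the inputs so the model learns to resist exactly that kind of nudge.

```python
import numpy as np

# Synthetic, linearly separable data (illustrative only).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))
true_w = rng.normal(size=8)
y = (X @ true_w > 0).astype(float)

w = np.zeros(8)
lr, eps = 0.1, 0.2

for _ in range(200):
    # For logistic regression, the gradient of the loss with respect
    # to the input x is (p - y) * w.
    p = 1 / (1 + np.exp(-(X @ w)))
    X_adv = X + eps * np.sign((p - y)[:, None] * w)  # worst-case nudge
    # Train on the union of clean and adversarial examples.
    X_all = np.vstack([X, X_adv])
    y_all = np.concatenate([y, y])
    p_all = 1 / (1 + np.exp(-(X_all @ w)))
    w -= lr * X_all.T @ (p_all - y_all) / len(y_all)

acc = ((X @ w > 0).astype(float) == y).mean()  # clean accuracy after training
```

Real adversarial training on deep vision models follows the same pattern, generating perturbed inputs inside the training loop, but with far stronger attacks and at far greater computational cost.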
We have every reason to believe, as is the case with any influential defence technology, new adversarial methods will be discovered to counter yesterday’s innovation.
Growing Sovereign Counter-AI Capacity
We've previously highlighted the importance of developing a sovereign AI capability. A counter-AI capability is just as important.
There are two sides to meeting the AI threat. The first is recognising we need to grow Australia’s own AI capability to meet adversaries on equal or superior terms. The second is in ongoing development of AI-specific countermeasures, concealment and information management techniques to account for the appearance of AI in an adversary technology stack and exploit the limitations and weaknesses of AI-based technologies.
Of course, as both AI and counter-AI technologies improve, our own robustness to adversary counter-AI must improve as well.
Preparing the Defence Industrial base to meet the demand and opportunities of counter-AI is likely to require focus on a number of enabling areas, such as talent, AI engineering skills, national datasets and greater encouragement for adversarial technology research.
Growing a counter-AI capability will require people with the same in-demand skills and experiences as those desperately needed for AI. Unfortunately, declining rates of engagement in Science, Technology, Engineering and Mathematics (STEM) have resulted in a severe shortage of local talent in these fields.
Attracting, building and retaining this talent pipeline will require substantial support, including a focus on workforce diversity and early encouragement through access to world-class education and research.
DST Group and Data61 actively support programs to deliver the necessary expertise in sovereign AI algorithm development, critical evaluation and customisation of imported solutions. These programs could lead to research in adversarial technologies, both from the perspective of counter-AI and of robustness to adversary counter-AI. Greater investment and participation will, however, be required from other sources to keep pace with the threats.
Another crucial, and sometimes neglected, aspect of AI is the transition from academia to practice. Often military activities involve previously unseen environments, imperfect data and processing limitations. The practical delivery and engineering of AI solutions is as important as the deep theory.
Access to vast amounts of relevant data is critical for AI research and development. To develop strong counter-AI, we will also need resources for ongoing adversarial testing of the latest AI technologies to discover their weaknesses.
While it is likely AI will be a major focus area, we will need to discuss how counter-AI can also be supported amongst the competing priorities.
Conclusion
The AI arms race will be critical to providing an advantage in the information-dominated warfare of the future, but it is not just about outpacing the development of decision-making systems. Investment in the capability to target, degrade and counter the adversary's AI is also needed, and is arguably just as important as AI creation.
Authors: Ben Whitham, Penten and David Liebowitz, Penten
Penten is an Australian cyber company focused on innovation in secure mobility, applied artificial intelligence (AI) and tactical communications security.
Penten’s AltoCrypt family of secure mobility solutions enables mobile secure access to classified information for government, providing government workers with the accessibility and flexibility of a modern workplace.
Penten’s Applied AI business unit creates realistic decoys using a novel combination of machine learning and artificial intelligence to detect and track sophisticated cyber adversaries.
Tactical Communications Security is Penten’s latest business unit. This capability has adapted Penten’s high assurance secure mobility solutions to create highly complex, but simple to use secure communications technology for the tactical environment.
In 2019 and 2020 Penten was awarded Cyber Business of the Year at the Australian Defence Industry Awards.
For more information visit www.penten.com