The development of autonomous weapons technologies in the military domain is being heralded by academics and analysts as the third revolution in warfare, after gunpowder and nuclear arms, and rapidly increasing autonomy in weapons is well underway.
Australia is among the countries leading the charge, alongside the United States, Britain, Russia, China, Israel, India and South Korea.
A range of aerial, land and underwater systems with autonomous capabilities are being developed and deployed, including in current conflicts. Weapons that would operate without human control over the selection of targets and the decision to attack, referred to as fully autonomous weapons or “killer robots”, are only a few steps away from current reality.
Such weapons pose legal, ethical and security risks, as decisions over who to kill are delegated to machines. Australia is, however, embracing this rapidly developing industry with barely any public discussion.
Australian company DefendTex provided 300 of its Drone 40 loitering munitions to Ukraine last year. Mainstream outlets, including the New York Times and Financial Times, have reported on the use of drones, loitering munitions and other autonomous capabilities in Ukraine, and on how the conflict has become a vital testing ground for Western weapons.
This is mostly framed as a positive development with only limited criticism.
DefendTex is just one of many Australian companies in an expanding landscape of AI development for defence, involving private industry and universities. To foster collaboration between these sectors, the former Coalition government established Defence Cooperative Research Centres from 2017 as part of its Next Generation Technologies Fund.
The inaugural centre, Trusted Autonomous Systems, based in Brisbane, was awarded an initial $50 million investment from the government for its first seven years.
Universities are active in projects facilitated by Trusted Autonomous Systems, as well as in other initiatives in partnership with Australian defence or arms companies that focus on autonomy and related capabilities. For instance, both the Defence Science Institute, which connects defence, industry and Victorian universities, and the New South Wales Defence Innovation Network have autonomy as a focus area.
STELaRLab is a partnership between the University of Melbourne and the controversial international arms company Lockheed Martin, which has been accused of complicity in war crimes.
A core area of STELaRLab’s work is autonomy and robotics research and development. The Australian student activist group Lockout Lockheed Martin has protested against the partnership and against university collaboration in weapons production.
Given the nature of autonomy-related technologies, it’s hard for university students to know the end use of their projects or research when these are in collaboration with arms companies or defence.
Some students who are concerned about these ethical risks will turn down opportunities to work on projects connected to defence, while others lack awareness and ethics education in fields such as computer science.
Australian companies on the frontline
Large arms manufacturers, as well as smaller Australian arms companies, undertake collaborative projects supported by Trusted Autonomous Systems, many pushing the autonomy envelope.
DefendTex, creator of the Drone 40 loitering munition being supplied to Ukraine, is developing a range of capabilities, including swarming technology. In a swarm, numerous weapons are deployed as a connected group, moving en masse as they find targets. Without limits on their geographic area and duration of operation, swarms would be difficult for operators to control adequately, increasing risks to civilians and intensifying the pace of warfare.
Cyborg Dynamics and Skyborne Technologies are two Australian arms companies based in Queensland developing weapons which skirt moral and ethical red lines, in the absence of any specific limits on autonomy.
Skyborne Technologies is developing the Cerberus GLH, an autonomous drone carried in a backpack and equipped with a rapid-fire, multi-shot grenade launcher. Cyborg Dynamics is co-developing the Warfighter Unmanned Ground Vehicle, also armed with various munitions, with Australian robotics company BIA5. Both companies have exhibited these weapons at Australian and international arms fairs, and will do so at an upcoming US convention for military and industry.
To avoid disastrous consequences, particularly for civilians, these weapons must remain under the control of human operators who can understand and evaluate the environment, and must not be unleashed in conflict zones, which are increasingly urban areas. Human control is most crucial in decisions over targeting and whether or not to attack.
The two companies also share a collaborative venture, Athena AI, which focuses on the development of an AI-enabled targeting system to track, identify and select targets. These targeting capabilities can be integrated into other weapons and are a dangerous step towards fully autonomous weapons.
Athena AI capabilities are being used by Red Cat, which makes drones for the US that are used to protect military bases and for border control.
There are risks in exporting systems, components or software for use by other companies or countries, especially as they may be used or adapted in new ways that are not lawful. It has not yet been clearly delineated which uses of autonomy in weapons are legally and morally acceptable.
When this concern was put to the Anthony Albanese government in Questions on Notice in April, the written response on behalf of the Minister for Defence, Richard Marles, avoided confirming whether weapons from Australian arms companies or from Trusted Autonomous Systems projects were being exported, asserting that: “There is no widely agreed definition of autonomous weapons, and Defence exports a range of goods and technologies including for training and operations”.
There are some significant projects in which Australian defence has partnered with large arms companies. Recently, the Royal Australian Navy agreed a partnership on autonomous submarines with Anduril, a key collaborator with the US military that also provides autonomous weaponry to Ukraine.
Anduril founder, Palmer Luckey, made news for developing a VR headset that would kill its wearer if they died in a video game. Anduril expanded to Australia last year, with the submarines central to that venture.
“Ghost Bat” is an autonomous aircraft developed in a flagship project by Boeing Australia with the Royal Australian Air Force. Boeing is a multinational aerospace company that works in civil areas, such as commercial aircraft and communications satellites, as well as in defence. The company sells equipment to repressive states, such as Saudi Arabia.
The Ghost Bat project has facilitated the establishment of a Boeing manufacturing facility near Toowoomba in Queensland, as part of the Wellcamp Aerospace and Defence Precinct. It is Boeing’s first aircraft manufacturing facility outside North America. The aerospace hub currently centres on the Ghost Bat contract with the Australian government.
When the hub was announced in 2021, Queensland Treasurer Cameron Dick said: “Our vision for this precinct is to be the epicentre, globally, of aerospace and defence development, advanced manufacturing, research and development and education.”
Autonomy is integral to innovation in aerospace, in both defence and civil domains, but there is currently an absence of regulation in these areas. To innovate responsibly, clearer guardrails are needed from government and within the private sector.
AUKUS embraces AI
Innovation in autonomy and investment in AI for defence are shared by Australia’s allies, especially the US and Britain. Cooperation between these countries on autonomous capabilities is set to increase through the AUKUS security alliance.
The alliance was announced as fostering cooperation for regional security between the partners, but it has received criticism in Australia from analysts, academics, past government leaders, including former prime ministers Paul Keating and Malcolm Turnbull, and the public, including the Australian Anti-AUKUS Coalition.
AUKUS’s next phase, or “second pillar”, focuses on the sharing of “advanced capabilities” technology, a major aspect of which is AI. Cooperation on advanced capabilities is intended to increase security and the ability of the three partners and their defence forces to work together. The pillar was recently showcased in a joint “autonomy trial” hosted by Britain and involving all three militaries. This is just the beginning of collaboration on AI-enabled capabilities through the alliance.
Recently, the Australian government announced a new scheme, the Advanced Strategic Capabilities Accelerator, with autonomy as a priority area. This was in response to the recent Defence Strategic Review’s findings and to AUKUS’s second pillar.
The Defence Strategic Review contained only one other reference to autonomy, a general mention in relation to air capabilities and, specifically, Ghost Bat. Given the extensive landscape of development in autonomy for defence, such limited reference seems unusual when autonomy is repeatedly articulated as a priority.
By contrast, the 2020 Defence Strategic Update outlined how “emerging technology will be rapidly utilised and incorporated into the new strategic framework, with autonomous weapon systems and long-range weapons being increasingly developed, researched and tested”.
The last few years have illustrated the pursuit of this goal. Deputy Prime Minister and Defence Minister Richard Marles, commenting recently on the Advanced Strategic Capabilities Accelerator, said: “Australia must invest in the transition to new and innovative technologies for our Defence Force.”
Autonomy is seen as central to these goals.
Question of ethics
Some of the ethical concerns raised by development in this area have been recognised in a paper on AI ethics in defence, commissioned in 2021 by the Australian Department of Defence.
It proposed three tools: an Ethical AI for Defence Checklist, an Ethical AI Risk Matrix, and a Legal and Ethical Assurance Program Plan. These, however, do not reflect current defence policy.
The government uses a framework called the “System of Control” for the development of all weapons. This framework contains no specific considerations for autonomous capabilities in a weapon’s design or operation. Policy is lacking on limits to how autonomy is used in weapons and on the degree of human control required, in particular over the “critical functions” of selecting targets and deciding to attack.
Without the establishment of clear policy, development is unfettered. The legal, ethical and security risks are not being adequately addressed.
Leading Australian AI experts called on the government in 2017 to support a ban on lethal autonomous weapons. An open letter from global AI and robotics researchers and companies attracted high-profile endorsements, including from Elon Musk and American tech entrepreneur Steve Wozniak.
Recently, the Australian Human Rights Commissioner urged the prohibition of lethal autonomous weapons. Australia is yet to heed such calls, which are echoed globally.
In response to the many legal, ethical, security and humanitarian concerns raised by autonomous weapons, the international community has called for new international law to be established. This includes the United Nations Secretary-General, the International Committee of the Red Cross, the tech sector, AI experts and the Stop Killer Robots campaign.
The UN Secretary-General’s New Agenda for Peace calls for negotiations on a new legally binding instrument to address autonomous weapons to conclude by 2026.
A legally binding international instrument on autonomous weapons would establish specific prohibitions and other obligations. This may include prohibitions on weapons that select and apply force to targets without human control, or obligations regarding the duration of time and geographical space where a weapon with autonomous capabilities is used.
These regulations would seek to address the legal challenges of accountability and international humanitarian law, such as ensuring distinction between combatants and civilians and the proportionality of an attack. Such judgments require inherently human evaluation and cannot be made by a machine.
Such an instrument would also establish a strong precedent for responding to ethical concerns, notably the delegation of life-and-death decisions to machines and digital dehumanisation.
Digital dehumanisation is the process whereby humans are reduced to data, which is then used in automated decisions that may have negative effects; autonomous weapons that decide to attack and kill illustrate the most acute harms. Such regulation would also mitigate security risks, including the acceleration and intensification of conflict due to the potential pace and scale of these weapons, or due to machine error.
International talks have sought to address autonomous weapons for almost a decade. Since 2014, dedicated diplomatic meetings have been held each year at the United Nations in Geneva under the framework of the Convention on Certain Conventional Weapons (CCW).
To date, more than 90 countries have called for new international law to be established. However, countries leading in AI development for their military — especially the US, Russia, Israel and India — have opposed any regulation.
Due to consensus rules, this opposition has prevented the process from advancing towards any kind of concrete action. The stymied diplomatic meetings mirror past international disarmament processes, such as on landmines, where international law was eventually established in line with global momentum.
Australia, along with the US and Britain, rejects the need for new international law on autonomous weapons. These governments have often acted as a group at the diplomatic meetings, along with Canada, Japan and South Korea, offering proposals which disregard ethical concerns and obfuscate human control.
Since the Albanese Labor government was elected in May last year, Australia has started to participate more constructively, engaging with ideas presented by other countries. However, it insists that any measures must not take the form of legal obligations.
Australia’s position has previously diverged from that of the US, most notably on landmines: Australia joined global efforts to establish a new treaty and signed the Mine Ban Treaty in 1997, while the US did not.
Australia is currently out of step with global progress in favour of new international law to address autonomous weapons. But momentum is building towards a new legally binding instrument on autonomous weapons.
Last year, at the United Nations General Assembly, a multilateral joint statement on autonomous weapons was delivered. This was the first such engagement outside the continually stagnant CCW meetings.
This year a number of regional conferences have been organised, hosted by countries including the Netherlands, Luxembourg and, for the Latin America and Caribbean region, Costa Rica. These were in addition to the continuing but still stymied CCW meetings.
The United Nations General Assembly session is also approaching, and it is likely many countries will use the opportunity to take further action on autonomous weapons, such as advancing a resolution.
Policy and international law typically trail behind the advent of new technology. But decisions are always required around what can and should be pursued for humanity’s betterment. Autonomous weapons are no different.
[Matilda Byrne is the National Coordinator of the Australian Stop Killer Robots campaign. This article was first published by Declassified Australia and has been reprinted with permission.]