Screengrab of the survey's submarine task
Part of the survey asks people to design an autonomous submarine to clear mines. Source: UNSW Canberra

There are many questions about the ethical aspects of autonomous systems in the military. Research by Dr Christine Boshuijzen-van Burken is providing answers. 

Ethics begin before design 

The core purpose of Dr Boshuijzen-van Burken’s research is to better understand the values that designers of autonomous technology should consider before they begin to design systems for Australia’s defence.  

Designing with an eye on ethical values will lead to safer, more cost-effective technologies that are far more likely to earn societal acceptance. 

“Some people believe the military is about killing people. I don’t think that’s what the military is about,” Dr Boshuijzen-van Burken said.  

“The military’s function is to promote justice, security and peace, and in some very rare circumstances this means that lethal force must be applied.” 

Values-sensitive autonomous military technology is not so much more humane as it is “less inhumane”.  

“The designs take into account legitimate targets and collateral damage,” she said.  

“Soldiers don’t want to work for an organisation that uses technologies that might be harmful to them or excessively harmful to others. We have to think about what society is concerned about, because that is also where the military draws its recruits.” 

The research project

The interactive survey gives participants a taste of designing ethical autonomous military technology. It also offers researchers a deeper and more personal level of insight into the values that Australians consider important. 

The survey asks respondents to design a drone for dropping bombs on targets, and an autonomous submarine for clearing mines. What makes it most effective as a research tool, Dr Boshuijzen-van Burken said, is that it is essentially a simplified design game, open to anybody and everybody. 

“The survey gives participants a design brief, then a series of design choices,” she said.  

“There’s a cost meter showing the impact of the choices they make. Each design choice will affect the meter. When the meter is in the red, the costs are too high and the participant has to trade off their design choices.” 
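The cost-meter mechanic can be pictured as a simple budget check: each design choice carries a cost, the meter tracks the running total, and once the total exceeds the budget the meter turns red and something has to be traded away. The short Python sketch below is purely illustrative; the option names, costs and budget are hypothetical assumptions, not details taken from the actual survey.

# Illustrative sketch only: the survey's real options and scoring are not
# described in the article, so the names, costs and budget here are hypothetical.

BUDGET = 100  # hypothetical cost ceiling; the meter goes "red" above this

# hypothetical design options for the mine-clearing submarine brief
CHOICES = {
    "high-resolution sonar": 40,
    "basic sonar": 15,
    "human-in-the-loop abort": 25,
    "fully autonomous disposal": 10,
    "encrypted comms link": 30,
}

def cost_meter(selected):
    """Sum the cost of the selected choices and report the meter state."""
    total = sum(CHOICES[name] for name in selected)
    state = "red" if total > BUDGET else "green"
    return total, state

# A participant picks a set of features, then trades some off if the meter is red.
design = ["high-resolution sonar", "human-in-the-loop abort",
          "encrypted comms link", "fully autonomous disposal"]
total, state = cost_meter(design)
print(f"cost={total}, meter={state}")  # 105 > 100 -> red, so a trade-off is needed

design.remove("high-resolution sonar")
design.append("basic sonar")
total, state = cost_meter(design)
print(f"cost={total}, meter={state}")  # 80 <= 100 -> green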

Deciding the ethics of autonomous systems 

Dr Boshuijzen-van Burken said she has a “triple interest” in engineering, philosophy and the military. She holds a PhD in ethics and philosophy of technology and has served as a reserve soldier in the Royal Netherlands Reserve Army since 2000.  

The research Dr Boshuijzen-van Burken is conducting aims to build a framework that will assist Defence in the ethical design, development and use of autonomous systems. 

With a strong focus on stakeholder engagement, the research considers much more than just the user. To explain her work, Dr Boshuijzen-van Burken uses the design of electric vehicles (EVs) as an example. 

“When you design a car with only the driver in mind, you miss several extremely important stakeholders,” she said. “Consider some of the not-so-obvious stakeholders, such as cyclists and pedestrians.” 

“Most autonomous and electric vehicles are silent. And this is what we wanted, right? But especially in countries where you have a lot of pedestrians and cyclists on the road, that makes them very dangerous, because you don’t hear them. And what does this mean for blind people?” 

Now, she said, laws have changed in some countries to require EVs to make sounds.  

“It means the cars are no longer being designed with only the driver in mind,” she said.  

“Those countries are no longer missing an important stakeholder, namely those that are around the car, on the street.” 

This is Dr Boshuijzen-van Burken’s goal for defence-based autonomous systems. And it’s vital to get it right, she said, because any technology designed without regard for its stakeholders can quickly become an object of fear, frustration and ridicule.  

Consider the fate of the consumer version of Google Glass, for example: the technology company didn’t take into account the product’s impact on those who felt their privacy would be violated. 

“With everything going on with artificial intelligence and autonomous systems at the moment, people are either extremely positive or they fear it,” Dr Boshuijzen-van Burken said.  

“I hope I can shed light on the topic by using these tools and ways of thinking about technology and applying them to the military. Thinking about the philosophy and ethics of technology is not new. Plato talked about technology.” 

At the back of Dr Boshuijzen-van Burken’s mind throughout the research process has been a particular framework called Reformational philosophy, in which ethics is ever-present. 

“International humanitarian law and the ethics of war state that mosques and churches are protected places in warfare. Likewise, sacred spaces are important to the Indigenous community. But how can an autonomous drone know the land it is flying over is sacred? Should it know?” she said. 

“There is a nice link between the philosophical framework I’m using and autonomous technology.” 

Dr Boshuijzen-van Burken plans to collect the views of a cross-section of Australian society via the survey.  

“I want to probe the average Australian on this. Everyone should take the survey – your niece, your grandmother, your uncle. I want all Australians to become part of this research.”

Anyone wishing to participate in the survey can visit https://defence.wevaluate.io/