Moral Machines

From intelligent algorithms to driverless cars and autonomous drones

 

Intelligent and autonomous systems are increasingly enhancing and redefining our lives. The applications for these technologies are vast and breathtaking, from self-driving cars and the use of ‘drones’ in disaster and humanitarian efforts to intelligent algorithms as ‘weapons’ against terrorism.

But the rapid development of ever-more-intelligent machines is as controversial as it is exhilarating. The greater a machine's autonomy, the more moral judgement its behaviour demands, and the more complex the ethical challenges it raises.

So what does the future hold? Is it correct to fear an impending “intelligence explosion”, one that, in the words of tech giants Bill Gates and Elon Musk, is “humanity’s biggest existential threat” and has the potential to “destroy us all”?

Integrity 20’16


TUES 25 OCT

“If we don’t get a ban in place, the end point is clear to my colleagues and me: there will be an arms race and it will look much like the dystopian future painted by Hollywood movies like the Terminator series.

The technology will undoubtedly fall into the hands of terrorists and rogue nations. These people will have no qualms about removing any safeguards in place on its use. Or using it against us.

Unfortunately, we won’t simply have robots fight robots. Wars today are asymmetric and it will be robots against humans. And many of those humans will be innocent civilians.

This is a terrifying prospect.”

Toby Walsh is one of the world’s leading researchers in Artificial Intelligence. He was recently named in the inaugural Knowledge Nation 100, the one hundred rockstars of Australia’s digital revolution.

In 2015, Toby helped draft and was one of the initial signatories of an Open Letter calling for a ban on offensive autonomous weapons. The letter was also signed by Stephen Hawking, Elon Musk and Steve Wozniak. In total, the letter now has over 20,000 signatures and has pushed this issue into the world’s spotlight.

Professor Toby Walsh

Phil Swinsburg is a 24-year veteran of the Australian Army, involved in the creation and deployment of Unmanned Aerial Systems in Iraq and Afghanistan. He is managing director of Unmanned Systems Australia, specialising in the employment of autonomous systems. Phil (and Unmanned Systems) has been at the forefront of civilian autonomous systems development, and was recently involved with Google [X] Project Wing and the commercial delivery of parcels by drones.

Phil Swinsburg

Unmanned Systems Australia

“The history of innovation clearly shows that new technology, particularly disruptive technology, has a polarising effect on public opinion. It creates two camps: the optimists and the pessimists, the utopians and the dystopians. Much heat is generated as the conflicting narratives do battle for dominance.”

Dr David Tuffley is a Senior Lecturer in Applied Ethics and SocioTechnical Studies at Griffith University’s School of ICT. A regular contributor to mainstream media on the social impact of technology, David is a recognised expert in his field. Before academia, David worked as an IT Consultant in Australia and the United Kingdom, a role he continues to perform when not educating the next generation of IT professionals.

Dr David Tuffley

Senior Lecturer in Applied Ethics and SocioTechnical Studies, Griffith University

Can robots be ethical?

by Scott Stephens and Waleed Aly, The Minefield

“As robots are increasingly becoming part of modern life… What happens when these robots are faced with making, what we would call now, ethical decisions? Are robots capable of making ethical decisions?”

Scott Stephens is Editor of the ABC’s Religion and Ethics website, and specialist commentator on religion and ethics for ABC radio and television. He is also co-host (with Waleed Aly) of The Minefield on Radio National.

CHAIR: Scott Stephens (Australia)

Editor, Religion and Ethics online and Co-Host, The Minefield, ABC