Breaking research

How can we ensure Artificial Intelligence is not destructive to society?

New podcast now available featuring researcher and Chief Scientist at AIBrain, Steve Omohundro

Suppose you design a superintelligent machine with the goal of winning a game of chess. Sounds straightforward enough. But the machine develops its own ‘subgoals’ to make sure it doesn’t lose, including resisting being switched off: a machine that is turned off cannot win at chess. Suddenly you have a rogue AI on your hands that you can’t shut down.

This is just one example of the ethical considerations that need to accompany artificial intelligence (AI) research, and it is what the latest episode of the How Researchers Changed the World podcast explores with AI researcher Steve Omohundro.

“Should these systems be allowed to vote? Should they be full citizens? Should they be viewed as servants? Should they be viewed as slaves? Are they just machines?” – Steve Omohundro

In recent years there has been a surge of development and investment in artificial intelligence. Google bought DeepMind, then a small AI start-up, for £400 million. In 2016, DeepMind’s AlphaGo program beat a top human player at the incredibly complex board game Go, a feat many AI experts had predicted was still decades away.

Breakthroughs like these mean that ethical questions about AI need to be addressed now. The urgency is heightened by the current political environment, in which several countries are racing to lead the field. China, for instance, has declared its aim to be the world leader in AI by 2025. The pressure is on for researchers, who must ensure that these machines will not be destructive to society.

Listen to the latest episode of the How Researchers Changed the World podcast to find out more. The podcast is available on Apple Podcasts, Spotify, Stitcher and Android podcast providers – or head to

For more information, or to interview Steve Omohundro, please contact: