
How to solve the problem of deepfake pornography? Code Dependent author Madhumita Murgia says regulation is key

'Code Dependent' author Madhumita Murgia explains that regulation will have to be a big part of how we respond to deepfake photos and videos globally. The responsibility, she adds, must be shared between governments and tech companies, since the latter hold much of the data, infrastructure and know-how around AI systems.

July 08, 2024 / 18:04 IST
Madhumita Murgia is AI editor at the Financial Times. Her first book, 'Code Dependent', looks at examples of AI's uses and harms around the world, in segments from healthcare to deepfake videos. (Image via Instagram/Madhumita Murgia)

"The entire promise of AI (artificial intelligence) is that it is going to be superior to humans, right? The point of building these systems is that we need to augment our own intelligence or that it is able to correct flaws in human decisions," Madhumita Murgia says over a video call. Her book 'Code Dependent' was shortlist for the Women's Prize for Non-fiction.

Murgia, the AI editor at the 'Financial Times', drew on her experience of reporting on tech for 'Code Dependent', her first book. In the video interview from London, where she is now based, Murgia spoke about the problem of deepfakes, why AI isn't inherently a force for good or evil, and how AI is as dependent on humans as we are on it. Edited excerpts:


Why is the book called Code Dependent?

It's a pun on co-dependency. We think a lot about how our lives are going to be changed by AI systems, how we are becoming more dependent on technology, whether it is from the social media era to now as it is getting more and more automated, as AI replaces human decision-making and human creativity, as we are seeing with generative AI.

But I have also found, in over a decade of reporting on this, that these systems have baked-in biases, baked-in perspectives on the world which are really designed by the people who build them (the AI systems), and often those people come from a very small enclave, a bubble, which is essentially California or San Francisco, in particular, Silicon Valley. So these AI systems only reflect the views of some of us. So just as we are dependent on AI, AI is also fully dependent on us in order to be trained, in order to be more broadly reflective. I wanted to show that through the examples and stories in the book. I also wanted it to be global, to show that the technology is ubiquitous and how it is affecting people all over the world.