Artificial intelligence (AI) is rapidly permeating nearly every aspect of our lives, from the obvious apps and virtual assistants to more subtle deployments in fields like healthcare, education and scientific research. But who gets to decide how this powerful technology evolves and what principles should guide its development? According to Verity Harding, author of the new book AI Needs You: How We Can Change AI’s Future and Save Our Own, it can’t just be the technologists and companies building AI systems.
“I’m very keen that the people building the technology and who understand the technology are at the table in discussing what it would look like. But it can’t only be us,” Harding said in a recent Talks @ Google discussion. “There are life experiences that we don’t have.”
Harding believes AI’s increasing societal impact means everyone needs a voice in determining its trajectory, not just experts.
In her book, Harding examines historical examples like the dawn of IVF and embryology research, the space race and the early internet to extract lessons about governing transformative technologies in the public interest. A former adviser to the UK’s deputy prime minister and head of policy for AI company DeepMind, Harding advocates establishing “guardrails” through a democratic process:
“As technologists, I think we should embrace limited guardrails, because I think that does allow innovation to flourish,” said Harding. Recalling the early embryology debates, she added: “This is new. We don’t know the answer,’ they said. Very similar to AI, it also raises a lot of philosophical questions.”
But she stresses that specialized technical knowledge shouldn’t be a prerequisite for weighing in.
“You don’t need to understand deeply and technically exactly how AlphaFold or Gemini works to have a feeling that you…are really excited and want to see more AI in science because you think it will help make progress,” she said.
Harding believes AI developers want to hear a wide range of perspectives.
“I think when I was at DeepMind and at Google, we always wanted to know what people thought about the technology we were building,” she noted.
In the end, she sees genuine public participation as key to ensuring AI systems respect society’s values and ethical boundaries.
“Ultimately, the thing I really know about politics is that politicians are guided, to a large extent, by what their voters think. That’s what they’re there to do,” she said.
For Harding, it’s not about stopping or slowing AI’s progress, but democratically shaping it: “I want those experts around the table. I want those experts involved…But there are levels at which the debate will happen. And what I want people to be encouraged by and empowered by is that you don’t need to understand deeply and technically exactly how it all works to have an opinion.”
Featured image credit: Google