Is AI Compatible With Civil Rights?

Amanda White is the Editor of top1000funds.com. She is also responsible for the content across all Conexus Financial’s institutional media and events.

The way the business community is approaching AI is fundamentally incompatible with civil rights, according to Vivienne Ming, co-founder and executive chair of Socos Labs, an independent think-tank that explores the future of human potential.

Ming, who has worked in machine learning for 20 years, “long before the heyday of AI in business came about”, said the fact that the infrastructure is controlled by so few people distorts the system.

“Such a small number of companies control the infrastructure behind these systems. They are working in the self-interest of a very small number of people,” she told delegates at the RFK Human Rights Compass Investor Conference.

“If not accessible to all of us, then how do you have civil rights? If I don’t have the mechanism to exert civil rights then I don’t have civil rights,” she said, citing as an example individuals’ inability to use their own AI to argue why they should be granted bail.

“This is not a technology problem, it is technologists dreaming of how good the world could be if everyone used their technology perfectly. I’m not looking at that, I’m looking at the behaviour of humans.”

Ming was part of a panel discussing the governance gap in “big tech” and the role of regulators and the private sector.

It was set in the context of recent comments by Apple chief executive Tim Cook at the Stanford graduation ceremony, where he said: “Too many seem to think that good intentions excuse away harmful outcomes”.

Manny Medina, chief executive of Cyxtera, said the central question was whether big tech should be regulated, “because there’s no turning back”.

“It is incredible the good that it can do. But this fourth revolution is a bit like the wild west and there is a need for the government. The Achilles’ heel of this revolution is security; it is very difficult. If you can’t secure it and have it be reliable, then it’s substantially more dangerous.”

But Ming had a different view of the role of government regulation.

“Despite being very progressive, I don’t think regulation is a solid solution to this problem. The space is fundamentally dynamic. It’s like a tree has been planted and the river will figure out how to flow about it,” she said. “Static approaches won’t create the deeper change we need. We need to look at the balance of power – there are a very small number of groups that hold all the power in this relationship; if you can shift that around then we’ll see something very different.”

“We need to empower consumers. From a civil rights perspective we need to ask what [data] you are willing to share and what you are not willing to share. I don’t think that regulation will solve it and I sure don’t think good intentions/ethics training of engineers will change things. For all the good ethics training has done in the business world! Ethics is different from making a decision in the moment when your path differs from what is good – that is different, and you don’t want it from a book.”

Ming’s suggestion for how to understand the problem, and how to solve it, is to look beyond what technology can do.

“The fundamental issue is with the people building the technology,” she said. “A lot of my work in economic equality shows it’s not about opportunity alone; it’s choice itself. If we think people will always make the best choice then we are not talking about human beings anymore.”