Are You Enabling Surveillance Capitalism in Your Portfolio?

From the moment you wake up in the morning until you go to bed at night, companies of all sorts are looking to collect and monetize information about you, ranging from your whereabouts to your shopping behavior to your email habits.

Welcome to the world of surveillance capitalism. It’s a term every investor should know, because technologies that track consumers can also be deployed in ways that violate human rights and put people in harm’s way.

“To every one of these technologies, there’s a shadow side,” Emma Pullman, capital stewardship officer at the British Columbia Government and Service Employees’ Union, told attendees at the RFK Compass Investor Virtual Summit in October.

Summit attendees got an introduction to surveillance capitalism and some practical tips on how to monitor their portfolios for the human rights risks it poses. Shankar Narayan, a lawyer and advocate for accountability in technology, described surveillance capitalism as the growing practice, by both the public and private sectors, of collecting and sharing data for surveillance purposes without giving the people whose data is harvested any control over how it is used.

Regulators are largely absent, often allowing companies to make their own decisions about how to profit from the information they gather. “There’s really just a ‘Wild West’ space here,” said Narayan, co-founder of MIRA!, a new campaign aimed at promoting accountability in the tech world.

Artificial intelligence (AI) powered by companies’ large-scale data collections can “make important decisions about virtually every aspect of our lives,” Narayan explained. It can affect how we’re policed, whether we get a job, or how our performance is evaluated once we’re hired, he said.

Such AI-driven decisions often magnify and reinforce existing inequities in our society by assigning less favorable outcomes to people of color or enabling law enforcement to target certain segments of the population. That makes surveillance capitalism “a really big and important space for us to engage with,” Narayan said.

Pullman talked about how she responded after she realized that a technology sold by one of her union’s portfolio companies was being used in harmful ways. The company, Thomson Reuters, received over $70 million in contracts from U.S. Immigration and Customs Enforcement to provide data brokerage services that the agency used to target undocumented immigrants for detention and deportation. The company’s software amassed data from social media, arrest records, utility bills, license plate scans, and other sources. The technology raised concern because, according to reports by Thomson Reuters’ own news division and other news outlets, the Trump administration subjected undocumented immigrants to harsh treatment and abuse.

Pullman came across one of those reports and decided she needed to better understand the technology and its potential consequences. She said investors need to ask the right questions, such as: How is the technology being used? What is its impact on workers and communities, particularly communities of color?

Eventually, the British Columbia Government and Service Employees’ Union sponsored a shareholder proposal asking Thomson Reuters to investigate and report on potential human rights abuses enabled by its contracts with ICE. In June, the proposal won approval from 30 percent of the independent shareholders of Thomson Reuters.

Pullman emphasized the importance of building relationships with other stakeholders and speaking with communities affected by the technologies developed via surveillance capitalism. That kind of due diligence is a critical part of environmental, social, and governance (ESG) analysis.

“Investors and ESG teams really need to step up on this,” she said. “You need to ask whether you are enabling the harms of surveillance capitalism anywhere in your portfolio.”