If you were looking for a single statement to sum up the state of technology in the late stages of the COVID-19 pandemic, you’d be hard pressed to do better than this quote from the information hub ESG Investor:
“A highly-profitable, innovative and far-reaching sector operating in a morally grey area, practising behaviours that aren’t technically illegal, but are arguably unethical.”
So what, exactly, does ‘arguably unethical’ mean? For some, it is the way technology companies use their users’ data, where unclear or hidden ‘consent’ can open up all manner of possibilities for the use and sharing of personal information. For others, it is the unfettered use of technology platforms to spread hate. And for still others, it is the use of digital technologies to plan and execute disruptive activities in the physical world.
Big tech is well aware of these concerns: digital rights are squarely on the agenda of human rights-focused investors and activists in the tech space. Yet Google and Facebook, among others, have a reputation for dragging their feet when it comes to being transparent about how they address human rights concerns within their business models.
In a paper titled The ESG Imperative for Technology Companies, KPMG found that though 57% of technology CEOs acknowledge they “must look beyond purely financial growth to achieve long-term sustainable growth”, most identify climate change – not human rights – as the biggest sustainability risk to their businesses. The response from technology companies to the breadth of their ESG risks is at best inconsistent, and too often absent altogether. KPMG sums it up this way: “Heightened awareness and appreciation of ESG issues has not fully translated into business practices yet.” Set that realization against the fact that addressing human rights risks in business models is a key focus of the B-Tech Project (an initiative of the UN Office of the High Commissioner for Human Rights that works to implement the UN Guiding Principles on Business and Human Rights in the technology sector), and the disconnect is resounding.
The business model risks: we need inside-out solutions for inside-out problems
Ranking Digital Rights (RDR), which works to promote freedom of expression and privacy on the internet, noted in its most recent 2020 Corporate Accountability Index that none of the companies it ranks, “which collectively provide information and communications services to billions of people around the world,” came even close to earning a passing grade on RDR’s international human rights-based standards of transparency and accountability.
For responsible investors, this indictment is doubly troubling. Technology companies are prominent holdings in many responsible funds, largely owing to their favourable response to environmental risks. And, over the last year of COVID-19 in particular, those holdings have delivered exceptional returns. But what of the human rights-centred risks inherent in the business models of big tech? Unmanaged, these risks threaten company sustainability efforts by potentially undercutting the very principles of civil liberties, freedom and democracy from which technology companies benefit.
We are at a juncture. If we are to remain committed to seeing our investees in the technology space work towards sustainable value creation, it is incumbent on us as investors to hold tech players accountable on how they manage the very real human rights and digital rights risks they face. We must now grapple with the reality that these risks are central to – and deeply embedded in – the very business models that technology companies rely on for the success that has financially benefited investors. It is not acceptable to simply turn a blind eye to the negative social implications of this vast wealth creation.
With great influence comes great responsibility
The unique challenge and opportunity for investors is that we approach technological products and services from multiple vantage points at once. Because we are all users of technology, we are all also vulnerable to the very risks we are seeking to address – not just as investors, but as consumers, too. That means acknowledging the investment benefits of technology, and the positive uses and applications stemming from increased access to the internet and digital platforms, while at the same time understanding the price we’re paying for that access. We need only look to the past year to see how human rights risks are materializing, with real implications for society. The January 6 attack in the US, the #stophateforprofit campaign initiated against Facebook, concerns about the use of tech to facilitate mass surveillance, and the ethical issues around artificial intelligence are but a few examples.
The point is, notwithstanding the noble intentions upon which today’s leading technology companies were founded, or the undeniable social benefits of technology, the good does not negate the bad.
We are in a strong position to leverage our knowledge as users and our influence as investors to have conversations with tech players about mitigating the negative social impacts of their businesses. It is incumbent on us to push companies for the disclosure and transparency needed for all stakeholders to better understand if and how companies are managing these risks.
We believe an important starting point as investors is making clear our expectation of more robust human rights oversight mechanisms at technology and telecommunications companies. At NEI, we are advocating for robust human rights governance structures that can adapt to the ever-evolving state of play in the industry. We encourage other investors to consider what steps they can take to advance these dialogues, whether through solo engagements, group collaborations, coalitions like the Investor Alliance for Human Rights, proxy voting or filing shareholder proposals. Divestment may be an option if it enhances the leverage of remaining investors, but we at NEI are not ready to give up our seat at the table. We believe our voice, and the voices of as many investors as possible, need to be heard in the boardrooms of big tech. As investors we have a responsibility to hold technology companies accountable, and we should be leaning into our ability to push for needed transparency on human rights risks, which can no longer be seen as optional.
We can’t afford to sleep on this. The stakes are too high. The risks are too great. And the impacts too broad.