Every company has a responsibility to discover the human rights implications of their work, according to experts speaking at an MIT conference Friday.
As much as social media platforms have been used for good, they have also been used for harm. Human rights experts speaking at an MIT Sloan Tech Conference Friday said that as technology continues to advance, people need to address the role it plays both in society and inside their organizations.
Panelist Nicole Karlebach, global head of business and human rights at Verizon, said the company follows the UN Guiding Principles Reporting Framework, which “asks us to think about the most significant human rights issues you’re most likely to incorporate into business issues.”
For Verizon, a global company with 100,000 employees, those are privacy, free expression and the right to be free from discrimination, Karlebach said. With that in mind, the company has established a set of key pillars: executive commitment, a team dedicated to human rights, grounding international law in its business practices, and engagement and accountability, she said.
“Each company has a responsibility to discover the human rights implications of their work,” said the other panelist, Hannah Darnton, associate director of ethics, technology and human rights at the global nonprofit organization Business for Social Responsibility.
They also need to identify what their business risks are and how they are impacting society, and then figure out how to address them, Darnton said. Some questions they should ask are: How are you designing platforms and services? Have you integrated stakeholder feedback? How are you selling your products, and to whom? Are there uses down the road that might adversely affect certain populations?
Hard questions that need to be asked
BSR works with companies to figure out what their risks are and how to effect change, which they should also be tracking, Darnton said. They also need to make those changes transparent, she added.
“We help companies go through that process and identify [and] mitigate the actions which are most salient for you,” she said. “You can create a great product that will advance good things—but also bad.” For example, Facebook helps communities connect and helps people find access to healthcare and education, “but it can also be used to propagate hate … dual use can be a problem and companies have to consider that,” Darnton said.
Another question: If you're a company like Salesforce, whose platform is used to manage clients and relationships, what happens when a company Salesforce sells to starts doing terrible things? she said. “Whose responsibility is it? Where does Salesforce plug in? That’s a big question we explore.”
Finally, there is a need for counterbalancing, Darnton noted. “If you try to protect one right, that may have a negative impact on another human right.” An example she cited is the use of encryption to protect people and enable them to communicate even under repressive governments.
“Encryption will protect communities and allow us to continue our mission. However [it] may make it difficult to identify child abuse online. So us as providers are protected, but they’re not.”
It is a delicate balance trying to take action without creating new problems, and the issues themselves are changing rapidly, Darnton said.
Programs should be independent of an individual’s viewpoints
Moderator Anat Biletzki, co-founding director of the human rights and technology program at MIT, said she was “intrigued” by the differences between working for a for-profit business like Verizon and working for a nonprofit organization. She asked Darnton and Karlebach to discuss how their different roles influence them.
“Embedding human rights within a company means setting up a program and process that sets up consistent decisions that can survive any individual sitting on a team or in a leadership position,” Karlebach said. “The goal of setting up these functions is to set something up that can be enduring,” so it can survive different viewpoints and personalities.
Her team at Verizon has “attempted to ground this so heavily,” she added.
Darnton said she comes from a “complementary place.” Being inside a company gives added insights and the chance to understand all the different dynamics, she said.
“We don’t always have insights into politics and dynamics and the inside scoop, if you will.” While BSR will “go really deep on how to implement human rights” and act as the “translator of where both parties are coming from” when helping to implement strategies, it depends on people inside to help them do so, she said.
Karlebach said her team has been engaged with BSR for over a decade and likes being able to test out ideas and share and gain perspectives as a member of its human rights group.
How to do your part
In response to a question from Biletzki about what colleges and universities should teach to educate people on using technology to further human rights, Karlebach advised that they seek out any courses offered, even if they have to audit them.
“Make a connection with a professor. It can round out your education,” she said. Once an individual is at a company, regardless of their role, “there is an opportunity to partner with teams or individuals doing this work” internally, Karlebach said. “Even if HR or AI or ethics isn’t in your title, you still have a role to play in helping your organization show up responsibly.”
Darnton agreed, saying “we see this as a muscle people need to learn to flex. They don’t necessarily encounter human rights issues and they don’t know what they look like.” Echoing Karlebach, Darnton said it doesn’t matter what role a person has in their organization or where they are geographically based; they can still champion human rights issues.
“Champion ethics and human rights. You can take small, tiny actions,” she said. When it comes to technology, “You have to address both sides of the coin.”