By: Dunstan Allison-Hope, Vice President, BSR
This blog is the second in a series of two about human rights assessments in the technology industry. The previous blog described challenges; this blog proposes solutions.
Previously, we published a blog post describing the challenges that arise when undertaking human rights assessments in the technology industry, such as scale, uncertainty, and the role of the user in shaping impact. In this blog post, we set out approaches to address those challenges—some that we are implementing already and some that would represent innovations for the field.
1. Human rights by design.
As we’ve previously proposed, a human rights by design approach would bring insights from a range of professional communities—business and human rights teams, product managers, research and design teams, and sales and marketing teams—to fully integrate human rights considerations into the design, development, and sale of new products, services, and technologies. It would enhance the product design process by ensuring that respect for human rights is deliberately integrated throughout and that more rights-respecting design choices can be made. Recently, Google took such an approach for its celebrity recognition product, adopting a variety of measures prior to launch—one of the best examples we’ve seen thus far of this approach in practice.
2. Human rights assessments as one part of a broader human rights due diligence framework.
It is easy to conflate human rights assessments and human rights due diligence as the same thing, but they are not. As the UN Guiding Principles on Business and Human Rights (UNGPs) clearly state, a human rights assessment is only one part of a human rights due diligence framework, which should also include integrating the results of assessments into decision-making, tracking the effectiveness of responses to assessments, and communicating how impacts are addressed.
These elements take on special significance in the technology industry, where human rights impacts can change over time as the real-world use of a product, service, or technology takes hold. Methods may include providing channels to report product misuse, pinpointing data trends that may signify a problem, and communicating revised thinking over time. The Facebook Oversight Board is an excellent example of a human rights assessment being one part of a broader due diligence framework. Beyond social media platforms, channels for identifying and reporting product misuse (for example, when a facial recognition system is leading to discriminatory outcomes) seem underdeveloped.
3. Sector-wide human rights assessments.
As we continue to address the human rights impacts arising from the technology industry, we’ve come to believe that one important constituency needs to participate much more actively: the “non-technology” companies integrating technology into their business operations, strategies, and plans. As we’ve previously written, dialogue about technology and human rights risks being too focused on the technology itself, with insufficient attention given to the companies deploying it. One solution we propose is the completion of sector-wide human rights impact assessments for the industries using technology—such as financial services, healthcare, transportation, retail, and law enforcement—to provide companies and policy makers with actionable recommendations on how human rights impacts arising from technology use can be avoided, prevented, and mitigated by entire value chains acting in collaboration. This would also help address the need for system-wide approaches, another of the challenges we raised in our previous blog post.
4. More engagement with vulnerable users.
The UNGPs make clear that human rights assessments should involve meaningful consultation with potentially affected groups and pay special attention to human rights impacts on individuals from groups or populations that may be at heightened risk of vulnerability or marginalization. In the technology industry, companies identify “personas” to represent the different types of people who might use a product, service, or technology, and design with their needs in mind. Human rights assessments would benefit from the more deliberate identification of personas from a much wider range of backgrounds, as well as engagement with real potential users to understand how their rights may be impacted. This would require more deliberate collaboration between human rights and product research teams.
5. Integrating futures methodology into human rights due diligence.
Futures thinking, also known as strategic foresight, provides a set of tools for companies to address rapid change, uncertainty, and complexity—the very same characteristics that make human rights assessments in the technology industry so challenging. At BSR, we are experimenting with the use of these tools in human rights assessments as a method of recognizing potential nefarious uses of technology that we might otherwise miss, identifying the human rights impacts associated with these cases, and putting in place measures to address them. Early pilots have been very promising, and companies we’ve worked with have found that uncovering blind spots and broadening horizons enable more informed decision-making and help prepare them for an uncertain future. We believe there is potential to use futures thinking much more than we do today and to broaden participation to stakeholders outside the company.
6. Assessments that inform industry standards and policy, legal, and regulatory frameworks.
Many of the challenges we described in our prior post cannot be addressed by responsible companies acting alone. Examples include situations in which less rights-respecting companies step in to provide a service that other companies refuse to offer, in which users disregard product terms of service, or in which human rights risks are system-wide in nature. In these cases, human rights assessments are clearly inadequate in isolation, and we need more comprehensive approaches to eliminate human rights violations. However, by systematically identifying adverse impacts, we believe that human rights assessments can provide excellent insights to inform the creation of standards, policies, and regulations. For this reason, we greatly welcome companies publishing human rights assessments as an input into a broader policy dialogue—the audience for human rights assessments should not be restricted to the company alone.
We hope these six ideas, each informed by real-life engagements with companies, help enhance discussions about how to improve the implementation of the UNGPs in the technology industry. However, these ideas also rely on companies across the whole industry—not just the leading companies—undertaking human rights assessments and implementing human rights due diligence frameworks, and for this reason, we’re also intrigued by increased calls for mandatory human rights due diligence. It is only when human rights due diligence is the norm and not the exception that these solutions will truly take hold.
Originally appeared on BSR.
Since 1992, Business for Social Responsibility (BSR) has been providing socially responsible business solutions to many of the world's leading corporations. Headquartered in San Francisco, with offices in Europe and China, BSR is a nonprofit business association that serves its 250 member companies and other Global 1000 enterprises. Through advisory services, convenings and research, BSR works with corporations and concerned stakeholders of all types to create a more just and sustainable global economy. For more information, visit www.bsr.org.