A Practical Guide for Building Ethical Tech

"Techlash," the rising public animosity toward big tech companies and their impacts on society, will continue to define the state of the tech world in 2020. Government leaders, historically the stewards of protecting society from the impacts of new innovations, are becoming exasperated at the inability of traditional policymaking to keep up with the unprecedented speed and scale of technological change. In that governance vacuum, corporate leaders are recognizing a growing crisis of trust with the public. Rising consumer demands and employee activism require more aggressive self-regulation.

In response, some companies are creating new offices or executive positions, such as a chief ethics officer, focused on ensuring that ethical considerations are integrated across product development and deployment. Over the past year, the World Economic Forum has convened these new “ethics executives” from more than 40 technology companies around the world to discuss the shared challenges of implementing such a far-reaching and nebulous mandate. These executives are working through some of the most contentious issues in the public eye and exploring ways to drive cultural change within organizations that pride themselves on their willingness to “move fast and break things.”

From these discussions, we’ve distilled practical advice to help corporate leaders develop an effective approach to mitigating or preventing negative impacts from their products and to regain public trust.

While accountability for harmful products often lands at the executive level, the decisions that lead to those harms are usually made by engineers and developers on product teams. Look at the recent tech scandals, from discriminatory advertising to the proliferation of hate speech: most did not involve a pivotal moment when someone decided to proceed with a product despite knowing how it could be abused or misused. Rather, they typically emanated from unconscious design decisions that had unintended impacts.

It’s natural that most tech developers have a bias toward imagining the ways their products can benefit society. To counteract this, employees need tools that help them think beyond the most evident use cases; anticipate a range of harms, from bias and discrimination to tech addiction to the emboldening of extremists; and develop strategies to mitigate those outcomes. EthicalOS and DotEveryone offer toolkits that a number of ethics executives have successfully used to this end.

Identifying red flags is just the first step. There needs to be a process for ensuring that those red flags are raised to an appropriate level of seniority and adjudicated transparently and consistently. Some ethics executives experimented with creating a new, standalone process for these “ethics checkpoints,” but quickly found that it either unduly burdened the notoriously tight product development cycle or was ignored altogether.

What has proven more effective is piggybacking on processes that are already well ingrained in the product development roadmap, such as those created in recent years for cybersecurity, environmental sustainability, and accessibility. This allows straightforward concerns to be addressed quickly and more complex or sensitive ones to be escalated for deeper review. The more you can make an engineer’s life easier, the more likely your ethics review process is to succeed.
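To make that routing concrete, here is a minimal sketch in Python of how an ethics flag might be triaged inside an existing review workflow. The severity tiers, the EthicsFlag record, and the triage destinations are illustrative assumptions, not a description of any company’s actual process or tooling.

```python
from dataclasses import dataclass, field
from enum import Enum


class Severity(Enum):
    """Illustrative severity tiers for ethics red flags (assumed, not prescribed)."""
    LOW = 1     # straightforward: resolve within the product team
    MEDIUM = 2  # needs a second opinion from an embedded champion
    HIGH = 3    # complex or sensitive: escalate for deeper review


@dataclass
class EthicsFlag:
    """A hypothetical record attached to an existing review checklist item."""
    feature: str
    concern: str
    severity: Severity
    notes: list[str] = field(default_factory=list)


def triage(flag: EthicsFlag) -> str:
    """Route a flag the way existing security or accessibility checks are routed:
    quick concerns stay with the team, harder questions are escalated."""
    if flag.severity is Severity.LOW:
        flag.notes.append("Handled within the product team's normal review.")
        return "team"
    if flag.severity is Severity.MEDIUM:
        flag.notes.append("Referred to the team's embedded ethics champion.")
        return "champion"
    flag.notes.append("Escalated to central ethics review for deeper analysis.")
    return "ethics_review"


if __name__ == "__main__":
    flag = EthicsFlag(
        feature="ad targeting",
        concern="proxy attributes may enable discriminatory targeting",
        severity=Severity.HIGH,
    )
    print(triage(flag), flag.notes)
```

In practice, the same routing could live in whatever checklist or ticketing system already gates a release, which is precisely the point of piggybacking on existing processes rather than inventing a parallel one.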

As tempting as it may be to see a new “ethics office” as the panacea for a company’s problems, ethics executives quickly realized that they could not keep up with the demands for support from across the company, no matter how big their new department grew. Devoting your whole ethics team to a deep dive on a few controversial topics or complex new products for a few months is useful for honing a methodology at the outset, but that approach doesn’t scale when every product and feature needs attention and consideration.

Instead, companies like Microsoft are now finding success with training “ambassadors” or “champions” embedded within teams (usually a role they play in addition to their regular jobs) to bring heightened sensitivity to unintended impacts and to help their colleagues raise flags and escalate concerns. Empowering people within teams ensures that they have both the contextual intelligence and the pre-existing credibility needed to be trusted and effective.

Source: Wired
