Science & Tech

The Ethics Team
At Microsoft, Sharon Lo ’16 ponders how products can harm society

By Jack Brook '19 / November–December 2020
Portrait of Sharon Lo ’16. Photo: Christine Yoo

Microsoft constantly develops new technologies, and Sharon Lo ’16 considers how those creations could wreak havoc on the world. As a member of Microsoft’s Ethics & Society organization, Lo actually gets paid by the company to do this.

“You’re coming to a team where they’ve worked on [a product] for months,” Lo says. “And we say, ‘Okay, let’s brainstorm all the bad things your technology can do to harm people.’” 

Lo serves as a product manager, helping developers confront the hard questions that arise when building novel technologies. After studying computer science at Brown and working for several years in a more conventional role at Microsoft, Lo wanted to dive into how the company’s technologies influenced society—and talked her way into a position within the company’s relatively new ethics branch.

This year, Lo spent considerable time thinking about Microsoft’s recently released Custom Voice, a way to design synthetic voices (or what the company calls “voice fonts”) for an array of purposes. Inevitably, the ability to customize a realistic digital voice led to philosophical challenges for Lo and the rest of her team. Among them: How could the tech spread misinformation? Would a world with digitally created voices undermine the authenticity of real human speech?

To approach these kinds of issues, Lo and her team rely on a “harms framework” they developed, influenced by the UN’s Universal Declaration of Human Rights and intended to build accountability into products. They use the framework in part to consider which societal groups are most likely to be negatively impacted by a given technology, whether directly or indirectly, and then they assess how to mitigate that harm. This process often includes focus groups. In the case of Custom Voice, Lo’s team spoke with voice actors.

The company’s public ethics principles also offer broad values-based guidelines, such as promoting fairness and transparency, for addressing potential harms. Yet the increasing adoption of such reforms across the tech industry has drawn scrutiny from critics who ask whether companies are merely using ethics as a public relations strategy.

Lo acknowledges that pushing for ethical decision-making in a private sector tech company is not always easy but argues that in the absence of clear and effective regulations, teams like hers are essential. She points out that new technologies arise quickly, raising specific ethical questions that require in-depth thought and attention and are hard to address meaningfully with the broad strokes of the law. Companies like Microsoft should treat ethics not as a matter of compliance but as a practice built into the process of innovation itself, she argues.

Microsoft decided to limit Custom Voice to approved and vetted companies, requiring them to verify that they have obtained written, informed consent from voice actors. The company also laid out guidelines on how much disclosure is needed when deploying a synthetic but realistic-sounding voice, so that listeners are not deceived.

But the vetting itself proves difficult. One company asked Microsoft if it could use Custom Voice to recreate the voices of the deceased. Lo’s team decided that given the absence of clear consent—how could the dead have known their voices could ever be regenerated?—the company should not be allowed to use the technology.

“Sometimes, I’m like, ‘Who am I to answer this question?’” Lo says. “But I’ve always been really interested in how we think about what’s right versus what’s wrong, and how we rationally build that into our principles and models.” 
