Algorithms Learn Our Workplace Biases. Can They Help Us Unlearn Them?
— Iris Bohnet, a behavioral economist and professor at the Harvard Kennedy School
In 2014, engineers at Amazon began work on an artificially intelligent hiring tool they hoped would change hiring for good — and for the better. The tool would bypass the messy biases and errors of human hiring managers by reviewing résumé data, ranking applicants and identifying top talent.
Instead, the machine simply learned to make the kind of mistakes its creators wanted to avoid.
The tool’s algorithm was trained on data from Amazon’s hires over the prior decade — and since most of the hires had been men, the machine learned that men were preferable. It prioritized aggressive language like “execute,” which men use in their CVs more often than women, and downgraded the names of all-women’s colleges. (The specific schools have never been made public.) It didn’t choose better candidates; it just detected and absorbed human biases in hiring decisions with alarming speed. Amazon quietly scrapped the project.
Amazon’s hiring tool is a good example of how artificial intelligence — in the workplace or anywhere else — is only as smart as the input it gets. If sexism or other biases are present in the data, machines will learn and replicate them on a faster, bigger scale than humans could do alone.
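To make that failure mode concrete, here is a minimal sketch in Python. The data, feature names and model are invented for illustration (this is not Amazon's system), but they show the core problem: a model fit to biased hiring decisions learns the bias as if it were signal.

```python
# A minimal sketch, not Amazon's actual system: all data and feature
# names here are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Hypothetical résumé features: genuine skill, plus a proxy for gender
# (say, "attended a women's college").
skill = rng.normal(size=n)
womens_college = rng.integers(0, 2, size=n)

# Historical labels: past managers hired partly on skill, but also
# systematically passed over candidates with the proxy feature.
hired = (skill - 1.5 * womens_college + rng.normal(size=n) > 0).astype(int)

X = np.column_stack([skill, womens_college])
model = LogisticRegression().fit(X, hired)

# The learned weight on the proxy feature comes out strongly negative:
# the model has absorbed the historical bias and will now apply it,
# consistently and at scale, to every future applicant.
print(dict(zip(["skill", "womens_college"], model.coef_[0].round(2))))
```

Nothing in the training step asks the model to discriminate; the bias rides in entirely on the labels.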
On the flip side, if A.I. can identify the subtle decisions that end up excluding people from employment, it can also spot those that lead to more diverse and inclusive workplaces.
Humu Inc., a start-up based in Mountain View, Calif., is betting that, with the help of intelligent machines, humans can be nudged to make choices that make workplaces fairer for everyone, and make all workers happier as a result.
A nudge, as popularized by Richard Thaler, a Nobel-winning behavioral economist, and Cass Sunstein, a Harvard Law professor, is a subtle design choice that changes people’s behavior in a predictable way, without taking away their right to choose.
Laszlo Bock, one of Humu’s three founders and Google’s former H.R. chief, was an enthusiastic nudge advocate at Google, where behavioral economics — essentially, the study of the social, psychological and cultural factors that influence people’s economic choices — informed much of daily life.
Nudges showed up everywhere, like in the promotions process (women were more likely to self-promote after a companywide email pointed out a dearth of female nominees) and in healthy-eating initiatives in the company’s cafeterias (placing a snack table 17 feet away from a coffee machine instead of 6.5 feet, it turns out, reduces coffee-break snacking by 23 percent for men and 17 percent for women).
Humu uses artificial intelligence to analyze its clients’ employee satisfaction, company culture, demographics, turnover and other factors, while its signature product, the “nudge engine,” sends personalized emails to employees suggesting small behavioral changes (those are the nudges) that address identified problems.
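Humu has not disclosed how its engine actually works, so any concrete rendering is guesswork. Purely as a hypothetical sketch, though, the basic shape of such a system (survey signals in, role-aware suggestions out) might look something like this; every name, threshold and rule below is invented.

```python
# A purely hypothetical sketch of a "nudge engine" loop. Humu has not
# disclosed its methods; every name, threshold and rule here is invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Employee:
    name: str
    role: str                # "manager" or "report"
    inclusion_score: float   # e.g. from a pulse survey, scaled 0 to 1

def pick_nudge(e: Employee) -> Optional[str]:
    """Map a survey signal to a small, role-appropriate suggestion."""
    if e.inclusion_score >= 0.6:
        return None  # no nudge needed
    if e.role == "manager":
        return "In your next meeting, ask a quieter colleague for their view."
    return "Try sharing one idea out loud in your next team meeting."

# Managers and reports receive different emails, but the suggestions
# are coordinated toward the same goal: a more inclusive meeting.
team = [
    Employee("Ana", "manager", 0.45),
    Employee("Ben", "report", 0.40),
    Employee("Chloe", "report", 0.80),
]

for e in team:
    nudge = pick_nudge(e)
    if nudge:
        print(f"To {e.name}: {nudge}")
```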
One key focus of the nudge engine is diversity and inclusion. Employees at inclusive organizations tend to be more engaged. Engaged employees are happier, and happier employees are more productive and a lot more likely to stay.
With Humu, if data shows that employees aren’t satisfied with an organization’s inclusivity, for example, the engine might prompt a manager to solicit the input of a quieter colleague, while nudging a lower-level employee to speak up during a meeting. The emails are tailored to their recipients, but are coordinated so that the entire organization is gently guided toward the same goal.
Unlike Amazon’s hiring algorithm, the nudge engine isn’t supposed to replace human decision-making. It just suggests alternatives, often so subtly that employees don’t even realize they’re changing their behavior.
Jessie Wisdom, another Humu founder and former Google staff member who has a doctorate in behavioral decision research, said sometimes she would hear from people saying, “Oh, this is obvious, you didn’t need to tell me that.”
Even when people may not feel the nudges are helping them, she said, data would show “that things have gotten better. It’s interesting to see how people perceive what is actually useful, and what the data actually bears out.”
In part that’s because the nudge “doesn’t focus on changing minds,” said Iris Bohnet, a behavioral economist and professor at the Harvard Kennedy School. “It focuses on the system.” The behavior is what matters, and the outcome is the same regardless of the reason people give themselves for doing the behavior in the first place.
Of course, the very idea of shaping behavior at work is tricky, because workplace behaviors can be perceived differently based on who is doing them.
Take, for example, the suggestion that one should speak up in a meeting. Research from Victoria Brescoll at the Yale School of Management found that people rated male executives who spoke up often in meetings as more competent than peers; the inverse was true for female executives. At the same time, research from Robert Livingston at Northwestern’s Kellogg School of Management found that for Black American executives, the penalties were reversed: Black female leaders were not penalized for assertive workplace behaviors, but Black male executives were.
An algorithm that generates one-size-fits-all fixes isn’t helpful. One that takes into account the nuanced web of relationships and factors in workplace success, on the other hand, could be very useful.
So how do you keep an intelligent machine from absorbing human biases? Humu won’t divulge any specifics — that’s “our secret sauce,” Wisdom said.
It’s also the challenge of any organization attempting to nudge itself, bit by bit, toward something that looks like equity.
Correction: March 10, 2020
An earlier version of this article misspelled the name of a professor. He is Robert Livingston, not Livingstone. It also misspelled the name of a Nobel-winning behavioral economist. He is Richard Thaler, not Thayer.