27 May

People Can’t Be Trusted To Make Unbiased Hiring Decisions. So This Woman Created A Computer Program.
Laura Mather knows what hiring bias looks like. She’s a woman who has spent her career in technology, after all. Nearly a decade ago, she applied to work for Google’s risk management division. “I had the perfect experience for risk management,” she said, given that she had worked in that very department at eBay. But that didn’t necessarily seal the deal. “When the recruiter called to offer me the job, she said, ‘Hey, we’re offering you the job, but you need to know that [CEO] Larry Page…almost vetoed you because you didn’t go to an Ivy League school,’” she recounted. “Yet I had graduated 12 years before.”
But she also thinks she has a tool that can fix it. She has created Unitive, a technology company aimed at helping employers overcome unconscious hiring bias in their quest to increase diversity. It’s still in the pilot stage, although she says she hopes to announce some “big customers” soon.
The technology does a number of things, starting with the job listing itself. Companies, particularly those in technology, may describe the job and its requirements with coded language that excludes women and people of color. One study found that masculine words like “competitive” and “dominant” turn women off. So Mather’s technology gives the people writing these descriptions information and tools to make them more gender neutral.
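To make the idea concrete, here is a minimal sketch of how a coded-language check on a job listing could work. This is an illustration only, not Unitive’s actual implementation, and the word lists are short stand-ins for the much longer, research-derived lexicons a real tool would use.

```python
# Illustrative word lists (assumptions, not Unitive's actual lexicons).
MASCULINE_CODED = {"competitive", "dominant", "aggressive", "ninja", "rockstar"}
FEMININE_CODED = {"collaborative", "supportive", "committed", "interpersonal"}

def coded_words(listing_text):
    """Return (masculine, feminine) coded words found in a job listing."""
    # Normalize: lowercase each word and strip surrounding punctuation.
    words = {w.strip(".,;:!?").lower() for w in listing_text.split()}
    return sorted(words & MASCULINE_CODED), sorted(words & FEMININE_CODED)

masc, fem = coded_words(
    "We want a competitive, dominant engineer who is also collaborative."
)
# masc -> ['competitive', 'dominant'], fem -> ['collaborative']
```

A real system would go further, for example suggesting neutral replacements for each flagged word, but the core mechanism is this kind of lexicon match against the listing text.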
Its reach then extends into resume reviewing and interviewing. “When we hire people, we use the baseline that we have, which is ourselves,” Mather explained. And bias toward one particular thing in one part of the resume, such as whether or not a candidate graduated from an Ivy, can “contaminate,” as Mather puts it, the rest of the resume. So Unitive not only anonymizes the gender and race of an applicant, but compartmentalizes components like education, work history, and hobbies so a hiring manager only sees one piece at a time, without the influence of the others. “You can still be biased,” she noted, pointing out that there can still be preference given to graduates of Harvard or Yale. But that preference wouldn’t result in a higher rating on someone’s personal skills.
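The compartmentalized review described above can be sketched in a few lines. Again, this is a hedged illustration under stated assumptions, not Unitive’s code: the resume sections, the redacted identity fields, and the `rate` callback are all hypothetical names chosen for the example.

```python
def review_sections(resume, rate):
    """Rate each resume section in isolation; identity signals are withheld.

    Because the rate callback sees only one section at a time, a prestige
    signal in "education" cannot color the score given to "work_history".
    """
    hidden = {"name", "gender", "photo"}  # identity fields never shown
    ratings = {}
    for section, content in resume.items():
        if section in hidden:
            continue
        ratings[section] = rate(section, content)  # sees only this section
    return ratings

resume = {
    "name": "A. Candidate",
    "education": "State university, B.S. Computer Science",
    "work_history": "5 years in risk management",
    "hobbies": "Chess, trail running",
}
# A trivial stand-in rater; a real reviewer would score each section.
scores = review_sections(resume, lambda sec, text: len(text) > 0)
```

The design point is that bias can still operate within a section, as Mather concedes, but it cannot leak across sections, because no rater ever holds two sections at once.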
Then, during the interview itself, “We are constantly reminding people about what is most important to that job so that we can disrupt that pattern mismatch that is usually irrelevant,” she said. That can be done either through a computer system that gives regular prompts or simply through a printed-out template of questions that includes reminders of what the interviewer should really be focused on.
Mather’s journey to starting the company wasn’t just informed by her interview at Google. After selling a risk management startup she had founded, she thought about what to do next. “I was really dismayed with the lack of diversity in technology companies and startups,” she said. “Cyber security’s really bad, but it’s also in the federal government, in academia. There was no shortage of places that needed a better representation of diverse groups.”
But she’s found that, generally, the visible, grotesque bias of decades past is by and large not what women and people of color are up against today. “The problem is the bias that is now occurring is unconscious,” she said. “Fighting the behavior that’s caused by unconscious bias is a different fight than fighting the overt behavior of Mad Men days.” One way to fight it is through intensive training and discussion sessions with psychologists, but “that didn’t seem scalable,” she said. So she decided to use her background in technology to get at the problem.
“When I first looked at this problem, I tried to figure out where I could make the biggest difference,” she said. Many technology companies talk about a pipeline that doesn’t have enough women and people of color because they get discouraged from the field in school and early in their careers. But that wasn’t the solution for Mather. “Absolutely the pipeline can get better,” she said. “But that’s not going to fix the problem.”
Women earn over 40 percent of science and engineering degrees but hold just over a quarter of technology jobs, and among science and engineering grads, men are employed in science and technology jobs at twice the rate of women. Meanwhile, black students make up 4.5 percent of computer science or engineering graduates and Hispanics make up 6.5 percent, but they make up just 2 percent and 3 percent, respectively, of technology employees at Silicon Valley companies. “What I would hate to see happen is we spend a ton of money and resources and effort on the pipeline…and then when they get to the doors of these organizations, the unconscious bias in the hiring process means that they don’t have the same opportunity,” Mather said.
It’s also not enough for her to have executives who say they want to increase diversity. “You can’t just talk about it,” she said. “You definitely need leadership to be modeling the right behaviors and be reinforcing the culture change. But…it just isn’t enough.” She noted that when she worked for eBay, leadership talked about diversity but never made policy changes that were easy to implement. “My platform can actually change behaviors from the ground up such that the culture change will happen,” she said.
And she won’t just be focused on the hiring process. After all, many women and people of color can end up pushed out after they land jobs. Next on the agenda: analytics on the hiring process to determine how it went, then moving on to rooting out bias in performance reviews, talent development, and promotions.