Now, one company is reporting that algorithmic hiring can also improve diversity. Infor Talent Science provides software that helps companies hire by collecting behavioral information through a survey, then building a predictive model based on top performers. Clients then hire based on how candidates match up with those top performers. The company dug into data on 50,000 hires for its clients and found an average increase of 26 percent in African American and Hispanic hires across a variety of industries and jobs after deploying Infor's software.
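Infor has not published its model, but the matching step the company describes—scoring candidates by how closely their survey responses resemble those of top performers—can be sketched roughly like this. The trait names, weighting, and similarity measure below are illustrative assumptions, not Infor's actual method.

```python
# Illustrative sketch only: rank candidates by how similar their
# behavioral-survey responses are to the average profile of top performers.
# Trait names and the similarity measure are assumptions, not Infor's method.
import math

def profile_centroid(top_performers):
    """Average each survey trait across the top-performer group."""
    traits = top_performers[0].keys()
    n = len(top_performers)
    return {t: sum(p[t] for p in top_performers) / n for t in traits}

def match_score(candidate, centroid):
    """Similarity in (0, 1]: 1 / (1 + Euclidean distance to the centroid)."""
    dist = math.sqrt(sum((candidate[t] - centroid[t]) ** 2 for t in centroid))
    return 1 / (1 + dist)

# Hypothetical survey data on a 1-10 scale.
top = [
    {"persistence": 8, "sociability": 6, "attention": 9},
    {"persistence": 9, "sociability": 7, "attention": 8},
]
centroid = profile_centroid(top)
candidates = {
    "A": {"persistence": 8.5, "sociability": 6.5, "attention": 8.5},
    "B": {"persistence": 3, "sociability": 2, "attention": 4},
}
# Candidates most similar to the top-performer profile rank first.
ranked = sorted(candidates, key=lambda c: match_score(candidates[c], centroid),
                reverse=True)
```

The key property of a scheme like this is that race and ethnicity never appear as inputs; only the survey traits do.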

“What we've found is regardless of [the industry], whether it's restaurants, retail, call centers—it actually increases the diversity of the population,” says Jason Taylor, Infor’s chief scientist for human capital management. In Infor’s forthcoming report, the company found that using an algorithm to help with hiring increased its wholesale clients’ Hispanic hires by 31 percent. For its restaurant clients, African American hires increased by 60 percent.

“What a systematic process does is it knows no color, no race, no ethnicity,” says Taylor. “When [a hiring manager] doesn't know a person and they don't know what to look for, they basically hire people like themselves. It's ‘We have something in common,’ or ‘Oh, I like you,’ then it's ‘Okay you're hired.’ What this does is it provides them with an objective piece of information that shows the probability that they're going to be successful in the role. So it helps to qualify that pool.”

One caveat of Infor’s study is that its data includes only hires who disclosed their ethnic background. As with most surveys, checking the racial box is voluntary. Collecting racial data has long been tricky, as candidates often worry that it will result in discrimination. (The Census Bureau suffers from this problem too, and it is experimenting with new ways of collecting data about race and origin.) But it’s not clear that, in the end, minority candidates are undercounted: Others might believe that disclosing race will attract diversity-minded employers.

So will algorithms rid the hiring process of bias? Scholars warn that big data’s supposed objectivity can mask other biases built into the algorithms. Chelsea Barabas, a researcher at MIT’s Center for Civic Media, writes:

Decisions based on algorithms are being “used for everything from predicting behavior to denying opportunity” in a way that “can mask prejudices while maintaining a patina of scientific objectivity.” These concerns are echoed by other scholars such as Kate Crawford, who has made incisive arguments against the claim that big data doesn’t discriminate against social groups ... The peril of these algorithms is that they mask deep-seated biases behind the promise that the numbers “speak for themselves.”

There’s plenty of research on the reasons that diversity is good for the workplace: It increases productivity, enhances problem solving, and has even been shown to increase sales and improve profits. Whether workplace diversity is good seems to have been answered; how to attain it is the more baffling question.

Early results, at least, suggest that algorithmic hiring can help reduce bias, but an employer has to care about doing so. In other words, though Infor’s results are encouraging, what matters most is that companies are genuinely interested in increasing diversity in the workplace.
