“Automated systems discriminate by default,” said Broussard, author of “Artificial Unintelligence: How Computers Misunderstand the World.” They are built using data from historically unequal systems and therefore spit out the same skewed outputs. Or as Senator Elizabeth Warren put it in an interview Thursday, “algorithms are only as good as the data that gets packed into them.”

In the case of Apple Card, although it’s not yet clear whether a sexist algorithm was at fault (the evidence so far is anecdotal), technology, so often heralded as a great equalizer, appears to have perpetuated unequal access to the banking system.

Until 1975, single, divorced or widowed women in the U.S. needed a man to co-sign their credit card applications. In some cases, women had to present a doctor’s note proving that they were using contraceptives before they could qualify for a mortgage loan (the logic was that if they didn’t have babies, they wouldn’t quit their jobs). And it wasn’t until 2013 that the Consumer Financial Protection Bureau urged banks and credit card issuers to consider shared incomes in order to make it easier for stay-at-home spouses to qualify for cards.

“We should just be really thoughtful about when we do and don’t use technology,” said Broussard. More inclusive development teams would also help catch and correct the kinds of blind spots that landed Apple in this situation in the first place, she added.

Apple raised the credit limit for Hansson’s wife, Jamie, after his tweet went viral.

“I felt the weight and guilt of my ridiculous privilege,” she wrote in a powerfully worded statement. “Justice for another rich white woman is not justice at all.”

Have you noticed designs or products that seem to exclude certain groups? Tell me about it by writing in or reaching out on Twitter.