A man who feels wronged by his ex-wife thinks he can help ex-husbands everywhere with an artificially intelligent legal assistant that combs public court records to help clients file lawsuits and predicts what opposing counsel will do next.

He also gave the software a female avatar: a woman in a pencil skirt and heels whom he named Justine Falcon.

Brent Oster, founder and CEO of ORBAI, claims Justine helped him file a $400 million class action lawsuit against the District Attorney of Santa Clara County last week, on behalf of men in Santa Clara who claim they've been wrongfully prosecuted during divorce proceedings.

"I, BRENT OSTER, am a proud member of the class MEN, and I have not, and will not ever, harm anyone, especially someone who depends on me and I care about," Oster writes in the complaint. "I will NOT tolerate authorities and institutions who will falsely accuse me and prosecute me for such things."

Oster claims the complaint was compiled with the help of Justine, which automatically combed through the California Superior Court website (which holds records of all civil, criminal, and family law cases in California) for cases that are similar to the lead case in the lawsuit.

"The software then created a list of all those cases as well as a list of the top 20 DA attorneys that prosecuted most of them," ORBAI said in a press release. "This list of DA prosecutors and lists of their similar cases were used as an Exhibit in the class certification section of the lawsuit filing."

According to an ORBAI spokesperson, the system can also predict what a given lawyer will do next by processing an attorney's cases and finding patterns in their methods.

Oster has packaged all of this into what he calls a "cute, talking 3D animated character" named Justine Falcon, who has two sides to her personality. There's the front-end Justine Falcon, who looks like Elle Woods from Legally Blonde complete with a stereotypical argyle sweater, pencil skirt, oversized glasses, and tight ponytail, and who ORBAI's website claims can interview clients "like a human legal assistant would, to get all the facts about a case so she can condense them into a correctly formatted legal report."

Then there's the back-end Justice Falcon, who looks something like Lady Justice (without a blindfold and with a big sword) and who handles the "data mining, case analysis, and litigation planning tools," according to Oster.

Oster told Motherboard these personas are meant to give the system an approachable look.

"The simple attire, nerdy glasses and attractive female is about the best look for all of those needs that we could find," he said. "Cute is a must."

Oster claims that Justine's capabilities are fearsome, that it's capable of deploying an "information warfare package" and is "a weapons system that can defeat any team of human lawyers, always win in litigation, and when used to win strategic cases, move history."

"[In the judicial system] there are definitely certain biases and there are things that make sense and things that don't make sense," attorney Sofia Balile, who specializes in divorce law, told Motherboard. "There are things that are not equitable, but that won't be fixed with an AI paralegal. That's just the system and the people that are part of it."

"As soon as users assign gender to a machine, stereotypes follow," Stanford historian of science and gender Londa Schiebinger wrote in a 2019 whitepaper. "The danger is that gendering robots may reinforce gender inequalities by hardening current stereotypes."

A U.N. report published last year urged against making digital assistants female by default. "The more that culture teaches people to equate women with assistants, the more real women will be seen as assistants—and penalized for not being assistant-like," the researchers wrote in that report.

I asked Oster why he felt the need to make it "cute," or female, or to gender it at all—something many technologists and roboticists agree isn't useful in creating bots and risks further reinforcing gender stereotypes.

Oster said it needed to be "approachable" and "not too intimidating... she needs to look cute and unassuming to the public because she is a weapon, an assassin in litigation, and people will fear her."

If it had a male persona instead, he said, "people would have pitchforks and torches at the doors of ORBAI within the next eight months."

Joshua Browder, whose AI legal advice chatbot DoNotPay helps people draw up legal documents to take to small claims court and has no avatar, gender, or persona, disagreed. "I think it's inappropriate to gender the robot," he said. "These are really serious cases and it will just upset people."