Democratic senator and U.S. presidential candidate Elizabeth Warren on Wednesday chided Apple Card financial partner Goldman Sachs over the firm's response to claims that its algorithms introduce a credit limit bias against women.

Further, Sen. Ron Wyden (D-Ore.), ranking member of the Senate Finance Committee, on Wednesday said he is investigating the veracity of recent Apple Card bias claims.

Last week, Goldman and Apple came under fire for what appeared to be a gender bias in the Apple Card application process. Customers began to cry foul after entrepreneur David Heinemeier Hansson blasted the companies in a series of tweets after being granted a credit limit some 20 times higher than that of his wife.

Goldman, facing an announced investigation from the New York Department of Financial Services, issued multiple responses early this week refuting the claims of Hansson and others in no uncertain terms.

"We have not and never will make decisions based on factors like gender," Goldman Sachs Bank CEO Carey Halio said in statement on Monday. "In fact, we do not know your gender or marital status during the Apple Card application process."

Halio went on to tell customers slighted by the process that they could ask for a reevaluation, which the bank would fulfill "based on additional information that we request."

Goldman's response plan is inadequate, Warren said in an interview with Bloomberg.

"Yeah, great. So let's just tell every woman in America, You might have been discriminated against, on an unknown algorithm, it's on you to telephone Goldman Sachs and tell them to straighten it out,'" Warren said. "Sorry guys, that's not how it works."

Consumer advocates and policymakers like Warren have voiced concern over computer algorithms, some of which might produce biased results even when data points like race and gender are excluded, Bloomberg reported.
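The mechanism behind that concern, often called proxy discrimination, can be shown with a toy sketch: a model that never sees gender can still reproduce a gender gap if it relies on a feature that correlates with gender. Every detail below (the proxy feature, the limits, the 80% correlation) is invented purely for illustration and has no connection to Goldman's actual model.

```python
import random

random.seed(0)

# Toy data: gender is never given to the "model," but a hypothetical
# proxy feature (say, a spending-category flag) correlates with it
# 80% of the time.
applicants = []
for _ in range(10_000):
    gender = random.choice(["F", "M"])
    proxy = (1 if gender == "M" else 0) if random.random() < 0.8 \
            else (0 if gender == "M" else 1)
    applicants.append((gender, proxy))

# A "gender-blind" rule that looks only at the proxy feature,
# mimicking a model trained on historically skewed outcomes.
def credit_limit(proxy):
    return 20_000 if proxy == 1 else 5_000

# Average the resulting limits by gender to reveal the gap.
avg = {"F": [], "M": []}
for gender, proxy in applicants:
    avg[gender].append(credit_limit(proxy))

for g in ("F", "M"):
    print(g, sum(avg[g]) / len(avg[g]))
```

Even though the rule never consults gender, the average limit for the group correlated with the favored proxy value comes out several times higher, which is the pattern Warren and other critics say gender-blindness alone cannot rule out.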

Warren called on Goldman to produce information about how its algorithm was designed and how it impacts applicants. If it is unable to do so, she added, the bank needs "to pull it down."

"We're all beginning to understand better that algorithms are only as good as the data that gets packed into them," Warren said. "And if a lot of discriminatory data gets packed in, in other words, if that's how the world works, and the algorithm is doing nothing but sucking out information about how the world works, then the discrimination is perpetuated."