When the investigative journalist Julia Angwin worked for ProPublica, the nonprofit news organization became known as "big tech's scariest watchdog."

By partnering with programmers and data scientists, Ms. Angwin pioneered the work of studying big tech's algorithms — the secret codes that have an enormous impact on everyday American life. Her findings shed light on how companies like Facebook were creating tools that could be used to promote racial bias, fraudulent schemes and extremist content.

Now, with a $20 million gift from the Craigslist founder Craig Newmark, she and her partner at ProPublica, the data journalist Jeff Larson, are starting The Markup, a news site dedicated to investigating technology and its effect on society. Sue Gardner, former head of the Wikimedia Foundation, which hosts Wikipedia, will be The Markup's executive director. Ms. Angwin and Mr. Larson said that they would hire two dozen journalists for its New York office and that stories would start going up on the website in early 2019. The group has also raised $2 million from the John S. and James L. Knight Foundation, and $1 million collectively from the Ford Foundation, the John D. and Catherine T. MacArthur Foundation, and the Ethics and Governance of Artificial Intelligence Initiative.

Ms. Angwin compares tech to canned food, an innovation that was embraced for decades before it drew real scrutiny.

"When canned food came out, it was amazing," said Ms. Angwin, who will be the site's editor in chief. "You could have peaches when they were out of season. There was a whole period of America where every recipe called for canned soup. People went crazy for canned food. And after 30 years, 40 years, people were like, 'Huh, wait.'

"That is what's happened with technology," Ms. Angwin said, calling the 2016 election a tipping point. "And I'm so glad we've woken up."

The site will explore three broad investigative categories: how profiling software discriminates against the poor and other vulnerable groups; internet health and infections like bots, scams and misinformation; and the awesome power of the tech companies. The Markup will release all its stories under a Creative Commons license so other organizations can republish them, as ProPublica does.

Ms. Angwin, who was part of a Wall Street Journal team that won a Pulitzer Prize in 2003 for coverage of corporate corruption, said the newsroom would be guided by the scientific method and each story would begin with a hypothesis. For example: Facebook is allowing racist housing ads. At ProPublica, Ms. Angwin's team bought ads on the site and proved the hypothesis.

At The Markup, journalists will be partnered with a programmer from a story's inception until its completion.

"To investigate technology, you need to understand technology," said Ms. Angwin, 47. "Just like I got an M.B.A. when I was a business reporter, I believe that technologists need to be involved from the very beginning of tech investigations."

Ms. Angwin has known Mr. Newmark since 1997, when she wrote about him while a reporter at The San Francisco Chronicle.

"Craig is ideal for us because he has no interest or temperament for trying to interfere in coverage," she said.

Mr. Newmark, who splits his time between San Francisco and New York, has for years kept a low profile. But he worries about what he sees as a lack of self-reflection among engineers.

"Sometimes it takes an engineer a while to understand that we need help, then we get that help, and then we do a lot better," Mr. Newmark said. "We need the help that only investigative reporting with good data science can provide."

Craigslist, which Mr. Newmark founded in the mid-1990s, helped to decimate print newspapers' main source of revenue at the time: classified advertising. Recently, he has given several substantial donations to journalistic institutions, including $20 million to the CUNY Graduate School of Journalism.

"We're in an information war now," Mr. Newmark said.

For many years, the outrageous success of Silicon Valley companies — and the aggressive public relations teams who worked for them — kept many journalists at a remove.

The societal effects of tech were hard to quantify, and moral responsibility was often sloughed off on something called an algorithm, which most people could not quite explain or examine, even when, as in the case of Facebook, it influenced around 2.5 billion people.

At ProPublica, Ms. Angwin and Mr. Larson subverted the traditional model of tech reporting altogether. They did not need access. With the right tools, they could study impact.

"There's an opportunity for more reporters to use statistics to uncover societal harms," said Mr. Larson, who has been doing data-driven journalism for a decade. "And then Julia's gift is she takes data journalism and doesn't make it like an academic report."

Some of Ms. Angwin and Mr. Larson's reporting tactics may violate tech platforms' terms-of-service agreements, which ban the automated collection of public information and prohibit the creation of temporary research accounts. Ms. Angwin has been a strong defender of these practices and has argued that tech companies ought to exempt reporters from those rules.

"Without violating those rules, journalists can't investigate our most important platform for public discourse," Ms. Angwin wrote in August.

The two worked together on investigations like one into criminal sentencing software, which took a year. Ms. Angwin would report and write. Mr. Larson would measure and analyze. In the end, they proved that the algorithm was racially biased.

Mr. Larson, who will be The Markup's managing editor, said the result was just as much a surprise to readers as it was to those who had made the biased algorithm.

"Increasingly, algorithms are used as shorthand for passing the buck," said Mr. Larson, 36. "We don't have enough people to look at parole decisions, so we're going to pass it on to the computer and the computer is going to decide, and once they go into production, there's no oversight."

The two also showed how big tech companies were helping extremist sites make money, how African-Americans were overcharged for car insurance, and how Facebook allowed political ads that were actually scams and malware.

"There are unintended consequences," Mr. Larson said. "In all three of those cases, it was a complete surprise to the people who made those algorithms as well."

Engineers being surprised by the tools they have made is, to The Markup's team, part of the problem.

"Part of the premise of The Markup is the level of understanding technology and its effects is very, very low, and we would all benefit from a broader understanding," Ms. Gardner said. "And I would include people who work for the companies."

Ms. Angwin said part of her goal was to help readers understand what exactly they should be worried about when it comes to tech.

"We're all a little uncertain," Ms. Angwin said. "The evidence isn't in. I want to be providing the evidence."

She hopes the stories they take on will lead to better government and corporate policies.

"We are a numbers-driven data society," Ms. Angwin said. "That's the price of entry these days for political change — a data set."

And in searching for that information, Ms. Angwin said, she is not worried about getting Facebook or Google to return her phone calls.

"I've never been on Google's or Facebook's campus and I imagine I'll never be invited," she said. "I'm kind of a dorky scientist just over here measuring stuff."

—Nellie Bowles covers tech and internet culture. Follow her on Twitter: @nelliebowles