At the time, the Education Department had sponsored only a few randomized trials. One was a study of Upward Bound, a program that was thought to improve achievement among poor children. The study found it had no effect.

So Dr. Whitehurst brought in new people who had been trained in more rigorous fields, and invested in doctoral training programs to nurture a new generation of more scientific education researchers. He faced heated opposition from some people in schools of education, he said, but he prevailed.

The studies are far from easy to do.

“It is an order of magnitude more complicated to do clinical trials in education than in medicine,” said F. Joseph Merlino, president of the 21st Century Partnership for STEM Education, an independent nonprofit organization. “In education, a lot of what is effective depends on your goal and how you measure it.”

Then there is the problem of persuading schools to accept random assignment, either to use an experimental program or to serve as a control group.

“There is an art to doing it,” Mr. Merlino said. “We don’t usually go and say, ‘Do you want to be part of an experiment?’ We say, ‘This is an important study; we have things to offer you.’ ”

As the Education Department’s efforts got going over the past decade, a pattern became clear, said Robert Boruch, a professor of education and statistics at the University of Pennsylvania. Most programs that had been sold as effective had no good evidence behind them. And when rigorous studies were done, as many as 90 percent of programs that seemed promising in small, unscientific studies had no effect on achievement or actually made achievement scores worse.

For example, Michael Garet, the vice president of the American Institutes for Research, a behavioral and social science research group, led a study in which seventh-grade math teachers attended a summer institute meant to deepen their understanding of the math they teach, such as why, when dividing fractions, you invert and multiply.
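The invert-and-multiply rule the institute revisited has a short standard justification (a textbook derivation, not one drawn from the study itself): multiplying the top and bottom of the complex fraction by the reciprocal of the divisor turns the denominator into 1.

```latex
\frac{a/b}{c/d}
= \frac{\frac{a}{b}\cdot\frac{d}{c}}{\frac{c}{d}\cdot\frac{d}{c}}
= \frac{\frac{a}{b}\cdot\frac{d}{c}}{1}
= \frac{a}{b}\cdot\frac{d}{c}
```

Since dividing by $\tfrac{c}{d}$ is the same as multiplying by its reciprocal $\tfrac{d}{c}$, the familiar rule follows.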