So too are high schools widely thought to be “life-changing”—the elite ones that students must test into. In a 2014 Econometrica paper titled “The Elite Illusion,” the economists Atila Abdulkadiroğlu, Joshua Angrist, and Parag Pathak wrote that while students who attend extremely competitive public schools like Stuyvesant High School in New York City clearly excel, that may not mean the schools provide an education that’s superior to their less competitive counterparts. The researchers looked at a group of borderline kids, the last few eighth-graders who made the cut-off to go to an elite school and the first few who didn’t; that meant there was little if any academic difference between them when they started their freshman year. If a school like Stuyvesant were more effective—that is, taught more material and produced better outcomes—than the less competitive public school, the economists would expect to see a difference in how those kids performed academically four years later. But when the researchers analyzed indicators of success, such as AP exam scores and state standardized tests, they saw no difference between the borderline kids who got to attend Stuyvesant and the borderline ones who didn’t. And yet, said Pathak, a professor of microeconomics at MIT, “these are massively oversubscribed schools. People would give an arm and a leg to send their child to a school like Stuyvesant.”
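The borderline comparison above can be sketched in a few lines of code. Everything in this sketch is hypothetical—the cutoff score, the bandwidth, and the applicant data are fabricated for illustration and are not drawn from the study:

```python
# Illustrative sketch (not the authors' code): compare later outcomes for
# applicants who scored just above vs. just below an admissions cutoff.
# Because both groups were nearly identical at entry, any outcome gap is
# attributable to the school rather than to the students it selected.

CUTOFF = 560    # hypothetical admissions-exam cutoff score
BANDWIDTH = 10  # only consider applicants within 10 points of the cutoff

# (exam_score, later_outcome) pairs — fabricated for illustration
applicants = [
    (552, 71), (555, 73), (558, 72), (559, 74),   # just missed the cutoff
    (561, 73), (563, 72), (566, 75), (569, 74),   # just made the cutoff
]

def borderline_gap(applicants, cutoff, bandwidth):
    """Mean outcome difference: just-admitted minus just-rejected."""
    above = [y for x, y in applicants if cutoff <= x < cutoff + bandwidth]
    below = [y for x, y in applicants if cutoff - bandwidth <= x < cutoff]
    return sum(above) / len(above) - sum(below) / len(below)

gap = borderline_gap(applicants, CUTOFF, BANDWIDTH)
```

In the study's actual data, a gap near zero on AP exams and state tests is what led the authors to conclude that the elite schools themselves add little beyond the students they admit.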

That raises the question: Are parents able to figure out which schools are doing the best job? A new working paper—published by the National Bureau of Economic Research (NBER) and authored by Abdulkadiroğlu, Pathak, Jonathan Schellenberg, and Christopher Walters—draws on data from the New York City Department of Education, which enrolls around 90,000 ninth-graders every year at more than 400 high schools. For better or worse, the city’s high-school system doesn’t automatically send students to the school down the block; instead, eighth-graders submit a ranked list of up to 12 high schools they’d like to attend, some of which, like Stuyvesant, require a test. That system gave the researchers quite a bit of information about which schools parents and their children choose. Their task was to use other data about the students and schools to figure out what drives those choices.
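One simple way those ranked lists reveal preferences is through pairwise comparisons: count how often applicants rank one school above another. The sketch below, using made-up lists and school names, illustrates only the basic idea—the paper's actual estimation is far more sophisticated:

```python
# Hedged sketch: tally pairwise "wins" from ranked choice lists.
# wins[(A, B)] counts how many applicants ranked school A above school B.
# The lists below are fabricated; only Stuyvesant is a real school name
# mentioned in the article.

from itertools import combinations
from collections import Counter

# Each applicant's ranked list, favorite first (up to 12 schools in NYC).
ranked_lists = [
    ["Stuyvesant", "School X", "School Y"],
    ["School X", "Stuyvesant"],
    ["Stuyvesant", "School Y", "School X"],
]

wins = Counter()
for ranking in ranked_lists:
    # Every school earlier in the list beats every school later in it.
    for higher, lower in combinations(ranking, 2):
        wins[(higher, lower)] += 1
```

Aggregated over tens of thousands of applicants, tallies like these show which schools families favor; the researchers' task was then to ask whether those favored schools are also the ones that teach students the most.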

Do parents prefer a school nearby? Yes, they reliably gave higher rankings to schools in their own borough than to those elsewhere. But what about choosing between two nearby schools? Here, things get trickier. To borrow a car metaphor, the economists had a pretty good measure of the gas in each school’s tank: the eighth-grade test scores of its incoming students. What they also wanted to gauge was fuel efficiency—how well each school helps kids advance academically regardless of where they start freshman year. To do so, they looked for similar students—ones who shared the same gender and race, lived in the same neighborhood, and earned the same eighth-grade test scores—who went to different high schools. The researchers identified many of these “matched pairs” and examined follow-up data similar to that used in the Stuyvesant study: scores on state tests, PSAT scores, high-school graduation records, and college-enrollment information. Then they asked whether the kids who went to school A did better at these things than their essentially identical counterparts at school B; if so, they labeled school A more “effective” than school B.
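The matched-pairs logic can be sketched as follows. The student records, trait categories, school labels, and outcome scores here are all fabricated for illustration; the actual study works with far richer data and statistical controls:

```python
# Illustrative sketch (not the authors' code): group students who share the
# same observable traits, then compare the outcomes of matched students who
# attended different schools. A positive average gap suggests school A is
# more "effective" than school B for otherwise-similar students.

from collections import defaultdict

# Each record: (gender, race, neighborhood, 8th-grade score band, school, outcome)
# All values below are fabricated.
students = [
    ("F", "Black",  "Flatbush", "high", "A", 82),
    ("F", "Black",  "Flatbush", "high", "B", 78),
    ("M", "Latino", "Corona",   "mid",  "A", 70),
    ("M", "Latino", "Corona",   "mid",  "B", 67),
]

def mean(xs):
    return sum(xs) / len(xs)

def school_gap(students, school_a, school_b):
    """Average outcome gap (school_a minus school_b) across matched groups."""
    cells = defaultdict(lambda: defaultdict(list))
    for *traits, school, outcome in students:
        cells[tuple(traits)][school].append(outcome)
    gaps = [
        mean(group[school_a]) - mean(group[school_b])
        for group in cells.values()
        if group[school_a] and group[school_b]  # need students at both schools
    ]
    return mean(gaps)

gap = school_gap(students, "A", "B")  # positive → A looks more "effective"
```

The key design choice mirrors the article's description: by comparing only students within the same trait cell, the measure isolates what the school adds ("fuel efficiency") from the academic level of the students it enrolls ("gas in the tank").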