What is going on in Canada? This week, a prestigious research group recalled a series of intellectual property reports that were highly favorable to copyright holders—and appear to have been plagiarized. And the Business Software Alliance admitted that its Canadian software piracy numbers were "estimates"—no surveying had been done in the country.

Irony alert

Both discoveries were made by law professor Michael Geist, a one-man IP wrecking ball. First up was a report from the Conference Board of Canada, a nonpartisan research group that was asked to produce a report on the digital economy. When the report in question came out, the press release trumpeted, "Canada seen as the file-swapping capital of the world."

The report was largely paid for by copyright lobbyists, but a reputable group like the Conference Board wouldn't just slap something together that supported the view of its funders—especially not a report making broad claims about the need for "increased emphasis on enforcement of [intellectual property rights]." Right?

Geist revealed that numerous sections of the Conference Board report were lifted nearly verbatim from an earlier report by the International Intellectual Property Alliance. Credit was sometimes given to the IIPA, though a mere citation doesn't give someone the right to use another's exact words without quote marks.


And apart from the apparent plagiarism, many of the other numbers used in the report are of dubious provenance or are just sloppy. For instance, Geist notes that "the OECD study that the Conference Board says found the highest per capita incidence of unauthorized file sharing in the world [in Canada] did not reach that conclusion. The report—which is based on six year old data that is now out-of-date—was limited to the 30 OECD countries (not the world) and did not make any comment or determination on unauthorized activity."

This is cutting-edge research? Rightsholders might be fine with a Conference Board imprimatur on some IIPA quotes, but the government of Canada (which contributed $15,000 for the report) didn't get much for its money.

The Conference Board said that it "stands behind the findings of its report." The group prides itself on being "the foremost, independent, not-for-profit applied research organization in Canada" and says that it is "objective and non-partisan. We do not lobby for specific interests."

But the Conference Board has now announced on the front page of its website that it has recalled the reports in question because they "did not follow the high quality research standards of The Conference Board of Canada."

Hard numbers are hard to come by

Geist then took aim at the Business Software Alliance, which each year releases numbers estimating the rate of software piracy in countries around the world.

We've been skeptical about these numbers for years, since there's no reliable way to count pirated copies of software and no clarity about how much pirated software would have otherwise been purchased at full price.

Even the BSA concedes the point, with VP of Communications Dale Curtis telling us earlier this month, "I concede that, in this model, we express the value of pirated software in terms of retail value of what's being displaced. We assume 100 percent of the software would be replaced and we calculate the retail value of that. I don't know how much will be replaced or not. A lot of it will be replaced."

Hardly reassuring, but Geist has now made another charge: the BSA is literally "guessing" about the piracy rate in countries like Canada.

"While the study makes seemingly authoritative claims about the state of Canadian piracy, the reality is that IDC, which conducts the study for BSA, did not bother to survey in Canada," said Geist after checking with the organization. Surveys are only done in "volatile" countries.

"It is an express acknowledgement that the Canadian data this year is a guess," Geist added. "The data is never publicly presented in this way—the BSA cites specific numbers, the newspapers report it, and groups like the Conference Board of Canada and the Chamber of Commerce extrapolate these guesses into specific claims about job losses and economic harm."

Ars checked in again with Dale Curtis of the BSA here in the US. He had quite a bit to say, and it's worth quoting at some length:

It is not accurate to say BSA’s estimate of PC software piracy in Canada is a "guess," nor that our figures are "never publicly presented in this way." BSA has always been open about the methodology and the fact that it is a model, not a scientific measurement. The methodology is explained in detail in the white paper, which is online at www.bsa.org/globalstudy.

Briefly put, IDC bases its piracy rates and loss estimates on a model that includes many well-established, authoritative data sets, including figures on new PC shipments, the installed PC base, new software shipments, and average software prices in each country. Data on the average amount of software loaded on PCs in a given country is based on more than 6,000 end-user surveys in 24 countries, plus reams of similar data from previous years. The survey sample size is large enough—and consistent enough over time and geography—that IDC believes it is as accurate as it can be.

The surveys on software loads are just one piece of the overall model. Countries that are included in the survey portion are chosen to represent the more volatile economies. IDC has found from past research that low piracy countries, generally mature markets, have stable software loads by segment, with yearly variations driven more by segment dynamics (e.g. consumer shipment versus business shipments of PCs) than by load-by-load segment. IDC believes that in mature markets, piracy rates are driven less by changes in software load than other market conditions, such as shipment rates and volume licensing errors...

Bottom line: There is no way to measure software piracy with scientific precision, and BSA does not present its figures as such. That said, the study uses a robust methodology that provides us with the best trend data available.
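To make the dispute concrete: the kind of model Curtis describes boils down to comparing the software estimated to be in use (PCs times average software load) against the software legitimately shipped, and calling the gap "pirated." The sketch below is a hypothetical illustration of that arithmetic, not BSA's or IDC's actual model; all names and numbers are invented for the example.

```python
def piracy_rate(installed_pcs, avg_software_load, legit_units_shipped):
    """Illustrative piracy-rate model: the share of software estimated
    to be in use that cannot be accounted for by legitimate shipments.

    installed_pcs        -- number of PCs in the country
    avg_software_load    -- average software units installed per PC
                            (this is the figure the end-user surveys feed)
    legit_units_shipped  -- software units sold through legitimate channels
    """
    total_installed = installed_pcs * avg_software_load
    pirated = max(total_installed - legit_units_shipped, 0)
    return pirated / total_installed

# Invented inputs: 10M PCs, 8 applications per PC, 52M legitimate units.
rate = piracy_rate(10_000_000, 8, 52_000_000)
print(f"{rate:.0%}")  # 35%
```

Note where Geist's objection bites: the output is only as good as `avg_software_load`, and for an unsurveyed country that input is carried over from prior years or from other markets, so the single percentage conceals an unstated margin of error.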

But such numbers are routinely cited as fact by others, showing up for decades in some cases as talking points in favor of tougher copyright laws. Our own past attempts at getting to the bottom of some of the figures commonly used by copyright maximalists show that some are flat-out ludicrous; others, like the BSA numbers, are estimates based on models, and need to be treated as such by policymakers.

This is Geist's worry. "Rather than using broad bands to account for errors (i.e. 30-40 percent range), they use very specific figures and then cite even small increases or decreases," he tells Ars. "They do not provide a margin of error. If this is just a model without great precision, the BSA should not be using it to lobby policy makers on the basis that it provides a fairly precise figure."

All of these piracy numbers are tough, if not impossible, to measure with any accuracy. That doesn't invalidate good-faith attempts at data gathering and modeling, of course, but it does mean that when a transparently interested party on any side of the debate comes calling with a dossier of specific numbers, policymakers are well-advised to start reaching for the salt shaker.

Update: While itBusiness.ca claimed that a government official told it that $15,000 of public money funded the report, a newer report out today in the Globe & Mail directly contradicts that account and says that no public money was spent.