By Steven Brill

A Convenient Re-Analysis

New York Hotel Meeting June 14, 2002 (p. 71-75)

Six days before Austin Pledger swallowed his first Risperdal, Janssen scientists and marketing executives met with an advisory board of doctors in a luxury hotel suite in New York. The group wrestled with problems concerning the prolactin and gynecomastia data that had come in from the clinical study Gorsky and his team had ordered up, hoping to put the issue to rest.

Description of Studies (p. 63-67)

Testimony re: Study of Studies (p. 67-72, 74-75)

This new study was actually a study of studies. It pooled the one study called “INT-41”—which had the largest number of participants and the worst results and had devoted what those who conducted it called “special attention to prolactin”—with four smaller, more general studies that had produced less troubling numbers. Although this approach diluted the bad news for Janssen, there were still two problems. First, the gynecomastia rates remained high. Second, one table showed a statistically significant relationship between elevated prolactin and breasts among boys who had been taking the drug for eight weeks. In other words, it looked like causation had been established.

According to later testimony, at that meeting, the doctor-advisors and the Janssen team came up with a solution that, they decided, could remove many of the gynecomastia cases in a way that was scientifically legitimate.

Testimony About the Data “Re-Analysis” (p. 67-74)

There would later be bitter disputes in court about whether it was the outside doctor-advisors or the Janssen people who came up with what they thought could be a defensible way of doing what notes of the meeting called a “re-analysis” of the data. But everyone in the room was being paid by Janssen, and there can be no dispute that the method they devised would make Janssen’s numbers look a lot better. Nor was there any dispute that the idea of re-analyzing the data came up only after they had seen the initial negative numbers. The retroactive redesign of the study began when someone pointed out that the children in the group who were 10 or over were likely to be going through puberty. Therefore, their hormone levels, including prolactin levels, were likely to be elevated. So why not remove them from the count of gynecomastia cases? The group agreed to see how that “re-analysis” affected the numbers.

The Phony Denominator

Two months later, on August 22, 2002, a revised version of the all-important study was circulated among Janssen development executives. The result was a table showing a much lower rate of gynecomastia: just eight-tenths of 1 percent. Moreover, the relationship between raised prolactin levels and gynecomastia among the boys was now no longer statistically significant. Proving that that relationship was statistically significant (or, rather, that it wasn’t) was the key purpose of the study.

You only need to have gotten past a third-grade math lesson to understand how scientists from the world’s leading health care company and its hired-hand doctors distorted complicated clinical findings.

The re-analysis had worked. The data that Gorsky and his team had envisioned nearly four years earlier to rebut competitors’ claims about gynecomastia was finally ready. However, even assuming the legitimacy of removing the boys who were 10 years old and over, the table produced numbers created by an obvious arithmetic sleight of hand. You just have to have gotten past a third-grade math lesson in numerators and denominators to understand how this group of scientists from the world’s leading health care company and its hired-hand doctors distorted this series of complicated clinical findings and dense sets of data.

The cases of boys 10 and over with gynecomastia had been eliminated from the numerator—the count in the table of those suffering from gynecomastia. However, all the children, no matter their age, were still counted in the denominator. In other words, five boys under 10 years old had been shown to have developed breasts, but all 592 children—over and under 10—were included in the total used to tabulate the percentage: five is 0.8 percent of 592.

However, only 358 of the children were under 10. Thus, the supposed 0.8 percent was 0.8 percent of all 592 children, when the real denominator should have been 358, the number of children under 10. That would have yielded a rate of 1.4 percent, not 0.8 percent, because five is 1.4 percent of 358. In fact, the real percentage should have been based on the boys under 10 alone—there were 255 of them—not on boys and girls together, since only boys were counted as cases. And five is 2.0 percent of 255, a number that likely would have gotten the attention of Benita Pledger and her doctor.

And, again, all of that assumes that retroactively removing the boys 10 and over was justifiable, which those who had originally designed the study had not assumed. Had those boys not been removed, the percentage of all boys with gynecomastia would have been 4.5 percent: 22 cases out of 489 boys.
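The arithmetic can be checked directly. A minimal sketch, using only the counts reported above (the variable names are illustrative, not drawn from any study document):

```python
# Counts as reported in the account above
cases_under_10 = 5       # boys under 10 with gynecomastia (the re-analysis numerator)
all_children = 592       # denominator actually used: every child, any age, either sex
children_under_10 = 358  # children under 10, boys and girls
boys_under_10 = 255      # boys under 10: the denominator that matches the numerator
all_cases = 22           # all gynecomastia cases, boys 10 and over included
all_boys = 489           # all boys in the pooled studies

def pct(numerator, denominator):
    """Percentage, rounded to one decimal place."""
    return round(100 * numerator / denominator, 1)

print(pct(cases_under_10, all_children))      # 0.8 — the published figure
print(pct(cases_under_10, children_under_10)) # 1.4 — denominator limited to under-10s
print(pct(cases_under_10, boys_under_10))     # 2.0 — denominator limited to boys under 10
print(pct(all_cases, all_boys))               # 4.5 — with no re-analysis at all
```

The trick, in short: shrink the numerator while leaving the denominator untouched, so the same five cases read as 0.8 percent instead of 2.0 percent.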
A J&J witness in a case brought by a boy who had developed male breasts later attempted to offer a rationale for using the original denominator, but even one of the doctors involved in the study would later concede that the denominator should have been changed.

J&J's ‘Re-Analysis’ of the Data

More important to the statisticians worrying about a statistically significant cause-effect relationship between raised prolactin levels and gynecomastia, the new table obscured a deeply troubling finding: that when all the boys who had been taking Risperdal for eight to 12 weeks were examined, those with raised prolactin levels tracked 98 percent of the time with those suffering from gynecomastia. That eight-to-12-week treatment period, in fact, was consistent with medical theories that lawyers in suits against Johnson & Johnson would later introduce—that the gynecomastia didn’t take hold and become permanent until the breast tissue fiber generated by the prolactin had been given time to grow.

Findling Article Nov. 2003 (p. 1-2, 5, 7-8)

After going through various drafts and table reformulations, this redone study, with the fictional denominator and without the table showing that statistically significant relationship, is what Dr. Findling and two other academic luminaries would ultimately attach their names to as co-authors. Also listed as co-authors would be three Johnson & Johnson employees—two doctors and Carin Binder, the executive spearheading the publication of the article. The article disclosed the affiliations of all of the authors and that a J&J Canadian subsidiary had “supported” their research. But that raised no eyebrows in the academic medical community, because by now most such published research was paid for by the drug companies whose products were its subjects.

A Janssen executive complained that a J&J-funded article about Risperdal contained a “nauseating amount of information” about side effects.

The study’s findings would immediately be circulated to the sales teams in the field and be formally published in the highly respected Journal of Child Psychiatry under the title “Prolactin Levels During Long-Term Risperidone Treatment in Children and Adolescents.”

Testimony re: Binder’s email (p. 53-57)

The original table that did not eliminate the boys ages 10 and older—and that was stipulated in the original protocol for the study—was removed after the first draft. It was then put back in the final draft of the article after some of the doctors asked that it be reinstated—over the complaint of Janssen’s Binder, the article’s only non-doctor, who wrote in an email to the team of Janssen marketers and scientists that it contained a “nauseating amount of information” about side effects. However, that table was given little discussion in the text, except to explain why including boys 10 and over made it unimportant. Moreover, the specific chart of data showing the statistical significance for the eight-week treatment period was neither mentioned in the text nor shown in any table.

Scholarly articles in medical journals always include an abstract at the top, so that doctors can glean the gist. The abstract of what would become known in court as “the Findling article” declared, without qualification, “There was no direct correlation between prolactin levels and [side effects].”

Pay No Attention to the Label