Let me give you an example. Take Florida Coastal School of Law's (surprise) 509 Report. The 2014 report, the one available to students thinking about applying to law school for Fall 2015, provides the bar passage rates for first-time takers for 2011, 2012, and 2013. Looking at the numbers from those years, the school's bar pass results look pretty decent: 76% in 2011, 75% in 2012, and 68% in 2013. Looking at the 509 report, a prospective student would likely conclude, "if I am admitted to this school, I will have a roughly 70-75% chance of passing the bar the first time." That prospective student might consider this a reasonable risk, and plunk down a deposit. (Of course, prospective students tend to disregard the information about attrition, even though it is presented on the report. No first-semester law student I ever met seemed to seriously contemplate the possibility of flunking out.)

But, as with mutual funds, past results are not a guarantee of future performance. As I have explained in other posts, where a law school has substantially lowered the entrance credentials required for admission, it can reasonably be expected that the school's bar passage rate will drop significantly three to four years later, when those full- and part-time students graduate and take the bar. Thus, in the case of FCSL, the lowering of admissions standards in 2010 resulted in a lower bar pass rate in 2013, and the further weakening of admissions standards in 2011 resulted in an even lower rate this year (58% for first-time takers in Florida on the July 2014 bar). The further weakening of admissions standards in 2012-14 is likely to result in further deterioration of the first-time bar passage rate at FCSL. But the college senior or recent graduate contemplating law school doesn't know any of that.
I can tell you with great certainty that entering students with a 2.6 UGPA and a 140 LSAT (the 25th percentile figures for the most recently reported entering class at FCSL) don't have anywhere close to a 70% chance of graduating and passing the bar on their first attempt. But what exactly are their chances? Presumably FCSL knows, but they sure aren't telling. Wouldn't it be nice if they were required to disclose this? The current Standard 509 doesn't require anything like this level of granularity. I think that it should.

The ABA Standard 509 Information Report was a major step forward for law school transparency. (Note: the reports on all ABA-accredited law schools can be found here.) The reports provide a wealth of useful information about the entrance credentials of admitted students, attrition data, conditional scholarships eliminated, and bar passage data, allowing prospective law students to make an informed comparison of law schools. Useful as these reports are, in my view they don't go far enough, and they even have the potential to mislead students about their prospects for success. The problem with the reports is twofold: first, there is a lag in reporting; and second, the data doesn't differentiate among the success rates of students based on their entrance credentials.

A Modest Proposal

I propose that each law school be required to host a calculator on its website providing customized, tailored predictors of success to all prospective students. Each law school would be required to maintain a master database tracking every law student who matriculated. The database would include each student's undergraduate GPA (UGPA) and LSAT score. It would record whether the student was academically attrited, voluntarily left school, transferred to another law school, or graduated. It would also track each student who reported taking the bar and whether they passed on their first or a subsequent attempt. Of course, law schools are already collecting most, if not all, of this data. What I propose is that this data be made available to prospective students through the personal success calculator. Here's how it would work: the prospective student would plug their UGPA and LSAT score into the calculator, and the school's website would then provide a customized personal report describing the experience of similarly qualified students, which I would define as those within +/- 1 point on the LSAT and +/- .10 UGPA. So, if a student entered a UGPA of 3.0 and an LSAT of 150, the website would provide the following information:

"Over the last 7 years, we have matriculated x# of students with similar entrance credentials to your own, defined as those with a UGPA of 2.9 to 3.1 (+/- .10 from your self-reported UGPA) and an LSAT of 149-151 (+/- 1 point of your self-reported LSAT score).

Of these x# of students, # were academically attrited (failed), # voluntarily dropped out, # transferred to another ABA-accredited law school, # are still enrolled (as of the beginning of the most recent semester) and # have graduated, earning their Juris Doctor degree. Of the # that graduated, # reported taking the bar at least once. Of these #, # passed the bar on their first attempt, for a first-time bar passage rate of x%. An additional # who failed on their first attempt passed on a subsequent attempt."
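The matching and reporting logic behind this proposal is simple enough to sketch in code. The following Python is a hypothetical illustration only: the record fields, outcome labels, and function name are my own inventions, while the +/- .10 UGPA and +/- 1 LSAT tolerances come directly from the proposal above.

```python
from dataclasses import dataclass

@dataclass
class Student:
    """One matriculant's record in the hypothetical master database."""
    ugpa: float
    lsat: int
    outcome: str            # "attrited", "dropped", "transferred", "enrolled", or "graduated"
    bar_attempts: int = 0   # 0 means the graduate never reported taking the bar
    passed_first: bool = False
    passed_later: bool = False

def success_report(students, ugpa, lsat):
    """Summarize outcomes for students within +/- .10 UGPA and +/- 1 LSAT point."""
    # Small epsilon so a UGPA exactly .10 away isn't excluded by float rounding.
    similar = [s for s in students
               if abs(s.ugpa - ugpa) <= 0.10 + 1e-9 and abs(s.lsat - lsat) <= 1]
    counts = {k: sum(1 for s in similar if s.outcome == k)
              for k in ("attrited", "dropped", "transferred", "enrolled", "graduated")}
    took_bar = [s for s in similar if s.outcome == "graduated" and s.bar_attempts > 0]
    passed_first = sum(1 for s in took_bar if s.passed_first)
    return {
        "similar": len(similar),
        **counts,
        "took_bar": len(took_bar),
        "passed_first": passed_first,
        "first_time_rate_pct": round(100 * passed_first / len(took_bar)) if took_bar else None,
        "passed_later": sum(1 for s in took_bar if not s.passed_first and s.passed_later),
    }
```

The numbers returned by such a function would simply be dropped into the report template quoted above; the school's website would format them into the prose the prospective student sees.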

In addition to making this data available on their websites, schools should be required to include this individualized data in every offer of admission. Thus, a student admitted to multiple schools could actually compare how students with similar entrance credentials have fared at each of the schools to which they were admitted, giving the prospective student some meaningful basis for choosing among competing offers. The raw numbers would also provide potentially useful information. For example, schools that were admitting students with lower entrance credentials than they had accepted in the past might have little, if any, data on the success rate of students with similar credentials. Knowing that a school had little, if any, experience in helping students at their talent level succeed in law school would be very useful to an applicant.

Let me give a hypothetical example to illustrate. Suppose a student with a 140 LSAT and a 2.6 UGPA, recently admitted to a law school, went to the school's website and plugged in their numbers. The report generated might look something like this:

"Over the last 7 years, we have matriculated 100 students with similar entrance credentials to your own, defined as those with a UGPA of 2.5 to 2.7 (+/- .10 from your self-reported UGPA) and an LSAT of 139-141 (+/- 1 point of your self-reported LSAT score). Of these 100 students, 30 were academically attrited (failed), 6 voluntarily dropped out, 1 transferred to another ABA-accredited law school, 33 are still enrolled (as of the beginning of the most recent semester) and 30 have graduated, earning their Juris Doctor degree. Of the 30 that graduated, 28 reported taking the bar at least once. Of these 28, 12 passed the bar on their first attempt, for a first-time bar passage rate of 43%. An additional 2 who failed on their first attempt passed on a subsequent attempt."

This information would tell the prospective student that the school has only recently started taking significant numbers of students with similar entrance credentials, and that students like them tend to do quite poorly, with a high academic attrition rate and a low first time bar passage rate. These numbers are likely far worse than the school's overall bar pass and attrition rate as reported on the 509 Report, which might be perfectly respectable. Hopefully, this stark data would cause the prospective student to think twice about enrolling in that law school. But even if the student chose to defy the odds and attend this law school, at least it could be fairly said they were making an informed decision.

Requiring law schools to track this data and make it available to accreditors would also enable the ABA to determine whether schools were meeting Standard 501(b), which states: