- First, of course, there's U.S. News, which isn't alternative at all. U.S. News takes into account: a peer assessment score, a lawyers/judges assessment score, a median LSAT score, a median GPA score, acceptance rate, employment rates, bar passage rates, expenditures per student, student/faculty ratio, and library resources. This year, statistics for part-time programs were added to this mix.
- Next, there's the Brian Leiter rankings. This year Professor Leiter focused on peer assessment and lawyers/judges assessment, noting that only 70% of peers and 31% of lawyers/judges responded to the U.S. News surveys.
- Then, there's the Liberty and Light rankings, also known as the 53% rankings. These rankings take into account a peer assessment score, a lawyers/judges assessment score, a median LSAT score, and a student/faculty ratio score. Notably excluded are median GPA, acceptance rate, employment rate, and bar passage rate, since these are not really comparable from school to school or state to state.
- You could also check out the Hylton rankings, which take into account only the peer assessment ratings and a median LSAT score. Looks pretty bare bones, but the assertion that all other stats used by U.S. News are arbitrary or redundant does seem to have some merit. I haven't seen a 2009/2010 version of the Hylton rankings yet; the link above is to the 2007/2008 version.
- Well, maybe the Hemholtz rankings (.pdf) are more up your alley. The Hemholtz rankings factor in a peer assessment score, a median LSAT score, and a median GPA score.
- If you really like self-serving rankings, check out the Thomas Cooley rankings. The Thomas Cooley rankings take into account over 32 variables, the two most important being the size of the school and whether or not the school is named Thomas Cooley School of Law. Amazingly, Thomas Cooley School of Law still came in at only 12th here, a 169-spot difference from its U.S. News ranking, but a surprisingly low showing, considering.
- Last, but not most ridiculous, is the Google American Law School rankings. This system takes into account which schools come up first on a Google search for "law school." I know this sounds unscientific, but don't worry, they used a "clean browser" and "appropriate filtering," so the results are about as methodologically sound as the U.S. News results.
GPA standards vary by undergrad school, so GPAs are not a reasonable way to compare law school quality. Bar passage rates vary by state due to differences in bar exam difficulty, so they are also not a reasonable way to compare law school quality, either. I don't see the relationship between acceptance rates and law school quality, for that matter. And factoring in library resources seems particularly ridiculous considering that most legal research happens on online databases these days.