A licensing process is valid only if it measures the knowledge and skills that new practitioners need for minimum competence. Psychometricians follow well-established procedures to create valid licensing processes. Those procedures center on job analyses: empirical studies of the work that new practitioners do, as well as the knowledge and skills they need for that work. The procedures also include consideration of the knowledge and skills taught in professional schools. If the curriculum aligns with the competencies needed for practice, then a licensing exam may build upon the curriculum.
Neither the Uniform Bar Exam (UBE) nor any of its components (the Multistate Bar Exam, Multistate Essay Exam, and Multistate Performance Test) relies sufficiently on practice analyses. Nor do bar exams drafted by state examiners rest sufficiently on those analyses. NCBE, which creates the UBE and its components, published its first practice analysis in 2012. That analysis revealed significant gaps between the knowledge and skills that new lawyers need and those tested on the UBE, but NCBE did not close those gaps.
The legal principles tested on the UBE and other bar exams track required subjects at most law schools, but that alignment does little to improve the validity of the exams. Practitioners frequently criticize law schools for failing to prepare graduates adequately for practice. If the professional curriculum does not align with practice, then a licensing exam based on the curriculum will also fail to align properly.
Bar examiners sometimes point to correlations among bar exam scores, LSAT scores, and law school grades as evidence of the bar exam's validity. Neither the LSAT nor law school grades, however, have been shown to predict competence in law practice. These correlations, therefore, fail to improve the bar exam's validity.
Recent research has illuminated the knowledge and skills that new lawyers need to practice with minimum competence; those studies provide a foundation for defining minimum competence and designing valid licensing systems. NCBE is relying on that research to design a new bar exam, which will be available in 2026. Meanwhile, current exams remain insufficiently aligned with evidence-based definitions of minimum competence.
See the articles cited below for more information about problems with the validity of the current bar exam.
Steven Foster administered a simulated Multistate Bar Exam (MBE) to sixteen practicing lawyers. The simulated exam used questions developed by BARBRI, a company that has demonstrated high correlations between performance on its practice exams and performance on the actual MBE.
Foster's subjects obtained low scores on this simulated exam: their percentage of correct answers ranged from 26% (barely better than guessing) to 52%. Those percentages would be unlikely to generate a passing score in any jurisdiction. BARBRI's experts estimate that candidates need to score at least 55% correct--and, more likely, 60%--to earn a passing score in a jurisdiction that uses the most common cut score of 135.
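To see why a 26% score is "barely better than guessing," consider the arithmetic: with four answer choices per question, random guessing yields an expected 25% correct. The sketch below works through that baseline under assumed parameters (a 200-question, four-option exam); the figures are illustrative and ignore NCBE's actual scaling and equating.

```python
from scipy.stats import binom

# Assumed exam parameters (illustrative only; not NCBE's scoring model)
n_questions = 200   # assumed number of questions
p_guess = 0.25      # chance of guessing correctly among four options

# Expected score from pure guessing: 200 * 0.25 = 50 correct, i.e., 25%
expected_pct = p_guess

# Probability that pure guessing reaches at least 26% correct (52 questions)
p_at_least_26 = binom.sf(51, n_questions, p_guess)

print(f"Guessing baseline: {expected_pct:.0%}")
print(f"P(pure guessing scores >= 26%): {p_at_least_26:.2f}")
```

Under these assumptions, pure guessing reaches 26% roughly two times in five, which underscores how close the lowest scores in Foster's sample were to chance performance.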
If the bar exam is a test of minimum competence, why can't licensed practitioners pass it? Foster's results strongly suggest that the current exam is not a test of minimum competence. Instead, it tests memorization of detailed rules that lawyers soon forget. Notably, many of Foster's subjects obtained low scores even on questions limited to the field in which they practiced.
Foster concludes: "The legal profession has an obligation to prevent incompetent practitioners from harming clients, but it has an equally weighty responsibility to assure that its test of minimum competence validly measures the knowledge and skills that new lawyers need to serve those clients. The research reported here suggests that we are falling short of that goal."
Steven Foster, Does the Multistate Bar Exam Validly Measure Attorney Competence? (2021)
If the bar exam validly measures competence to practice, then scores on the exam should show some relationship with attorney disciplinary measures. Attorneys with lower passing scores, in other words, might be more likely to suffer disciplinary complaints or penalties than attorneys with higher passing scores. Because states use a variety of cut scores to determine who is competent to practice law, scholars have been able to test this hypothesis. Examining the relationship between cut scores and disciplinary rates offers some insight into whether the bar exam protects against the type of poor service that leads to disciplinary complaints or action.
The most comprehensive study of this type was published by a team of five researchers in 2020. Their analysis, based on six years of data from up to 48 United States jurisdictions, found no statistically significant relationship between a jurisdiction's cut score and the number of complaints filed by the public, the number of formal charges, or the number of attorneys disciplined in that jurisdiction (pp. 26-28). Although the authors acknowledge several caveats, the results suggest that there is no clear relationship between bar exam scores and lawyer discipline. This, in turn, suggests that the bar exam may not adequately measure the knowledge or skills needed to avoid attorney misconduct. (A simplified illustration of this type of analysis appears at the end of this section.)
Mitchel Winick et al., Examining the California Cut Score: An Empirical Analysis of Minimum Competency, Public Protection, Disparate Impact, and National Standards (2020)
An updated discussion of this study was published in 2021, drawing similar conclusions.
Michael Frisby, Sam Erman & Victor Quintanilla, Safeguard or Barrier: An Empirical Examination of Bar Exam Cut Scores (2021)
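For readers unfamiliar with this kind of analysis, the sketch below shows, in simplified form, how one might test for a relationship between cut scores and discipline rates. The data are hypothetical, and the method (a simple correlation test) is far simpler than the methodology these studies actually use.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical data (illustrative only; not the studies' dataset).
# Each entry pairs a jurisdiction's bar exam cut score with its annual
# disciplinary complaints per 1,000 active attorneys.
cut_scores = np.array([129, 130, 133, 135, 135, 136, 139, 144])
complaint_rates = np.array([32.0, 28.5, 30.1, 29.8, 31.2, 27.9, 30.5, 29.4])

# Test whether higher cut scores are associated with fewer complaints
r, p_value = pearsonr(cut_scores, complaint_rates)
print(f"Pearson r = {r:.2f}, p = {p_value:.2f}")

# A p-value well above 0.05 indicates no statistically significant
# relationship, which is the pattern these studies report across jurisdictions.
```

If the exam screened out lawyers prone to misconduct, jurisdictions with higher cut scores should show systematically lower complaint rates; the studies cited above found no such pattern.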