The Ides of March is approaching, and that means one thing for law schools: It is time for career services offices to turn into sleuths and archivists. All law schools must report very detailed employment data on their recently graduated class, and March 15 is the magic date to which this data is keyed. How detailed? Law schools must report, for each student, the name and address of the employer and the nature of the position, and they must keep careful records of how they learned all this information.

So how does the school get this information? Well, first it reaches out to each of its graduates and asks them to provide the information. Now remember, these are graduates who left school 10 months ago. They have generally moved out of their student accommodations and are not necessarily checking emails from their law school regularly. Of course, in today’s internet age, people can be tracked through LinkedIn or Facebook, and if that doesn’t work, there is the old-fashioned detective work of asking faculty, staff, and other students. But all of this is time-intensive – and once you locate your grads, you have to get them to give you the necessary information. Graduates have no obligation to share information about their employment status, and some are not inclined to do so. And because only direct information from the graduate, direct information from the employer, or publicly available information is considered reliable, the sleuthing does not always end with locating the graduates.

But finding out where every graduate is employed is only half the job – the other half is documentation. Schools are expected to be meticulous because their records may be audited. So suppose a graduate tells the career office that she is employed at a particular firm – then what? If the information came by email, the email must be uploaded to her file. If the information was communicated orally, the staff person must document that conversation and put that in the file. Suppose the graduate provides the employer’s name but not the address of the firm: The career office must have someone go to the web, find the address, take a screenshot that shows the address, and upload that to the file.

All of this takes a lot of time and staff resources. Between mid-January and mid-March, one of our full-time career counselors spends about two-thirds of her time on data collection and reporting. This is for a graduating class of about 150. The time she spends on data and reporting is time she will not spend with our students and graduates helping them identify job opportunities, reviewing their resumes and cover letters, or preparing them for their interviews.

Of course, prospective students care about employment outcomes and should have reliable information about them. But the level of detail and documentation required far exceeds what we must provide about any other aspect of our operation, and it is time to restore some balance.

Last week, in a moment of commendable candor, Elizabeth Parker, executive director of the California Bar, told lawmakers that “there is no good answer” for why California has set its bar exam cut score at the level it has. What should California do? One approach would be to simply lower its cut score to, say, the median cut score for all other states, and in the short run, that might be an appropriate response. But let’s remember that California is not alone in having “no good answer” for why it set its cut score at the level it did. One can hope that the focus on California may bring much needed attention to the broader issues surrounding the bar exam.

Last week, the ABA House of Delegates rejected a proposal that would have imposed on all law schools a requirement that 75 percent of their graduates pass a bar exam within two years of graduation. Of course law schools should adequately prepare their graduates to become licensed lawyers, so why would anyone object to this rule?

At least part of the answer is that there is growing frustration with the lack of transparency surrounding the bar exam and the failure of state bar examiners to explain and justify both the nature of the exam and the scores they set as passing.

There are a couple of oddities about the bar exam. First, although the Multistate Bar Exam (MBE) is a national test – uniform throughout the country, except in Louisiana – what constitutes a passing grade on this pass/fail test is not uniform, but is set separately by each state. The MBE is scored on a 200-point scale, and the passing scores (or “cut scores,” as they are sometimes called) range from 129 in Wisconsin to 145 in Delaware. Recently, there was much coverage of California’s low bar passage rate – but this was largely a function of its high cut score of 144. Indeed, the graduates of California’s ABA schools performed better than the national average on the MBE: For the July 2016 bar exam, the mean MBE score nationally was 140.3, while in California the mean for graduates of ABA-accredited schools was 145.7.

Second, notwithstanding the assertion of the National Conference of Bar Examiners (NCBE) that the bar exam “should not be designed primarily to test information, memory, or experience,” there is little doubt that information and memory are at the center of the test. For example, when civil procedure was added as an MBE subject, one of the sample civil procedure questions provided by the NCBE turned on remembering the number of days one had under Rule 38(a) to request a jury trial. Of course, a competent lawyer should probably know that this subject is specifically covered by the Federal Rules of Civil Procedure, but a competent and careful lawyer should also check the rule itself and should never rely on memory alone as to how many days she has to request a jury.

If anyone believes the bar exam does not depend primarily on information and memory, then I offer the following challenge. The next time the exam is given, let’s randomly select a group of licensed lawyers whom everyone agrees are competent and have them take the bar exam without spending eight weeks cramming. If the bar examiners are correct that this is not about memory and information, then any group of competent lawyers ought to be able to perform well on the exam. Indeed, maybe we could use the score of this randomly selected group of lawyers as the basis for establishing an appropriate cut score, since that would at least link the cut score to some demonstration of competence.

Finally, the NCBE’s Code of Recommended Standards for Bar Examiners provides that the purpose of the bar exam “is to protect the public, not to limit the number of licensed lawyers admitted to practice.” That is the aspiration, but we have to recognize that the bar exam, like all licensing requirements, is a barrier to entry. And when we see the wide variation in cut scores among the states, we ought to at least ask whether the practice of law is so different from state to state as to warrant that differential level of exclusion from the profession.

Of course, the solution is not to dispense with all criteria for lawyer licensing or law school accreditation. But particularly in light of the California example, we have an obligation to take a more comprehensive look at the bar exam and ways of assuring that all licensed lawyers meet the necessary standards of competence without unnecessarily excluding those who can make a valuable contribution to our profession.