Publication Date

3-2020

Document Type

Article

Abstract

The American Bar Association (ABA) accreditation standards involving outcome-based assessment are a game changer for legal education. The standards reaffirm the importance of providing students with formative feedback throughout their course of study to assess and improve student learning. The standards also require law schools to evaluate their effectiveness, and to do so from the perspective of student performance within the institution’s program of study. The relevant question is no longer what law schools are teaching their students, but what students are learning from law schools in terms of the knowledge, skills, and values essential for those entering the legal profession. In other words, law schools must shift their assessment focus from one centered on inputs to one based on student outputs.

Compliance with the ABA’s assessment mandate comes at a time when law school resources are spread thinner than ever. Indeed, faculty already work with plates that are full of students, scholarship, and service. Thus, while not all in the legal academy are on board with the ABA’s approach to outcomes assessment, or with outcomes assessment generally, as busy educators we should all at least agree that the requisite response should be efficient, given that resources are limited, and meaningful, such that the work done can benefit our learners. To do so, law schools should begin at their own tables set with full plates, so to speak, taking stock of what institutions and their faculty are already doing in terms of assessment. And it is important to think broadly here, as faculty may be surprised to learn how many of their colleagues are already doing relevant work.

While law schools may already be inclined to begin from within, this Article outlines concrete strategies they can use when working with existing faculty expertise and resources to respond to the ABA’s assessment mandate in a way that is meaningful for students, maximizes efficiency, and gains broad buy-in. While prior scholarship has outlined best practices for outcomes assessment and even shared examples of how to engage in the process in the law school setting, this Article is unique in the depth and breadth of its coverage, setting out a detailed case study that illustrates the process of developing an authentic assessment tool and beginning to adapt that tool to respond to both the individual student assessment and the law school assessment required by the ABA.

To be clear, this Article does not suggest that only those with existing expertise or resources should be the ones to engage in the outcomes assessment work now required by the ABA. The goal should not be to add to the plates of a few. Instead, to create a productive and meaningful culture of assessment, experts in the field proclaim that administrators and faculty must all be involved. The ABA agrees. In addition to encouraging broad buy-in, a more collaborative approach helps ensure that assessment work is equitably spread among faculty.

Part II reviews the ABA standards relevant to outcomes assessment, discussing the two types of outcomes assessment required by those standards—individual student assessment and law school assessment—and sharing the underlying theory behind both. Part III outlines the stages of outcomes assessment, with a specific focus on the measurement stage, because it is arguably the most time-intensive stage of the process and the one in which existing resources can prove most valuable. Part IV focuses on one common direct assessment measure, the analytic rubric, detailing how UK Law’s legal writing faculty collaboratively designed a rubric for the LRW Course appellate brief assignment and responding to concerns that have been raised about using rubrics for assessment. Finally, Part V provides specific suggestions on how to adapt and use existing assessment measures most efficiently when responding to the ABA’s assessment mandate at both the individual student and law school levels. In other words, assessment measures, like the rubric project described in Part IV, can be adapted and used beyond the purpose for which they were originally designed. While the LRW Course appellate brief assignment rubric serves as the primary example to illustrate these ideas, this Article also touches on other examples and shares ideas about how a variety of existing resources can transfer to the current assessment landscape mandated by the ABA.

The message here is that law schools need not panic, as they are likely to find they have more relevant assessment knowledge and materials to work from than first thought. If professors are willing to share their relevant experience and resources, work collaboratively to expand and adapt from that base as needed, and spread the related assessment responsibilities widely and fairly among the faculty, then the ABA’s call for outcomes assessment can be answered with meaning and without forcing any one faculty member’s plate to overflow.
