I am presenting at an assessment conference on March 26th, so I needed a presentation. As an epic challenge, I tried to work some philosophy into assessment. Here is the first part of the presentation:
The Challenge
One fundamental challenge of assessment is earning faculty buy-in for the process. Failure to achieve this can have a range of negative consequences. One area of negative consequences is in the realm of data. If faculty buy-in is not earned, faculty are more likely to provide incomplete assessment data or even no data at all. They are also more likely to provide low-quality data and might even provide fabricated data simply to get the process over with. Demotivated faculty will tend to provide garbage data and, as the old saying goes, garbage in, garbage out.
A second area of negative consequences is in closing the assessment loop. Even if faculty provide adequate data, without buy-in they are more likely to neglect the other parts of the process, such as writing their improvement narratives, reflecting on the results, and applying those results in their classes. Because of this, earning quality faculty buy-in is part of the foundation of assessment. Fortunately, there are ways to help earn the participation of faculty in the process, and these include the SAM method: Simplifying the assessment process, Automating the assessment process, and Motivating faculty. I will begin with Simplification.
Simplification
A complicated assessment process is analogous to the tax code or the Windows Registry. This is to say that it is problematic, convoluted, torturous, difficult, and inconsistent. Dealing with such a process often requires special knowledge of all its difficult ways. Even with such knowledge, errors are likely, and such processes often have punitive aspects that can create adversarial relationships. Complicated processes often have a random element as well: one can never be quite sure how the process will work this time around.
As a rule, people find complicated systems a deterrent to participation. They can be challenging to understand and typically impose an unnecessary cost in time and resources on those involved. As such, people generally try to minimize their involvement with such systems. Simplifying and streamlining the faculty aspects of assessment makes these aspects easier to understand and lowers the cost of participation. Doing this increases the likelihood that faculty will buy into the process and participate more willingly. Effective simplification can also improve the quality of assessment by focusing faculty effort on key areas so that their time and resources are not wasted.
While merely reducing the size of something need not make it simpler, the process of simplification often has the virtue of reduction—which can also be beneficial. As an example, in 2018 Florida A&M University’s General Education Assessment Committee’s faculty data contribution guide was 13 pages long and the data collection forms were four pages long. The forms were also somewhat complicated, with numerous check boxes to choose between and many text boxes to be filled out—one faculty member said they reminded them of tax forms. That was a clear red flag—if you are being compared to the IRS, then you know that the faculty are not thrilled with the process. While the detailed guide was retained after some modification (for those faculty who wanted such a guide), a short and simple video focused on the essentials was created to guide faculty through the process. While it will never go viral, it did prove more popular than the PDF guide.
More importantly, the forms were simplified down to the essentials. This was done through a review process in which the committee carefully considered the distinction between essential and non-essential information. The online version of the forms allowed for even greater simplification since irrelevant questions would not be visible, making the basic form simple—not like a tax form at all. This simplification has also helped improve faculty participation in the data collection process.
While a systematic guide to simplification is beyond the scope of this work, philosophy does provide some excellent general principles for this process. Our good dead friend Aristotle provides a useful starting point for simplification: ask whether a specific part of something contributes in a positive way to that thing. If not, remove it, “For a thing whose presence or absence makes no visible difference, is not an organic part of the whole.” Parts that have a clear negative impact on the process should be the first to be removed. As such, the practical test of the importance of some aspect of a thing is to consider the consequences of its removal. Because of the nature of bureaucracy, it can almost always be improved by removing parts. Faculty feedback can be helpful here—it is a good idea to ask faculty about the process and to give serious consideration to their suggestions for simplification. While simpler is often better, simplification is not without its hazards.
There is always the risk of oversimplifying the assessment system or some aspect of it, which can have the negative consequence of making the assessment less useful. Consider, as an example, the above-mentioned forms used to collect General Education data. While a simplified form makes it quicker and easier for faculty to provide data, this comes at the cost of not gathering as much data as the original form. In the case of these forms, every item removed was data that would no longer be gathered.
In terms of a general guide to what to keep and what to remove, the advice of Aristotle and Confucius should be taken when simplifying: one must find the mean between the two extremes. That is the mark of virtue. Using the form example, a well-designed data collection form is a balancing act between being robust enough to get the data that is needed and simple enough not to invite comparisons to tax forms. Ideally, the form should have the right questions, at the right time, for the right reasons, presented in the right way—to borrow and modify Aristotle’s notion of determining virtue.
A consequentialist approach serves well here: each addition to the assessment system should be weighed in terms of its costs and benefits. This weighing should occur both at the level of the individual part and for the system as a whole. While each individual part taken in isolation might create more benefit than cost, the cost of the entire system could still exceed its benefits. As an example, consider data collection forms. In general, having a relevant piece of information about student performance is more beneficial for assessment than not having it. On this approach, almost any sensible entry on a data collection form would have more benefit than cost when assessed in isolation. But adding all these questions to a form would have an overall negative consequence: the form would be huge and require considerable effort on the part of faculty to complete. While such a form would yield a bounty of information, faculty would be less inclined to complete it and would resent wasting so much time on a needlessly long form.
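To make the individual-versus-aggregate point concrete, here is a minimal sketch in Python using purely hypothetical numbers (they are not drawn from any actual form): each item passes a cost-benefit test on its own, yet the form as a whole fails once the cumulative burden on faculty is counted.

```python
# Hypothetical illustration only: every number here is invented.
# Each form item passes a cost-benefit test in isolation, yet the full
# form fails once the cumulative burden on faculty is counted.

ITEM_BENEFIT = 3.0   # assessment value of one piece of information
ITEM_COST = 2.0      # faculty effort to answer one item, considered alone

def total_burden(num_items: int) -> float:
    """Total faculty effort for a form with num_items questions.
    Assumes effort grows faster than linearly (fatigue, lost goodwill)."""
    return ITEM_COST * num_items + 0.1 * num_items ** 2

def net_value(num_items: int) -> float:
    """Total benefit of the gathered data minus the total burden."""
    return ITEM_BENEFIT * num_items - total_burden(num_items)

if __name__ == "__main__":
    # Each item, taken alone, looks worth including (+1.0 net)...
    print("net value of a single item:", ITEM_BENEFIT - ITEM_COST)
    # ...but a long form of individually worthwhile items can be a net loss.
    for n in (5, 10, 20, 40):
        print(f"{n:>2} items -> net value of the whole form: {net_value(n):6.1f}")
```

The exact figures do not matter; the point is that when the aggregate cost grows faster than the sum of the per-item benefits, the form must be judged as a whole and not merely item by item.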
It should be noted that something can be sophisticated without being complicated. Such a system is like the original iPod Nano: consistent, straightforward, and ‘user friendly’ while also being sophisticated and useful. Unlike a complicated system, it does not require special knowledge of its convoluted ways to use. As such, a simplified system need not be simplistic—it could be quite sophisticated. One way to operate a sophisticated system with greater simplicity and ease is with automation.