Field Notes: Clues to assessment: Solving the crime of just a check box activity

May 21, 2018
  • Competencies
  • Data Stewardship
  • Records and Academic Services
  • SEM Assessment
  • Assessment
  • field notes
"Field Notes" is an occasional Connect column covering practical and philosophical issues facing admissions and registrar professionals. The columns are authored by various AACRAO members. If you have an idea for a column and would like to contribute, please send an email to the editor at connect@aacrao.org.

by Ginnifer Cié Gee, Ed.D., Director of Strategic Planning and Assessment for Strategic Enrollment, The University of Texas at San Antonio

Whether it is for accreditation requirements, divisional mandates, or departmental directives, assessment of programs and services is part of our world. I am new to the strategic planning and assessment realm, so the following advice comes from a novice perspective. Those who are experts in assessment may read this and find it “elementary, my dear Watson.” But if you are in Sherlock mode, investigating and sleuthing to understand how to crack the code on improving assessment, it is my hope that the following will help you solve the mystery.

My Story
My first act in my new role was to gauge the assessment climate. I wanted to understand the commitment level of the department leads who were responsible for metrics. Therefore, I sent out a survey asking two simple questions:
1. Do you feel that your documented assessment data accurately represents your department as a whole?
2. Do you feel that your department uses your assessment data to make program and service improvements?
About half of the responses to the above two questions were “No.”

"No?"  What was the cause of this delinquency? I was intrigued and headed off to find the culprit.

Clue #1 – Lack of effort
The answers to the above questions were not just a simple “no”; many respondents added the incriminating admission, “It’s just something we have to do.” The last time I checked, people in our field were not excited about performing a laborious process that had no real benefit to their office. And yet that was exactly what was happening.

Yes, they may have been measuring a program or service, but little thought or time was spent analyzing the data and strategically planning how to move forward. It was, in essence, a box to check.

Clue #2 – Lack of direction
Taking this information, I decided a refresh was needed. I called a meeting to discuss assessment and to have an honest conversation about what was needed from me as the ever-vigilant "assessment leader." Our university has undergone a lot of change very rapidly, and I discovered people were struggling to identify what they should be assessing. What was the mission? How does our department fit? What should we be measuring that really matters?

Clue #3 – Fatigue from being a lone, lonely loner
Department assessment leads also expressed a desire to become less isolated in their assessment work. During the meeting, we did a group activity in which different departments discussed and collaboratively critiqued each other’s assessments. This was wildly productive. Below is a paraphrased quote from one participant:

“When only your department is looking at your assessment tools, goals, outcomes, etc., it’s really easy to overlook gaps or ambiguity: it makes sense to me. But when I was explaining my assessment to someone outside my area, they said things like ‘Why are you doing that?’ ‘I don’t understand how this measures that goal,’ and ‘This isn’t really clear language,’ and that feedback was extremely enlightening to me.”

We also realized that some departments were assessing similar programs or services and could share resources; others had suggestions or instruments that could be shared. The consensus was to have more collaborative interactions.

Clue #4 – Confusing assessment database
Not everyone was comfortable with the assessment repository tool that we use. Instead of asking whether they liked the database, I asked what would make it better. More training was requested, as well as examples of well-written goals, outcomes, and action steps. Staff wanted qualitative and quantitative assessment examples set up in the database to use as a reference or template.

Case closed? 11 considerations
I cannot say that this case is closed, but it is well on its way to being solved. With a bit of investigation and the gathering of eyewitness accounts, I was able to identify the thieves of productivity. The plan moving forward is to hold several collaborative workshops, discussions of institutional initiatives, and database refreshers. Below is a summary of the do’s and don’ts that I have learned during this process:

The Don’ts
1. Do not assume that everyone understands why they are assessing programs and services.
2. Do not assume that everyone understands how to assess effectively. 
3. Do not assume everyone understands your assessment database.
4. Do not believe that all assessments are being reviewed for process improvement.
5. Do not be afraid to have honest conversations about poorly constructed assessments/goals/outcomes/actions, as long as you can provide a path to solutions.

The Do’s
1. Create a forum to have conversations about your division/institution assessment process.
2. Take a collaborative approach. Even if people start off by lamenting together, it begins conversations that lead to improvements.
3. Departments can be diverse in their services and programs. Make sure assessment is inclusive of the entire department, not just the one area that gives a survey.
4. If needed, change what you are assessing. If the institution has changed, you need to adjust.
5. Ask for help.  
6. Stay positive; negative talk poisons. One area’s deficit is not generalizable to the whole division or institution. Realize that many factors could have contributed to the “check-the-box” mentality: staff turnover, restructuring, and increased responsibilities. Encourage departments with a good handle on the process to mentor others, creating a community of support.