Messiah University Does It Right!: Earning the Excellence in Assessment Designation

Messiah University’s Excellence in Assessment designation hardly seemed possible when NILOA first announced the program. Our assessment office came into existence ten years ago, coinciding with the institution’s preparation for a Middle States self-study. When we started, assessment was a “dirty word” for many of our faculty. In their minds, assessment was linked to faculty evaluation. Only a few of our majors had program-level learning outcomes, and our evidence of student learning was either course grades or outcomes such as job or graduate school placement.
Our first course of action was to work with academic departments to articulate program-level learning outcomes and develop assessment plans for our 90+ undergraduate majors and 12 graduate programs. This was no small feat, as our assessment office consisted of one faculty member with release time to serve as the director of assessment and an administrator with part-time responsibility to serve as the assistant director. Our mantra in those early years of developing assessment plans was “meaningful and manageable.” We wanted faculty to identify authentic measures of learning that provided meaningful information on student learning within their program. We also wanted the process to be manageable for everyone involved, since assessment was an add-on to existing job responsibilities.
We worked to clarify misconceptions and communicated success stories of programs with creative ideas for managing the new expectations regarding assessment. In addition to supporting department chairs with training and hands-on help, we worked with administration to implement structural support for assessment conversations and reporting.

We also created an assessment committee to provide faculty voice and direction. This body included faculty from each school, along with librarians, students, institutional research staff, and student affairs professionals. The committee provided feedback on how we communicated with the campus community and gave input on our assessment policies. Its first task was to create an assessment strategic plan to guide our assessment efforts, which our Community of Educators’ Senate discussed and approved.

Our strategic plan directs our efforts and lends legitimacy to our work. Because neither the director of assessment nor the assistant director has authority over department chairs and deans, we can leverage the strategic plan as the rationale for change. Another important task of the assessment committee was the creation and adoption of an institution-wide rubric to evaluate assessment plans and practices.

In our first few years, our communication with faculty focused on developing a common language for discussing student learning, communicating campus expectations for reporting on program-level learning outcomes, and sharing structural changes relating to assessment. One of those changes was to link program assessment to the curriculum committee. Curricular proposals to create or revise programs require an attached assessment plan scoring at least a two (out of four) on the institutional rubric before they can move on to the curriculum committee’s agenda. This gives departments an incentive to finish or improve plans beyond their initial attempt. Additionally, faculty serving on the curriculum committee see different examples of assessment plans and take ideas back to their departments.


A second structural change involved our 53 institution-wide learning objectives. Aggregating data to report in meaningful ways on 53 objectives was impossible, as most of the statements were aspirational rather than behavioral. Additionally, most of the 53 statements were double- or triple-barreled, overlapping with learning already expressed in other statements.


Our first proposal to revise the 53 institutional learning objectives crashed and burned. We learned a valuable lesson about the disparate ways academic disciplines create, articulate, and measure learning. We regrouped and revised, and our Community of Educators’ Senate approved condensing our 53 objectives into six institutional learning outcomes.
Another aspect of our structural change was adopting a distributed leadership model. We were learning the limits of what we could accomplish as a two-person office, with each of us carrying a part-time load for assessment and holding no supervisory role over the departments we were charged to assist. We moved the responsibility for oversight to the school deans, embedding assessment into their job descriptions and collaboratively building an annual assessment process, articulated in an institutional assessment manual.

Moving assessment oversight from the assessment office to the deans served multiple purposes:
1. Deans have legitimate authority over department chairs to influence assessment efforts. While we had access to initiate conversations about student learning, we lacked the power to mandate change.
2. The process drives detailed conversations between chairs and deans about how students experience the curriculum for each major. These discussions deepen the deans’ understanding of programs and encourage conversations focused on student learning rather than staffing and resources. With increased dean involvement, we were able to raise the acceptable performance on the assessment rubric to a “3” in each category, motivating departments to continue improving their assessment plans and practices.

In this revised model, the director and assistant director work collaboratively with deans and provide training and resources to departments. One recurring theme in assessment conversations was that our efforts didn’t yet feel meaningful or manageable. A review of syllabi confirmed our sense that departments needed help crafting course learning objectives that were specific, measurable, and achievable within the course duration. Additionally, our curriculum mapping had stalled once we linked program learning outcomes to the new institutional learning objectives.
As our culture changed regarding student learning, we kept encountering the limitations and frustrations of our previous assessment platform. The assessment committee crafted a rubric to evaluate different products on the market based on our needs. We wanted a product that integrated with our learning management system, helped accredited programs manage program accreditation, aggregated assessment data to program and institutional outcomes, and made it easier to get the data departments needed to do meaningful assessment. After evaluating multiple products, we selected AEFIS.

The decision to partner with AEFIS provided needed momentum to move forward. Setting up AEFIS required departments to:
1. Review their program outcomes and develop or update existing curriculum maps linking courses to program and institutional outcomes.
2. Secure dean support for a call to review and revise all course learning objectives so they were specific, measurable, and achievable within the course duration.

While our previous assessment platform functioned more as a repository of results, chairs needed to interact with AEFIS in a different way. Developing the links between course objectives and program outcomes encouraged a deeper examination of existing course objectives. We again relied on distributed leadership to encourage chairs to get faculty to link their assignments in our LMS to the program outcomes in AEFIS. We provided ongoing training sessions, presented at chairs’ meetings, and expanded the resources available on our website, all of which strengthened our communication efforts. This process encouraged departmental conversations about which assignments to use and facilitated conversations between chairs and adjunct faculty regarding learning assessment and the critical assignments embedded in their courses.

The Excellence in Assessment designation validates the challenging conversations and work of our campus over the past ten years. Messiah moved from a culture of grudging compliance to a culture focused on examining authentic student performance to drive conversations about improvements in student learning. Birthed during the preparation for our last self-study, the assessment office occupies a very different role as we approach our next accreditation self-study. Our role shifted from being voices in the wilderness charged with an impossible task to one of facilitator and encourager. Rather than dreading the work involved in a self-study, Messiah has meaningful data to support how we accomplish our institutional mission.
