Best Practices Guides: Assessment is a process, not a product.


By Dr. Alyce Odasso, Program Coordinator of Assessment
Office of Institutional Effectiveness and Evaluation, Texas A&M University


Overview: At AEFIS, we realize that assessment functions are not one size fits all! Although stakeholder end goals may be similar, assessment protocols and processes differ across universities, colleges, and programs. The AEFIS Data Collection Solution and Customized Workflow Best Practices Guide offers insight into how workflows can be used to collect and report assessment data for many of your assessment processes, including, but not limited to, program assessment, assessment plans, core curriculum assessment, and self-study reports.

At Texas A&M University, the Data Collection solution is used to document the institution's annual assessment process. Based on the good work of our partners at the Texas A&M University Office of Institutional Effectiveness and Evaluation, here are some best practices and examples to support your implementation of automated assessment collection protocols that fit your unique needs.

Best Practices Guide:

  1. Use workflows to create a meaningful assessment cycle for your institution
    • Consider using workflows to formalize the assessment cycle. The workflow steps can be used to help institutionalize the assessment process from planning to reporting to closing the loop.
  2. Consider multiple rounds of feedback
    • If you’re using AEFIS Data Collection for reviewing assessment plans, consider providing feedback at different stages of the process (e.g., Plan and Report) and/or from key players at different levels (e.g., college assessment professionals, department heads, reviewers at the institutional level).
  3. Engage key players: Embed different roles in the workflow
    • Consider who should be involved in the assessment process. Construct a workflow that includes these key players. Clearly define the role they play and what the expectations are for their involvement in the assessment workflow.
  4. Allow for transparency
    • Through the use of workflows, AEFIS makes it possible for key players to see the evolution of an assessment report at all stages of the process, not just at the very end. Embedded feedback allows users to see the perspective of others involved in the process, making assessment a collaborative effort.
  5. Be flexible!
    • The assessment process is not necessarily linear. Assessment plans may change mid-cycle, or multiple iterations of feedback may be necessary. The flexible functionality of AEFIS allows users to address these issues effectively, including the ability to send forms backward through the workflow and to make fields editable at various workflow stages. Use these features to emphasize quality assessment practices.

