EMERGING DIALOGUES IN ASSESSMENT

Beyond Herding Cats: Building a Sustainable General Education Assessment Plan
March 25, 2026
Abstract

This article examines the development of a sustainable general education assessment plan through a faculty-led, inquiry-driven process. Drawing on the experience of a General Education Committee, the work illustrates how structured inquiry, collaborative revision, and iterative design can transform fragmented assessment efforts into a coherent and actionable framework. By organizing assessment across student, course, and program levels, and by grounding the process in guiding questions, the committee developed a dynamic system that supports both accountability and continuous improvement. The article highlights how the process of building the plan functioned as a form of assessment itself, surfacing institutional gaps, clarifying responsibilities, and informing administrative decision-making. This case demonstrates that assessment, when approached as collaborative sensemaking rather than compliance, can strengthen both academic practice and institutional structure.

Introduction

The phrase “herding cats” first entered my life years ago during a neurologist appointment. When he asked about my profession and I explained that I was an Education Director at Head Start, he laughed and replied, “So you herd cats.” I had never heard the expression before. Later, after discovering the now-famous video of cowboys attempting to herd cats across the prairie, I realized the metaphor captured something remarkably true about leading complex educational work. Years later, after being elected chair of the General Education Committee (GEC), the phrase returned immediately to mind. The experience that followed illustrates how a faculty-led committee can move from uncertainty to a sustainable assessment process through structured inquiry and collaborative problem solving. There was a great deal to accomplish. This was the year to finally focus on assessment of the “new” general education program. Implementation began in 2019, as did the assessment process.
Efforts were called to a halt by the provost in light of the rapid shift to remote work at the onset of the COVID-19 pandemic. Policies and an assessment timeline had been established, and artifact collection had started. However, there was no definitive assessment plan. Attempts to design a plan as a whole committee proved daunting and time-consuming. Developing a sustainable assessment plan required designing a format, guiding inquiry, and creating an overall document ready for committee discussion, collaborative revision, and approval.

Format Design

Developing a sustainable assessment plan required first establishing a clear and functional format to guide the work. Without an organizing structure, discussions remained fragmented, and the relationship among assessment components was difficult to see. The format served as both a conceptual and practical tool, allowing the committee to align questions, evidence, and responsibilities across student, course, and program levels in a coherent and actionable way.

Building a Corral

The GEC had previously spent a year establishing assessment practices, but in the absence of a plan, the larger picture remained unclear. Confusion was plentiful. It was evident that a plan was necessary. Yet the Faculty Senate had outlined 13 charges for the GEC; time was of the essence. Corralling began by gathering existing university general education assessment policies, which provided the foundation upon which to build. The framework needed to address assessment at three levels: student, course, and program. A table was created to organize the assessment framework. Column headings included the source of information, evidence to be collected, how the evidence demonstrates the outcome, methods of assessment, methods of collection, methods of analysis, required resources, and timeline expectations. Guiding questions were then drafted for each row across the three levels of assessment: student, course, and program.
Each cell was completed at the intersection of these rows and columns to clarify responsibilities and data sources. For example, the source for the question “How many class sections were offered in each component area?” was identified as administration.

Guiding Inquiry

An initial draft of the general education assessment plan table was developed by the committee chair and shared with the GEC for discussion and refinement. The committee then began evaluating and revising the guiding questions to better support meaningful assessment. At first, this process proved challenging. However, changing the order in which the levels of assessment were considered made a significant difference. Beginning with student-level questions felt too personal and evaluative, which slowed the discussion. Starting instead with the course level provided a more objective foundation for examining how learning outcomes were addressed in the curriculum. With this shift, the GEC gained momentum in reviewing and refining the questions. The process continued throughout the winter quarter, and by the end of the term, the committee had finalized assessment questions for each level:

- Course Level Questions
- Student Level Questions
- Program Level Questions (Framework)
- Program Level Questions (Learning Approaches & Higher Education Core Competencies)
Committee Discussion

Each week of the winter quarter, the assessment plan was a substantial agenda item. By the end of each meeting, there were more questions about assessment than answers about the plan itself. Most questions required research and organization of the information found and/or developed. The process of creating an assessment plan furthered inquiry and discussion of the plan’s format, the responsibilities of assessment, the timeline, the nature of assessment, the intent of assessment at each level, the relationship between instructors and reviewers, and what re-review of courses entailed. Simply through discussing the draft plan, the purpose and practice of assessment began to take clearer shape and hold deeper meaning, prompting recognition of how assessment can inform decision-making. Level by level, part by part, the general education assessment plan was discussed, revised, improved, and approved. Soon, there was an awareness that this was just the beginning; actually working the plan would bring the document to life. The GEC discovered the need for a re-review rubric. There also needed to be a systematic way to request assessment information from instructors by class and to collect that information in a repository accessible to the instructor and the GEC re-reviewers, easing the workload for all. Information was retrieved from the original proposal for each general education course to create a course response template. This approach saved instructor time while maintaining accuracy. Instructors add a rubric to their Canvas class to assess student performance. In addition, a Canvas “class” was created specifically for general education assessment. Instructor response templates were uploaded for each general education component area (see Figure 1). Each general education instructor for the courses to be assessed was added to the Canvas course as a “student.”
Instructors complete the blank areas of the template; the preloaded information saves time while maintaining academic accuracy.

Figure 1: General Education Instructor Assessment Feedback Form: Culminating Experience
Once instructors completed the template, they submitted it in Canvas as an assignment, then uploaded the class syllabus and related artifacts. Each GEC member was added to the course as a TA in order to conduct the re-review. The re-review rubric was added to the Canvas course as well (see Figure 2). The rubric scores were then entered as grades.

Figure 2: General Education Re-review Rubric
The re-review rubric has five levels pertaining to the integration of general education learner outcomes:

- Absent (from the syllabus)
- Approaching (partially present and/or aligned in the syllabus)
- Advancing (present and aligned in the syllabus)
- Accomplished (present, well aligned, and integrated with activities in the syllabus)
- Exemplary (present, well aligned, integrated with activities in the syllabus, with artifact(s) provided, and/or program goals addressed)

Conclusion

The GEC was successful in developing an assessment plan that included a clear format, guiding inquiry, and an overall document ready for continued committee discussion, collaborative revision, and approval. The result was a sustainable and dynamic assessment plan capable of guiding ongoing review of the general education program. The process of developing the plan also functioned as an assessment activity in itself, generating insights about how the program was organized and administered. Through this work, the GEC was able to better articulate several administrative inconsistencies within the general education program. For example, the absence of administrative oversight by a dean meant that certain program responsibilities had gone largely unattended. Identifying these gaps demonstrated how assessment data can illuminate structural issues and inform institutional decision-making. In response to these findings, the university created a position for a dean of undergraduate studies, giving the general education program a clear administrative home. This experience demonstrated the power of assessment not simply as a mechanism for evaluating student learning, but as a tool for strengthening institutional structures and supporting informed decision-making. When approached collaboratively, assessment can clarify responsibilities, highlight overlooked challenges, and guide meaningful program improvement.