EMERGING DIALOGUES IN ASSESSMENT

Beyond Herding Cats: Building a Sustainable General Education Assessment Plan 


March 25, 2026

  • Teresa Walker
    Professor and Department Chair (Education, Development, Teaching, and Learning)
    Central Washington University

Abstract

This article examines the development of a sustainable general education assessment plan through a faculty-led, inquiry-driven process. Drawing on the experience of a General Education Committee, the work illustrates how structured inquiry, collaborative revision, and iterative design can transform fragmented assessment efforts into a coherent and actionable framework. By organizing assessment across student, course, and program levels, and by grounding the process in guiding questions, the committee developed a dynamic system that supports both accountability and continuous improvement. The article highlights how the process of building the plan functioned as a form of assessment itself, surfacing institutional gaps, clarifying responsibilities, and informing administrative decision-making. This case demonstrates that assessment, when approached as collaborative sensemaking rather than compliance, can strengthen both academic practice and institutional structure.

Introduction

The phrase “herding cats” first entered my life years ago during a neurologist appointment. When he asked about my profession, and I explained that I was an Education Director at Head Start, he laughed and replied, “So you herd cats.” I had never heard the expression before. Later, after discovering the now-famous video of cowboys attempting to herd cats across the prairie, I realized the metaphor captured something remarkably true about leading complex educational work.

Years later, when I was elected chair of the General Education Committee (GEC), the phrase immediately returned to mind. The experience that followed illustrates how a faculty-led committee can move from uncertainty to a sustainable assessment process through structured inquiry and collaborative problem solving. There was a great deal to accomplish, and this was the year to finally focus on assessment of the “new” general education program. Implementation had begun in 2019, along with the assessment process, but the provost halted those efforts when the COVID-19 pandemic forced a rapid shift to remote work. Policies and an assessment timeline had been established, and artifact collection had started; however, there was no definitive assessment plan. Attempts to design a plan as a whole committee proved daunting and time-consuming. Developing a sustainable assessment plan required designing a format, guiding inquiry, and creating an overall document ready for committee discussion, collaborative revision, and approval.

Format Design

Developing a sustainable assessment plan required first establishing a clear and functional format to guide the work. Without an organizing structure, discussions remained fragmented, and the relationship among assessment components was difficult to see. The format served as both a conceptual and practical tool, allowing the committee to align questions, evidence, and responsibilities across student, course, and program levels in a coherent and actionable way.

Building a Corral

The GEC had previously spent a year establishing assessment practices, but in the absence of a plan, the larger picture remained unclear and confusion abounded. A plan was clearly necessary, yet the Faculty Senate had outlined 13 charges for the GEC, so time was of the essence. Corralling began by gathering existing university general education assessment policies, which provided the foundation on which to build. The framework needed to address assessment at three levels: student, course, and program. A table was created to organize the assessment framework, with column headings for the source of information, the evidence to be collected, how the evidence demonstrates the outcome, methods of assessment, methods of collection, methods of analysis, required resources, and timeline expectations. Guiding questions were then drafted for each row across the three levels of assessment. Each cell at the intersection of these rows and columns was completed to clarify responsibilities and data sources. For example, the source for the question “How many class sections were offered in each component area?” was identified as administration.

Guiding Inquiry

An initial draft of the general education assessment plan table was developed by the committee chair and shared with the GEC for discussion and refinement. The committee then began evaluating and revising the guiding questions to better support meaningful assessment. At first, this process proved challenging; however, changing the order in which the levels of assessment were considered made a significant difference. Beginning with student-level questions felt too personal and evaluative, which slowed the discussion. Starting instead with the course level provided a more objective foundation for examining how learning outcomes were addressed in the curriculum. With this shift, the GEC gained momentum in reviewing and refining the questions. The process continued throughout the winter quarter, and by the end of the term, the committee had finalized assessment questions for each level.

Course Level Questions

  • How many class sections were offered in each component area?
  • Where were classes offered by campus location?
  • How were classes offered by modality?
  • What classes were offered through College in the High School?
  • What days/times did the classes meet?
  • How many students were enrolled?
  • How many students did not earn a letter grade?
  • Are enrollment capacity limits being honored?
  • Who were the instructors by position title?
  • How were the GE component area learner outcomes aligned with course learner outcomes and activities?
  • How well did the GE component area learner outcomes appear to be addressed in classes?

Student Level Questions

  • Did individual students demonstrate attainment of each General Education learner outcome?
  • How many students met or exceeded the criteria for each component area learner outcome?
  • How do GE students perceive their GE experience?
  • What do students think about the GE program?

Program Level Questions (Framework)

  • How can GE assessment benefit departments?
  • Where are the General Education program goals being addressed through learner outcomes?
  • How are the General Education program goals being promoted through courses?
  • What does the faculty value in the assessment of GE?

Program Level Questions (Learning Approaches & Higher Education Core Competencies)

  • How is the General Education program promoting Liberal Arts?
  • How does the GE program promote liberal education?
  • How is the GE program aligned with AAC&U Value Rubrics?
  • What are the High Impact Practices incorporated by design within the General Education program?
  • How is the intent of High Impact Practices demonstrated?
  • What High Impact Practices are identified in syllabi?
  • Where is Signature Work created within the General Education program?
  • How does GE address core competencies?

Committee Discussion

Each week of the winter quarter, the assessment plan was a substantial agenda item, and by the end of each meeting there were more questions about assessment than answers about the plan itself. Most questions required research and then organization of the information found or developed. Creating the assessment plan furthered inquiry and discussion of the plan’s format, the responsibilities of assessment, the timeline, the nature and intent of assessment at each level, the relationship between instructors and reviewers, and what re-review of courses entailed. Simply through discussing the draft plan, the purpose and practice of assessment began to take clearer shape and hold deeper meaning, prompting recognition of how assessment can inform decision-making.

Level by level, part by part, the general education assessment plan was discussed, revised, improved, and approved. Soon, there was an awareness that this was just the beginning; actually working the plan would bring the document to life. The GEC discovered the need for a re-review rubric. There also needed to be a systematic way to request assessment information from instructors by class and to collect that information in a repository accessible to the instructor and the GEC re-reviewers to ease the workload for all.

Information was retrieved from the original proposal for each general education course to create a course response template; instructors complete only the blank areas, so the preloaded information saves time while maintaining accuracy. Instructors also add a rubric to their own Canvas class to assess student performance. In addition, a Canvas “class” was created specifically for general education assessment, and an instructor response template was uploaded for each general education component area (see Figure 1). Each instructor whose course was scheduled for assessment was added to the Canvas course as a “student.”

Figure 1: General Education Instructor Assessment Feedback Form: Culminating Experience

Course Information
  • Prefix & #: EDEC 432
  • Title: Theories in Child Development
  • Credits: 3
  • AY: 2025-26

Course Description: Assists the student in formulating his or her own general assumptions about the nature of child development through study of various theoretical viewpoints and current issues. (CE-4)

General Education Alignment Summary: This course meets the general education goals and philosophy by exploring critical thinking, reflection, and analysis of theorists that are the cornerstone of early childhood education.

Course Learner Outcomes with Assessments (#1) and Activities (#2); the GE Learner Outcomes column (copy/paste the entire learner outcome) is left blank for the instructor to complete:

  • CLO1. Identify competing theoretical perspectives of developmental influences, particularly as they relate to the influences of nature and nurture. Assessments/Activities: Weekly assignments, quizzes, Piaget Conservation
  • CLO2. Interpret theoretical information through research of various early childhood theories. Assessments/Activities: Weekly assignments, quizzes, and Reflection, recorded responses to text
  • CLO3. Identify theories within case studies. Assessments/Activities: Weekly assignments, quizzes, and Reflection, recorded responses to text
  • CLO4. Describe physical, cognitive, and socio-emotional development from conception through adolescence, citing relevant theory and research. Assessments/Activities: Final Project
  • CLO5. Discuss the role of genetics and hereditary factors (including maturation) in the cognitive, behavioral, and psychosocial development of children. Assessments/Activities: Final Project

#3 Suggested Considerations/Recommendations (left blank for the instructor to complete): Student Success; This Course; This Component Area; The General Education Program

Culminating Experience (CE) learner outcomes (to copy & paste above):

CE-1. Demonstrate clear communication strategies and techniques in oral, written, or expressive form.
CE-2. Apply higher-order critical thinking and/or problem-solving skills.
CE-3. Reflect upon, integrate, and apply the knowledge and skills they gleaned from their undergraduate experience, including General Education.
CE-4. Synthesize and present a response, propose a solution/answer, or showcase their own creative work.

Once instructors completed the template, they submitted it in Canvas as an assignment, then uploaded the class syllabus and related artifacts. Each GEC member was added to the course as a TA in order to conduct the re-review. The re-review rubric was also added to the Canvas course (see Figure 2), and the rubric scores were entered as grades.

Figure 2: General Education Re-review Rubric

Score 1: Absent Integration of GE Learner Outcomes
GE program component area learner outcomes are absent from the syllabus.
Recommendation: Continuation with Follow-up. GEC will provide support and follow-up to ensure progress toward integrating GE learner outcomes (per policy).

Score 2: Approaching Integration of GE Learner Outcomes
GE program component area learner outcomes are partially present and/or aligned in the syllabus.
Recommendation: Continuation with Support. GEC will offer support to more fully integrate GE learner outcomes.

Score 3: Advancing Integration of GE Learner Outcomes
GE program component area learner outcomes are present and aligned in the syllabus.
Recommendation: Continuation. GEC will suggest alignment with activities to fully integrate GE learner outcomes.

Score 4: Accomplished Integration of GE Learner Outcomes
GE program component area learner outcomes are present, well aligned, and integrated with activities in the syllabus.
Recommendation: Continuation. No specific action by the GEC expected.

Score 5: Exemplary Integration of GE Learner Outcomes
GE program component area learner outcomes are present, well aligned, and integrated with activities in the syllabus, with artifact(s) provided and/or program goals addressed.
Recommendation: Continuation. GEC may reference the course for examples and request permission from the instructor to share.

Recorded for each re-review: GE Component Area, GE Goals, College/Dept, Class Reviewed, Score, and Recommendation.

The re-review rubric has five levels pertaining to the integration of general education learner outcomes. They include: absent (from the syllabus); approaching (partially present and/or aligned in the syllabus); advancing (present and aligned in the syllabus); accomplished (present, well aligned, and integrated with activities in the syllabus); and exemplary (present, well aligned, integrated with activities in the syllabus, with artifact(s) provided, and/or program goals addressed).

Conclusion

The GEC was successful in developing an assessment plan that included a clear format, guiding inquiry, and an overall document ready for continued committee discussion, collaborative revision, and approval. The result was a sustainable and dynamic assessment plan capable of guiding ongoing review of the general education program. The process of developing the plan also functioned as an assessment activity in itself, generating insights about how the program was organized and administered.

Through this work, the GEC was able to better articulate several administrative inconsistencies within the general education program. For example, the absence of administrative oversight by a dean meant that certain program responsibilities had gone largely unattended. Identifying these gaps demonstrated how assessment data can illuminate structural issues and inform institutional decision-making. In response to these findings, the university created a position for a dean of undergraduate studies, giving the general education program a clear administrative home.

This experience demonstrated the power of assessment not simply as a mechanism for evaluating student learning, but as a tool for strengthening institutional structures and supporting informed decision-making. When approached collaboratively, assessment can clarify responsibilities, highlight overlooked challenges, and guide meaningful program improvement.