EMERGING DIALOGUES IN ASSESSMENT

Innovative Approaches to Programmatic Assessment in an Era of Flux
July 31, 2020

This article will be part of an AALHE Publications Special Edition, Assessment: Perseverance During a Pandemic, to be released in August.

Leaders of higher education need practical solutions for engaging in program improvement, providing student support, and fortifying partnerships with collaborators. Compounding these needs are the heightened demands placed on faculty to advise students online, reconfigure in-person courses into hybrid and online models, and attend to their students' emergent psychological, academic, and social needs, even as they themselves navigate a new set of personal stressors associated with COVID-19. Making matters worse, many of these changes have been, and will continue to be, unpredictable; a shift from in-person to hybrid instruction can occur in less than a week. Members of the academy face one paradoxical certainty: we are in an era of flux that is unlikely to change anytime soon.

At Winthrop University, directors of teacher education programs are charged with writing and submitting a "Continuous Improvement Plan" (CIP) report at the end of each academic year. The practice is intended to ensure that program faculty continuously engage in data-informed improvement. As the director of a physical education teacher education (PETE) program at Winthrop University, I lead the process of collecting data for the CIP. The co-author of this paper is a Senior Research Associate in the Department of Accreditation, Accountability, and Academic Services (AAAS) at Winthrop, who updates faculty on accreditation standards, assessment methods, and corresponding practices so that they are supported in conducting meaningful programmatic evaluation to inform their strategies for continuous improvement. My colleague in AAAS and I work as a team in our respective roles to share ideas, rethink evaluation methodologies, and evaluate results so they can be translated into practice.
Until recently, most of this collaboration and data collection occurred during face-to-face faculty meetings, partnership committee gatherings, advising visits, and on-campus office hours. However, in-person communication was, and will continue to be, challenging to conduct because of COVID-19. Despite this, meaningful assessment and the practices used to support it can be maintained virtually without feats of herculean strength.

Pre-COVID-19 Data Collection

Prior to campus closures, my colleagues and I held meetings with community partners and students that enabled us to gather important feedback for the PETE program's CIP. Specifically, we held a Partnership Advisory Council (PAC) meeting, attended by our community stakeholders and the PETE faculty members. These stakeholders include alumni and community partners who serve as mentors to our interns and practicum teacher candidates. Additionally, we led Physical Education Majors (PEM) Club meetings throughout the semester, which allowed faculty to show students that we care for and are invested in their academic and social well-being. The club also helps our PETE majors foster and maintain social networks. I had planned to request feedback from students on how to improve the club and the PETE program during our final face-to-face PEM meeting. Then the administration announced that nearly all university functions, including the PAC meetings, would transition to remote delivery for the remainder of the semester. Truthfully, I was uncertain whether remote meetings with stakeholders and students would be effective in maintaining our accreditation requirements. Many of our colleagues expressed their uncertainties, posing questions such as, "What will we tell the accreditor if we can't collect the data?" Fortunately, we found practical solutions.
Use Available Resources in Your Institution for Accreditation and Goal Achievement

As noted above, program directors are supported in the assessment process by the Office of Accreditation, Accountability, and Academic Services (AAAS). Thus, I sought advice from the co-author of this paper, who works in AAAS, on how to collect data for my CIP. She recommended creating short surveys containing both open-ended and scaled questions, which could be administered remotely to all stakeholders and students. This approach allowed me to collect targeted quantitative and qualitative data, whereas the face-to-face approach had provided only qualitative data. My colleague in AAAS also guided me in meaningfully using data obtained from stakeholder surveys and meeting minutes as artifacts to inform our assessments for multiple accreditation reports, without having to spend any more time than we would have under normal circumstances. These processes are congruent with CAEP Component 5.5, which states that stakeholders and community partners "are involved in program evaluation, improvement, and identification of models of excellence" (CAEP, 2020, p. 1). They are also relevant to our Specialized Professional Association (SPA) accreditation report, primarily Section 5, which invites faculty to discuss their interpretations of data findings and their overall approach to program betterment.

We used our institution's Qualtrics system to administer surveys to members of the PAC and the PEM Club. The PAC survey asked stakeholders to rate, on a five-point scale, their level of agreement with statements regarding the teacher candidates' and the PETE program's performance in multiple areas. The PEM Club survey's statements pertained to teacher candidates' level of agreement that the PETE program was invested in their learning, effective at helping them establish social connections, and supportive of them staying within the major.
Both measures invited respondents to provide open-ended feedback and suggestions. We developed a report of survey respondents' feedback to serve as an artifact for the CIP and to further promote an evidence-based approach to program improvement.

Making Stakeholder Meetings Engaging

The PEM Club still met, virtually. During these meetings, PETE faculty and students competed to answer Jeopardy-style specialization questions. We also conducted informal summer and fall advisement and held an open-topic discussion time. Open discussions allowed faculty and students' peers to better understand and support one another and to appreciate each student's unique context. These discussions not only advanced our CIP for regional accreditation but are also consonant with Council for the Accreditation of Educator Preparation (CAEP) Standard 3, Component 3.1, which states, "The provider presents plans and goals to recruit and support completion of high-quality candidates from a broad range of backgrounds and diverse populations to accomplish their mission" (CAEP, 2020, p. 1).

Prior to hosting the virtual PAC meeting, the PETE faculty analyzed the findings from the PAC survey. Based on these findings, the PETE faculty developed proposals to address areas within the PETE program that could be improved. These proposals included, but were not limited to, developing a new course, adding to existing course content, and providing PETE teacher candidates with targeted professional development opportunities. The PETE faculty then held a follow-up meeting online to present plans for improvement to PAC members and to collect their feedback on our proposals.
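For readers building similar instruments, scaled responses like these lend themselves to simple quantitative summaries before faculty review. The sketch below is a minimal, illustrative example only: the statement labels, ratings, and the review threshold are invented for demonstration and are not actual PAC or PEM Club data. It computes the mean agreement rating for each survey statement and flags items that fall below a chosen threshold.

```python
from statistics import mean

# Hypothetical five-point Likert responses (1 = strongly disagree,
# 5 = strongly agree) keyed by survey statement. These labels and
# values are illustrative, not actual program data.
responses = {
    "Candidates are well prepared for field placements": [4, 5, 3, 4, 5],
    "The program communicates effectively with partners": [3, 4, 4, 2, 4],
    "Candidates demonstrate professional responsibility": [5, 4, 4, 5, 3],
}

# Flag any statement whose mean agreement falls below the threshold
# so faculty can prioritize it when drafting improvement proposals.
THRESHOLD = 4.0
for statement, ratings in responses.items():
    avg = mean(ratings)
    flag = "review" if avg < THRESHOLD else "ok"
    print(f"{avg:.1f}  {flag:6}  {statement}")
```

A summary like this can serve directly as a quantitative artifact alongside the open-ended feedback report, since each flagged item maps to a concrete candidate for an improvement proposal.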
Take Action to Achieve Program Goals and Maintain Accreditation

The surveys we developed in response to the new environment of remote collaboration provided invaluable data, which will inform programmatic improvements, meet our accreditation needs, and advance the relationship building that is foundational to our program. One improvement informed by feedback from the PAC meeting with stakeholders was the creation of a new course, "Professionalism in Physical Education," which is devoted solely to teacher candidates' professional development. The PAC survey feedback indicated that teacher candidates would benefit from more time in the field to further their professional development. This suggestion was in reference to SHAPE America Standard 6, "Professional Responsibility." As part of the course requirements, teacher candidates will spend time serving the community by assuming leadership roles in local education organizations. Furthermore, we will pilot a new key assessment within this course in the fall and use the data for future SPA reporting.

Final Thoughts

Throughout this process we discovered that the sudden disappearance of face-to-face meetings did not interrupt our programmatic improvement. On the contrary, we strengthened our approach to improvement by increasing internal scholarly collaboration and implementing technological resources, which provided tangible data to drive our decision-making. In doing so, we have also been able to strengthen our relationships with students and stakeholders. We care about students, want to improve our programs, and want to engage in the kinds of reflection and analysis that led us into higher education. Continual improvement can be accomplished without excessive stress, even in a state of flux.

References

Council for the Accreditation of Educator Preparation. (2020, May 28). Standard 5: Provider quality, continuous improvement, and capacity. caepnet.org.
http://caepnet.org/standards/standard-5

SHAPE America - Society for Health and Physical Educators. (2017). 2017 national standards for initial physical education teacher education. SHAPEAmerica.org. https://www.shapeamerica.org/accreditation/upload/2017-SHAPE-America-Initial-PETE-Standards-and-Components.pdf