EMERGING DIALOGUES IN ASSESSMENT

Chemists’ Assessment Experiment: A Success Story in Faculty Professional Development to Enhance Student Achievement in STEM

May 8, 2025

  • Amy J. Heston, Ph.D., Leader, Center of Academic and Professional Enrichment Academic Excellence Pillar; Chair, University Program Assessment Committee; Professor of Inorganic Chemistry, Walsh University
  • Neil G. Walsh, Ph.D., Chair, Division of Mathematics and Sciences; Associate Professor of Chemistry, Walsh University

 

Assessment of student learning is a critical element of higher education accreditation. It provides valuable insights into student achievement of learning outcomes, the effectiveness of programs and experiences, and institutional performance. However, assessment can be intimidating for faculty members, particularly when there is no clear vision of how it affects the overall student body. This article describes a successful faculty training initiative designed by two chemists at Walsh University, a mid-sized liberal arts institution. With a vision toward continuous improvement, this collaboration addresses the challenges associated with assessment data collection and analysis in STEM programs. Moreover, this partnership emphasizes how professional development in assessment can make a positive impact on learner achievement and student success.

Our Challenge

The STEM disciplines associated with this work were biology, chemistry, and computer science. Faculty members in these programs faced several challenges related to assessment. Professors were often overwhelmed by the complexity of the assessment process and unsure of how to effectively collect, analyze, and interpret data. This lack of clarity led to inconsistencies in assessment practices and hindered the programs’ ability to make data-driven decisions. Other research groups have noted similar challenges and explored options for faculty professional development initiatives (Biswas et al., 2022). To improve this situation, two chemistry faculty members initiated a collaborative effort to address these issues. They designed a training that accounted for faculty members’ varying comfort levels and discipline-specific needs. The success of this process aligned with the work of other research groups that have reported advances in ensuring fairness in STEM practices (Lucietto et al., 2024). Moreover, this project revealed opportunities for faculty to align their practices with high-impact practices (HIPs) (Kuh, 2008); some HIPs assessed within this initiative are described later in this paper. These efforts were critical to the success of our initiative, the chemists’ “assessment experiment.”

Chemists’ Solution: A Two-Part Faculty Professional Development

To address these challenges, the two chemists envisioned and designed a two-part faculty training initiative. The first part of the training focused on providing faculty with a clear understanding of the purpose and importance of assessment in STEM. Through the lens of a scientist, participants were introduced to the concept of Integrated Institutional Effectiveness (IIE) and how assessment can contribute to achieving institutional goals (Hoshaw et al., 2021). The training also emphasized the connection between assessment and improving student success.

The second part of the training provided faculty with a step-by-step procedure for completing the annual program assessment report. This procedure broke the process down into manageable tasks, such as identifying assessment methods, collecting data, analyzing results, and developing action plans. Diving deeper into assessment practice, the chemists explained how these efforts were essential for upcoming reports, particularly the annual program reviews for the science programs. In addition, the training highlighted examples of effective assessment practices in biology, chemistry, and computer science, as well as strategies for overcoming common challenges. Past examples served as guides for the upcoming year while allowing for faculty voice and choice in selecting assessment methods. Moreover, the previously established assessment plan served as an “assessment roadmap” that guided faculty through the current assessment cycle and clarified which program student learning outcomes were to be evaluated that year. Updating the previously designed project site on the learning management system was another way to streamline the process for faculty; the site was refreshed to include updated evaluation rubrics and assessment plans. The Dropbox provided a personal archive of course-level assessment data while also telling an “assessment story.” Collectively, this information was key to identifying learning trends within biology, chemistry, and computer science.

Results and Impact

The faculty training initiative had a significant impact on the science programs’ assessment practices. Faculty members reported a greater understanding of assessment and its role in improving student learning. Additionally, they expressed increased confidence in their ability to complete the annual program assessment report and a clearer sense of its importance to divisional goals. As a result, the quality and consistency of assessment data improved, allowing the biology, chemistry, and computer science programs to make more informed decisions about curriculum, instruction, and strategies for student support.

Broader View for Advancing Assessment

This faculty training initiative has several implications for the broader conversation on student learning assessment. First, it demonstrates the importance of providing faculty with science-specific guidance and supporting effective implementation of assessment practices. By breaking complex tasks down into manageable steps, the professional development initiative helped reduce faculty intimidation toward assessment methods, which in turn led to increased faculty confidence and efficiency. Second, the initiative highlights the need for institutions to prioritize faculty development in assessment. By investing in STEM-specific training programs, institutions can ensure that science faculty members, in particular, have the skills and knowledge necessary to conduct effective assessments. Collectively, this initiative aligned with research studies reporting the direct connection of assessment, evaluation, and research to evidence-based improvement in support of institutional effectiveness (Hoshaw et al., 2021).

Promoting Cultural Humility in Assessment

Analyzing the training in a different way, the "assessment experiment" allowed for cultural humility in assessment strategies. By recognizing the varied backgrounds and experiences of students, faculty members were able to develop assessments designed for all learners. For example, our institution created multiple opportunities to establish close relationships with students, such as small class sizes (fewer than about 20 students), undergraduate research, two-year Honors projects, collaborative learning, and club activities. Through these close relationships, faculty tailored learning experiences and created assessments that would be effective for all students. This strategy, along with a faculty focus on continuous improvement each year, supported fairness in assessment. As a result, the assessment strategies were implemented across all the divisional disciplines (biology, chemistry, and computer science), a multidisciplinary approach that aligns with work done by other higher education professionals (Lucietto et al., 2024). Additionally, the assessment training equipped faculty with new insights and strategies that reduced bias; STEM, in particular, has been identified as a field in which cultural competency and fairness need strengthening (Lucietto et al., 2024). More specifically, the stepwise procedure and checklist ensured that assessment practices were fair and unbiased. By utilizing the division’s 3-point assessment rubric, approaches to learner evaluation were consistent and uniform across biology, chemistry, and computer science. Fairness in the assessment process across multidisciplinary faculty was further ensured through similar data collection procedures, an identical evaluation scale, assessment of measurable outcomes written with Bloom’s action verbs, and recording of data on the annual program assessment report template. One hidden gem of this assessment protocol was the alignment of cultural humility with the assessment process: by viewing assessment through a cultural humility lens, faculty in the Division of Mathematics and Science at Walsh University completed evaluations in a unified manner while retaining flexibility in assessment methods to meet the needs of their students.

Assessment of High-Impact Practices (HIPs)

By streamlining the assessment process and concentrating on targeted outcomes, faculty gained a deeper understanding of student learning in HIPs such as collaborative projects and undergraduate research (Kuh, 2008). In addition, this approach aligned with institutional goals to include HIPs in campus-wide program assessment, thereby connecting to goals within the Division of Mathematics and Science. Examples included evaluating written communication in Honors theses, demonstrating appropriate laboratory techniques in undergraduate research, and collaborating effectively in group activities in both classroom and laboratory settings.

Practical Implications and Implementation Strategies

The findings from this case study suggest several practical implications for readers interested in improving their own assessment practices, and these approaches can be applied to any discipline in higher education. First, readers can consider developing a similar faculty training program to enhance their colleagues’ knowledge and application of assessment. Second, a step-by-step procedure can be created to serve as a faculty guide for completing assessment reports, including action items. Next, instructors can explore ways to incorporate cultural humility into their assessment practices at the course and program levels. Finally, the importance of aligning assessment with HIPs can be shared and incorporated in the current academic year.

Conclusion

The faculty training initiative described in this article represents a valuable contribution to the field of higher education assessment. By streamlining the assessment process and promoting cultural humility, the initiative has improved the quality of student learning and enhanced the overall effectiveness of the STEM programs at Walsh University. As institutions continue to seek ways to improve their assessment practices, the lessons learned from this case study can provide practical insights and encouragement for implementation.

Future Work

To further build upon the success of the "assessment experiment," the assessment coordinator will continue to track the impact of the action items proposed by faculty members. By analyzing the data collected through these initiatives, the coordinator can assess the effectiveness of the faculty training program and identify further areas of improvement.

Additionally, the coordinator will promote more opportunities for collaborative work in assessment strategies, including the assessment of high-impact practices such as undergraduate research, collaborative assignments, internships, and common intellectual experiences. By working together, faculty members can develop more comprehensive and effective assessment reports.


References

Biswas, S., Benabentos, R., Brewe, E., Potvin, G., Edward, J., Kravec, M., & Kramer, L. (2022). Institutionalizing evidence-based STEM reform through faculty professional development and support structures. International Journal of STEM Education, 9(1), 36. 

Hoshaw, J., Isaacson, E. M., Grabau, A., Wilkinson, R., Di Genova, L., Schramm-Possinger, M., Santilli, N. R., Daugherty, K. K., & Ben-Avie, M. (2021). Integrated planning: The “Difference That Makes a Difference” in institutional effectiveness over time. Intersection: A Journal at the Intersection of Assessment and Learning, 2(3). 

Kuh, G. D. (2008). High-impact educational practices: What they are, who has access to them, and why they matter. AAC&U.

Lucietto, A., & Ma, T. (2024). Cross-cultural communication across STEM disciplines. Frontiers in Education, 9. https://doi.org/10.3389/feduc.2024.1373558