Session Descriptions

Jump to:

Pre-Recorded Sessions
Session Posters
Day 1 Sessions
Day 2 Sessions

PRE-RECORDED SESSIONS:

The State of Equity in Learning Outcomes: Results from a National Survey of Assessment Leaders

Kelly Long, Assistant Professor, University of North Georgia
Katlynn Fisher, Title III Data Analyst, Harrisburg Area Community College
Rachel Lathrop, Director of Assessment, Accreditation & Quality, Oakland Community College
Despite years of guidance on equity-centered assessment, little national data exists on whether higher education institutions are putting those principles into practice. This session presents findings from a recent national survey of 100 assessment leaders at U.S. public institutions. More than half of assessment leaders (56%) view DEI as very important, yet these leaders estimated that just 20% of faculty and 35% of administrators would agree. Equity-focused learning outcomes and assessment methods were reported as infrequent, as were best practices for interpreting and disseminating assessment data. About a quarter of respondents raised concerns about voices being marginalized during the assessment process, fewer than half reported seeking student input on learning outcomes, and half reported some DEI rollbacks. Together, these findings suggest that realizing the benefits of DEI in assessment remains tenuous.

From Principle to Practice: A Thematic Analysis of Strategies for Culturally-Responsive and Equitable Assessment

Bhargav Kaushik, Graduate Research Assistant, Assessment and Curriculum Support Center, University of Hawaiʻi at Mānoa
Yao Hill, Specialist (Professor-Rank faculty), University of Hawaiʻi at Mānoa

Culturally Responsive and Equity-Minded Assessment (CEA) practices directly support the University of Hawai‘i at Mānoa’s mission and strategic plan. To understand current practice and identify areas for growth, we collected data from approximately 240 academic degree programs on their implementation of ten recommended CEA principles. This process yielded more than 400 strategies described by 153 programs. This presentation will share the process and findings from the thematic analysis of these strategies and illustrate how the results informed a campus-wide capacity-building agenda, including showcase events, practical guides, and professional development activities, to advance CEA practices across UH Mānoa.

Advancing Equity in Student Learning Assessment: Culturally Responsive Practices for Indigenous Student Success

Jessica L. W. Miranda, Executive Director of Strategic Directions, Assessment, and Accreditation, University of Hawaiʻi–West Oʻahu
This session examines culturally responsive student learning assessment practices grounded in Indigenous ways of knowing, drawing on work conducted at two institutions serving Indigenous communities in Hawai‘i and Brazil. Through examples such as oral and narrative demonstrations, multilingual expression, land- and place-based assignments, and community-engaged projects, the presentation illustrates how culturally aligned assessment practices can strengthen equity, belonging, and student success. Participants will learn cross-cutting themes that emerged across the two contexts and gain practical guidance for designing assessments that honor epistemological diversity. This session is intended for assessment professionals seeking concrete, culturally grounded strategies to advance equity in their own institutions.

Constructing a Science-Based Framework of Personal Responsibility: Insights From a Scoping Review

Megan Lochhead, The University of British Columbia
Personal responsibility appears frequently in institutional outcomes, university mission statements, and degree-level standards, yet its understanding is inconsistent and at times ambiguous. This session shares findings from a scoping literature review that systematically examined how personal responsibility has been defined, conceptualized, and measured across postsecondary contexts. The review revealed conceptual tensions, overlapping constructs, and divergent measurement practices. By presenting a conceptual framework, this session provides attendees with actionable insights to support the development and assessment of personal responsibility within their own institutional contexts.

AI Agents Can Now Navigate and Complete LMS Tasks: A Call for Pedagogical Innovation

Stavros Hadjisolomou, Associate Professor of Psychology, American University of Kuwait
A new category of AI tools, known as AI agents or agentic browsers, can now autonomously navigate learning management systems, read course materials, complete quizzes, and submit assignments without student involvement. This paper presents findings from a live demonstration in which an AI agent completed a 10-question Moodle quiz by reading an uploaded PDF textbook and submitting answers in under 15 minutes. The implications for assessment validity are significant: if students can delegate entire assignments to autonomous software, traditional measures of learning become unreliable. Drawing on recent statements from the Modern Language Association and coverage in Forbes and The Verge, this session examines the emerging discourse on agentic AI in education. Attendees will leave with four design principles for creating assessments that require human presence and cannot be completed by autonomous agents.

Harnessing Artificial Intelligence to Connect Institutional Evidence to Self-Studies

Christopher Drue, Associate Director, Rutgers, the State University of New Jersey
Michele Moser Deegan, Associate VP for Assessment and Accreditation, Rutgers, the State University of New Jersey
Institutions preparing accreditation Self-Studies must integrate mission, goals, and evidence into a coherent narrative, a process that encourages development of insights into organizational practice. In this research project, a vendor-provided AI tool was tested to scan an evidence inventory to determine how well working groups integrated evidence into the Self-Study. Expert reviewers then judged the AI’s recommendations, revealing both useful additions and clear limitations that required human oversight. The findings point to AI as a possible support mechanism in accreditation workflows, but only when paired with strong validation practices and careful integration into institutional processes.

Got Skills? What are Higher Education Assessment Employers Looking For in an Applicant?

Jennifer Morrow, Associate Professor, University of Tennessee
Nicole Jones, Assessment & Accreditation Coordinator, University of Georgia

Andrew Young, Director of Assessment - College of Pharmacy, East Tennessee State University
Samantha Dedrick, Undergraduate Student, University of Tennessee
For this presentation, I will summarize findings from a project in which we conducted a content analysis of one year's worth of Higher Education Assessment (HEA) job ads. I will describe the purpose of the project, the process we used to select the job ads, the steps of our content analysis, a summary of the results, and suggestions for both instructors of emerging assessment professionals and those considering applying for a job in this field. Audience members will walk away with a better understanding of the skills, competencies, and dispositions needed to be successful in this field.

Scaling up, Stressing Less: A Peer-Feedback Model for Assessment Capacity Building

Yao Hill, Specialist (Professor-Rank faculty), University of Hawaiʻi at Mānoa
Justin Walguarnery, Assistant Professor, University of Hawaiʻi at Mānoa

Alice Tse, Professor, University of Hawai‘i at Mānoa
Lia O'Neill M.A. Keawe, Associate Professor, University of Hawaiʻi at Mānoa
The University of Hawai‘i at Mānoa implemented a scalable, faculty-led meta-assessment and peer-feedback model to address workload challenges and improve the consistency of program assessment feedback. Forty trained reviewers used a culturally grounded rubric and a structured process to provide formative, strengths-oriented feedback on about 150 assessment reports, supported by training and examples. This session shares the design, tools, and outcomes of this model and offers practical strategies for building assessment capacity on campus.

Assessing Cross-Disciplinary General Education Outcomes Using Canvas

Colleen Flewelling, Associate Dean of Academic Assessment and Development, Cecil College
Julie Eller, Instructional Design and Technology Coordinator, Cecil College

In this session, Cecil College assessment and instructional technology staff will discuss our solution for challenges faced in collecting data and reporting on cross-disciplinary general education learning assessment. Tools in our new learning management system, Canvas, allowed us to develop a system that balanced standardization with flexibility and ease of use for faculty, while providing a data set that we could use for disaggregating and reporting on summary data. We will discuss working with faculty to develop shared assessment tools, implementing the tools in Canvas, and cleaning and analyzing the data to produce campus-wide assessment reports with disaggregated data.


SESSION POSTERS: 

A Replicable Approach to Teamwork Assessment Using Rubrics and Low-Cost Digital Tools for Practice-Based and Resource-Constrained Contexts

Janet Esquirol, Associate Professor, City University of New York (CUNY), Borough of Manhattan Community College (BMCC)
This poster presents a practical model for assessing teamwork using a redesigned rubric, low-cost digital tools, and a scalable workflow that supports multi-section and multi-program implementation. The project began with the revision of a teamwork learning outcome and the creation of a rubric emphasizing transparency, equity, and observable behaviors. A digital workflow was built using widely available online forms and dashboards to streamline data collection, support faculty calibration, and generate visual summaries for program-level analysis. The poster highlights lessons learned, sample dashboard views, and strategies for adapting the model across disciplines and institutional contexts. Attendees will gain a replicable structure for improving teamwork assessment without requiring specialized software or significant resources.

Scoping Review of Student Course Evaluations on DEI Curriculum: Issues and Successes Across Graduate Health Professions

Faris Fazal, Medical Student, Oregon Health & Science University 
Constance Tucker, Vice Provost of Education/Innovation, Oregon Health and Science University
This session presents findings from a scoping review exploring how student course evaluation feedback is used to inform DEI-related curricular change in graduate-level healthcare education. A total of 735 records were identified across PubMed (318), Scopus (318), and Ovid (99), with 17 studies ultimately meeting inclusion criteria. These studies spanned medical, nursing, physician assistant, and other graduate health professions programs. The review highlights how student feedback, particularly related to diversity, equity, and inclusion, is often underutilized in curricular assessment. Key themes include growing efforts to formalize DEI-focused feedback tools, inconsistency in how institutions respond to student input, and the role of students in pushing forward a more inclusive, equity-centered curriculum. Findings will support the development of a student-centered DEI curriculum evaluation framework at Oregon Health and Science University (OHSU) School of Medicine.

Building Community, Finding Purpose: One Member’s Journey Through AALHE

Bobbijo Grillo Pinnelli, Associate Dean of Assessment, Walden University
This asynchronous poster session explores one member’s evolving journey through the Association for the Assessment of Learning in Higher Education (AALHE) and how meaningful connections, collaboration, and innovative opportunities helped to shape her professional identity. Through key milestones—including early engagement, cross-institutional partnerships, mentoring relationships, and leadership opportunities—the poster session illustrates how AALHE fosters personal growth and professional development. Attendees will gain insight into how community-driven interactions within AALHE can spark collaboration, strengthen assessment practice, and create pathways for continued learning and leadership across the field.

Using TFAM Sessions with Pre-service Teachers to Improve Classroom Assessment Practices

Lance Piantaggini, Ph.D. Candidate, University of Massachusetts Amherst
This poster reports on a study introducing a novel framing of classroom assessment for 27 graduate students enrolled at a large New England university. Using a quasi-experimental design with a non-equivalent comparison group, initial findings show increased understanding and use of formative assessment. The goal of this poster is for attendees to walk away able to implement TFAM sessions used to assess pre-service teachers' understanding and use of classroom assessment.

PowerBI for Program Assessment: Overcoming Constraints, Enhancing Insights

Kristoffer Rees, Senior Assessment Manager, Indiana University Online
This asynchronous poster presentation provides an overview of how PowerBI can be used to generate insights from multiple semesters of program-level assessment outcomes using datasets generated from existing workflows. In addition to facilitating long-term tracking of assessment outcomes, this approach facilitates dashboards that provide snapshots of assessment trends at the program, course, and assignment levels. The primary goal is for session attendees to leave with a clear idea of how PowerBI could be used to not only reveal new insights, but also to drive faculty and other stakeholder engagement in pursuit of continuous improvement of decentralized academic programs.


Day 1 Sessions (June 9th):

10:00 AM EDT/ 9:00 AM CDT/ 7:00 AM PDT
EARLY BIRD NETWORKING: Coffee Session

Start your conference day with connection and conversation! Join us for a lively Morning Coffee Networking Hour designed specifically for our online attendees. Grab your favorite morning beverage, turn on your camera (if you’re comfortable), and jump into a series of short, guided conversations that make it easy to connect. This interactive session features light, structured speed-networking rounds to help you quickly meet fellow professionals, exchange ideas, and expand your network, all in a low-pressure, energizing environment.

11:00 AM EDT/ 10:00 AM CDT/ 8:00 AM PDT
OPENING CONFERENCE PRESENTATION: AALHE President

Fiona Chrystall, President, Association for the Assessment of Learning in Higher Education
Kick off the conference with a warm welcome from the President of the organization as we gather to begin our shared exploration of this year’s theme: Assessment in Action: Real Work, Real Community. This opening session will set the tone for the conference by highlighting the essential role assessment plays in strengthening student learning, improving institutional practices, and building meaningful professional connections. Additionally, the session will briefly review the key navigation points of the Whova platform.

12:00 PM EDT/ 11:00 AM CDT/ 9:00 AM PDT
NETWORKING OPTIONAL: Coffee Break

Take this opportunity to step away and let your eyes rest! A general networking room will be open for those who want to stay engaged. 

12:15 PM EDT/ 11:15 AM CDT/ 9:15 AM PDT
CONCURRENT SESSION #1:

Building a Specialized Accreditation Assistant Custom GPT 

Joshua Morrison, Director of Academic Retention Programs, University of Indianapolis
This session focuses on developing a custom GPT (e.g. Google Gem, OpenAI's ChatGPT Custom GPT) for the purposes of assisting in preparing specialized accreditation reports and documentation. By developing custom GPT tools, the process to prepare documentation for specialized program accreditors, such as ACOTE (Occupational Therapy) or AACSB (Business) can be accelerated and improved. Attendees will develop enhanced skills in creating custom GPTs through the development of external instruction files, specialized knowledge files, and related documentation to ensure the GPT functions optimally, and provides assistance to those charged with maintaining institutional program-level accreditation.

  • Target Audience: All Levels

Do the Opposite: The George Costanza Approach to Overcoming Barriers and Changing Assessment Culture

Meg Joseph, Associate Director, Student Learning Assessment, Fashion Institute of Technology, SUNY
When sad-sack George Costanza of the TV series Seinfeld decided to do the opposite of what he would normally do at the end of Season Five, his whole life changed for the better. In this session, the convener will share how “doing the opposite” is displacing entrenched structural, political, and philosophical barriers and positively transforming the assessment culture at her mid-sized college. Through active reflection and discussion, new and experienced participants will have the opportunity to consider established yet ineffective assessment practices at their own colleges, envision antithetical actions that could promote positive cultural change, and develop tools of cooperation to facilitate change implementation. The goal is for everyone to walk away with proven yet adaptable frameworks for systematically implementing this pragmatic approach in their own assessment environments.

  • Target Audience: All Levels

Slice, Bin, Drill Down: Applying Critical, Reflexive Strategies to Data Visualization in Assessment

Melissa Ko, Assessment & Curriculum Design Specialist, UC Berkeley
Assessment data rarely get to “speak for themselves.” The way we process and visualize learning outcomes assessment data shapes the stories our audiences construct about student learning experiences. This interactive workshop encourages participants to grow their data literacy by examining how choices in data visualization influence the interpretation of assessment results. Most importantly, these choices can impact transparency, trust, and equity-mindedness among instructors and leaders. To illustrate these principles, participants will compare diverse visualizations (e.g., bar plots, box plots, line graphs, pie charts) of the same underlying data, realistically modeled as a sample of rubric-scored student work across multiple courses in a curriculum. Through guided small-group discussion, participants will identify the values and assumptions embedded in these visualizations and the tradeoffs of each design choice. This session invites participants to reflect on how the way they visualize data may foreground particular narratives while obscuring others.

  • Target Audience: All Levels

Reliable, Realistic, Repeatable (3R) Model for General Education Assessment Building

Yan Cooksey, Director of Assessment, Southern Methodist University
General education assessment often produces more frustration than improvement: too many artifacts, unclear rubrics, inconsistent scoring, and limited evidence that results drive changes. This skill-building workshop introduces the Reliable, Realistic, Repeatable (3R) Model, a scalable framework for creating a sustainable multi-year general education assessment cycle anchored in faculty engagement, reliable scoring, and actionable results. Drawing on five years of implementation at a mid-sized institution, facilitators will outline the 3R Model’s core components: a staggered cycle, a faculty-centered rater training approach, and a streamlined data-to-decision process that helps programs identify focused, trackable improvements.

Participants will practice a mini-norming activity using a rubric excerpt and sample student work, interpret a concise set of disaggregated results, and complete a data-to-decision table to plan targeted actions. The workshop emphasizes scalability; while the host institution funds faculty raters, the 3R Model itself can be implemented with low-cost or no-cost approaches.

Attendees will leave with adaptable tools such as a cycle map, norming protocol, and quick-start checklist, to advance general education assessment in contexts with varying levels of institutional resources.

  • Target Audience: All Levels

Building a High-Value Feedback Process: Lessons from a Multi-Rater Meta-Assessment Initiative

Rebecca Gibbons, Director of Disciplinary and Institutional Accreditation, Embry-Riddle Aeronautical University
Karen Pain, Executive Director, Academic Impact & Success, Embry-Riddle Aeronautical University

The session will build shared expertise focused on the conceptual, logistical, and interpersonal components of building a sustainable feedback (meta-assessment) process. While academic programs rightly focus on closing the loop within their own learning-outcomes assessment cycles, the assessment office plays a parallel role by evaluating the strength of program-level reports and offering targeted guidance for improvement. High-value feedback can increase faculty confidence, improve alignment with institutional standards, and build a stronger culture of evidence-based decision-making. The session will enhance knowledge by showcasing the successful meta-assessment strategy at Embry-Riddle Aeronautical University and build expertise by facilitating co-creation of knowledge of effective feedback delivery for targeted audiences.

  • Target Audience: All Levels

Compliance vs. Quality in Student Learning Assessment: Using AI to Transform Feedback Reports

Rachel May, Associate Director of Assessment, Louisiana State University 
In many institutions, assessment reporting has become a compliance exercise, focused on meeting accreditation requirements rather than enhancing teaching and learning. This session details how generative AI can be used not only to save time but also to shift the focus of assessment reports from box-checking to providing meaningful feedback. The presentation outlines a practical workflow for using AI to draft student learning assessment feedback reports while preserving academic judgment and institutional context. It walks through how AI tools can help structure reports, summarize evidence, and suggest actionable recommendations. Examples are drawn from real assessment cycles in a public university setting, showing both the benefits and the limitations of this approach.

  • Target Audience: All Levels

Tracing Indirect Impact: Assessing the Impact of Academic Support Units on Student Learning

Nicole Espinoza, Director of Assessments, Nevada State University
Morgan Iommi, Director, Center for Teaching and Learning Excellence, Nevada State University
This interactive session showcases how academic support units, from the Writing Center to the Teaching Center, are reimagining assessment to capture the indirect and often overlooked ways they elevate student learning. Led by the university’s assessment director and faculty center director, the workshop highlights creative strategies, real-world challenges, and surprising successes in adapting traditional assessment models to complex, student-centered work. Through guided discussion and practical examples, participants will leave with adaptable tools and fresh ideas for designing meaningful, actionable assessment in their own contexts.

  • Target Audience: All Levels 

From Mocktails to Mastery: Practice-Based Strategies for Assessment Engagement

Morgan Cates, Assessment Program Assistant for the Division of Undergraduate Education and Student Success, University of Oregon
Will Hamilton, Director of Data and Assessment for the Division of Undergraduate Education and Student Success, University of Oregon

This session explores how a creative, practice-based activity, a mocktail competition, was used to engage staff in understanding and applying assessment principles. Under the guise of a fun competition, we taught college unit leaders how to develop measurable outcomes, collaboratively negotiate measures of success, and use a rubric they created to assess each other’s creations. By blending fun with professional development, the initiative fostered collaboration, deepened expertise, and demonstrated how experiential strategies can make complex assessment concepts accessible and memorable. Attendees will gain practical ideas for designing similar interactive approaches that build confidence and competence in assessment practices.

  • Target Audience: All Levels

1:15 PM EDT/ 12:15 PM CDT/ 10:15 AM PDT
NETWORKING SNACK BREAK: Climbing the Ladder

Take a break, grab a snack, and connect with peers who are at a similar stage in their professional journey! During this interactive networking session, participants will self-select into breakout rooms based on experience level:

  • Entering / New to Assessment – Just getting started? Connect with others as you build foundational knowledge and navigate early challenges.
  • Middle-Level Builders – Growing your skills and leading projects? Share strategies, tools, and lessons learned.
  • Veterans of the Trade – Seasoned practitioners ready to mentor, exchange advanced practices, and explore big-picture impact.

Whether you’re seeking advice, collaboration, or simply community, this structured yet relaxed networking break offers space for meaningful dialogue across every stage of the ladder.

1:30 PM EDT/ 12:30 PM CDT/ 10:30 AM PDT
CONCURRENT SESSION #2:

Reverse Engineering Assessment: Using AI to Design Meaningful Evaluations from Learning Outcomes

Betsy Tuma, Assistant Professor, Pikes Peak State College
Desiree Bowlby, Program Manager, Technical and Professional Studies, Pikes Peak State College
This session introduces a practical, AI-supported approach to backward assessment design that begins with course learning outcomes and works in reverse to create meaningful, aligned assessments. Many faculty understand the theory of backward design but struggle to consistently translate outcomes into authentic measures of learning, especially under time and workload pressures. In this practice-based session, the presenter shares a campus-tested process that uses generative AI as a thinking partner—not a shortcut—to analyze learning outcomes, generate draft assessment ideas, and check alignment with intended knowledge, skills, and abilities. Participants will experience the protocol used with faculty on the presenter’s campus, then apply it to sample or self-provided outcomes to design their own assessment concepts. The session emphasizes faculty agency, transparency about AI’s limitations, and strategies for maintaining academic integrity and disciplinary nuance. Attendees will leave with a replicable workflow, prompt templates, and facilitation tips they can adapt for their own courses, programs, or faculty development initiatives.

  • Target Audience: All Levels

Democratizing Engagement and Holistic Assessment in Whole Class Discussions

Selina Marcille, Assistant Professor of English, Southern New Hampshire University
Attend this session to learn about creative tools and strategies to facilitate in-class discussions that foster robust engagement, promote critical thinking, and showcase diverse perspectives, while also providing a comprehensive assessment of student learning and engagement. This tool addresses barriers to participation for underrepresented or less vocal students, resulting in a noticeable improvement in discussion equity that prioritizes assessment of student knowledge over willingness to raise a hand and speak up in class. Using a Socratic Seminar model and AI facilitation technology, participants will have the opportunity to use the tool, develop a plan for its use, and discuss implementation in their own classes.

  • Target Audience: All Levels

Mapping for Meaning: A faculty-centered process for implementing institution-wide curriculum mapping

Elizabeth 'Liz' Chase, Director of Academic Assessment, Emerson College
Tyler Rowe, Academic Assessment Designer, Emerson College

Sariva Goetz, Associate Professor, Performing Arts, Emerson College
Carol Ferrara, Assistant Professor, Marketing Communication, Emerson College
Sharifa Simon-Roberts, Assistant Professor, Communication Studies, Emerson College
This session details a faculty-centered process for implementing institution-wide curriculum mapping at Emerson College, ensuring the maps serve as a tool for curricular decision-making. Our process is designed to address the specific concerns and needs of arts and communication faculty. Following an initial outcomes revision phase that leveraged Fink's Categories for Significant Learning to engage faculty, the Office of Academic Assessment (OAA) launched a strategic mapping initiative. We will share our successful multi-stage methodology: OAA used Google Sheets to draft preliminary course-to-program outcome alignments, followed by a faculty Qualtrics survey for initial data collection. OAA then facilitated faculty discussion and deliberation to finalize the maps. Attendees will learn practical strategies for utilizing low-tech/high-impact mapping tools, resolving discrepancies through facilitation, and framing these detailed maps as an essential resource for faculty autonomy and strategic curricular review, setting the foundation for future integrated assessment implementation.

  • Target Audience: All Levels

Culture of Continuous Improvement and Change

Esha Chatterjee, Associate Director, Curricular Assessment, Santa Clara University
Divya Bheda, Director of Educational Assessment, Santa Clara University

Kyle Amore, Associate Director, Co-curricular Assessment, Santa Clara University
In many institutions, assessment is still perceived as a compliance requirement rather than a meaningful driver of student success. This session examines how to transform that mindset by intentionally cultivating a culture in which assessment is understood as a scholarly, collaborative, and faculty and staff-empowering practice. We focus on the cultural shift that occurs when assessment is framed not as an externally imposed obligation, but as an activity that directly supports teaching, learning, equity, student success, and institutional mission.

  • Target Audience: All Levels

Turning recommendations into results: A practical approach to building assessment systems from scratch

Sarah Jacobs, Director of Assessment, Clark College
Cecelia Martin, Associate Vice President of Planning & Effectiveness, Clark College

This presentation highlights Clark College’s journey in establishing a cohesive, integrated system for programmatic assessment. It details practical steps taken, including creating simplified, faculty-friendly templates, providing targeted interdepartmental training to foster a data-informed culture, and implementing a manageable “closing the loop” process. The approach emphasizes using assessment as a tool for continuous improvement rather than as an administrative burden. Ultimately, these efforts aim to make assessment meaningful, sustainable, and impactful across the institution.

  • Target Audience: All Levels

Assessing Faculty Compliance: Practical Tools for Evaluating Pregnancy/Parenting Support in Colleges

Laura Owens, Account Executive, Symplicity; previously a 16-year veteran psychology professor, UT Tyler
Teclesha Blanchard, J.D., Director, Equal Opportunity and Title IX Student Center, College of the Mainland

Colleges face increasing compliance obligations related to protections for pregnant and parenting students, yet most institutions lack systematic methods to assess whether faculty are actually implementing required supports. This session introduces a practical, evidence-based framework that integrates regulatory standards with assessment tools to evaluate faculty compliance and its impact on student learning and achievement. Presenters will share a replicable model, including faculty behavior rubrics, reporting metrics, and strategies for closing the assessment loop. Participants will leave with ready-to-use templates for measuring compliance, identifying barriers, and improving equitable learning outcomes.

  • Target Audience: All Levels

Building Readiness in Assessment Partnerships: Practical Strategies for Relationship-Centered Leadership

Emilie Clucas Leaderman, Dean of Academic Pathways and Learning Innovation, American International College
Gina B. Polychonopoulos, Associate Professor & Department Chair, The Chicago School

This interactive skill development workshop equips assessment professionals with practical, research-informed strategies for meeting faculty and staff partners where they are and facilitating movement toward genuine engagement in assessment practice. Drawing from the Readiness component of the RARE Model, participants will learn to recognize readiness indicators, apply three core strategies through case analysis and breakout practice sessions, and develop personalized action plans for implementation. The session balances hands-on strategy development with brief self-awareness activities that help participants identify personal patterns that might interfere with effective application in their institutional contexts.

  • Target Audience: All Levels

EMBRACE-ing Assessment to Advance Access: Cultivating Trust, Collaboration, and a Culture of Excellence

Janelle Coleman, Ph.D., Executive Director of Assessment and Evaluation, University of Tennessee
Jalen Blue, Assistant Director for Assessment and Evaluation, University of Tennessee

Shiloh Lovette, Assessment Coordinator, University of Tennessee
Urmila Pandey, Data Analyst, University of Tennessee
Our Assessment and Evaluation team collaborated to develop the EMBRACE model, which forms the foundation of this skill-building session focused on fostering a culture of assessment in a newer unit. EMBRACE stands for Enacting Methodologies to Bridge Reflection and Assessment to Center Excellence; the model describes the challenges faced by assessment professionals and stakeholders and provides mechanisms for supporting both parties through reflective practice, open communication, and resource sharing. The main idea behind the EMBRACE model is to reduce apprehension toward assessment practice by understanding the needs of stakeholders, partnering with them to address those needs, and promoting a sense of agency through skill-building and clear communication. The session shares practical applications of the EMBRACE model, including person-centered and reflective assessment strategies and approaches we have used to socialize the model through accessible professional development opportunities and a symposium. Participants will also gain actionable tools to implement these strategies and strengthen assessment practices in their own contexts.

  • Target Audience: All Levels

2:30 PM EDT/ 1:30 PM CDT/ 11:30 AM PDT
NETWORKING OPTIONAL LUNCH BREAK: Special Interests and Hobbies

Grab your lunch, take a breath, and if you're up for it, recharge with connection! This relaxed networking hour invites you to step away from the computer if needed or step into breakout rooms centered around shared interests and hobbies, the things that help keep us energized and grounded. If you have ideas or want to host a special breakout room, let us know in the Whove chat feature!

3:30 PM EDT/ 2:30 PM CDT/ 12:30 PM PDT
CONCURRENT SESSION #3: 

Assessment Commons Online Resource Hub

Patti Gregg, Pseudo-retired, Independent Evaluation Consultant
The Assessment Commons Task Force
AALHE is delighted to announce the launch of our long-awaited Assessment Commons online resource hub. Building on a previous website created by Bob Pacheco and gifted to AALHE for appropriate stewardship, the Assessment Commons Task Force has been hard at work over the past year to update and reimagine Bob's work as a comprehensive, searchable online resource for "all things assessment". Join us as we introduce you to this new resource and help us continue to develop and build this site to be the "go-to" for anyone working in, or interested in, assessment in higher education.

  • Target Audience: All Levels

Turning Insights into Impact: Increasing Equity in an Online First-Year Course to Support Student Success

Teresa Leary Handy, Program Chair/Assistant Professor, University of Arizona Global Campus
Tricia Lauer, Vice President, Assessment Curricular Affairs, University of Arizona Global Campus

Renee Stuart, Assessment Specialist, University of Arizona Global Campus
This panel will explore how course-level learning outcomes data can be transformed into actionable and equitable design decisions for online first-year courses. As institutions work to improve student success and close equity gaps, leveraging assessment data efficiently and effectively is critical. The panel consists of a Vice President, Assessment Curricular Affairs, an Assessment Specialist, and a Program Chair who will share practical strategies for identifying early indicators of student success and struggle, ways to find balance between formative and summative assessments, and how to design assessment tools that promote equity for non-traditional adult learners. Through real-world examples, attendees will see how revising assignments and embedding targeted student supports improved student outcomes and completion of assignments in a first-year course. The session will also highlight how adjunct faculty were supported in understanding norming, rubric calibration, and data interpretation, thus ensuring shared ownership of continuous improvement in the program.

  • Target Audience: All Levels

Success for all Students: How fairness and transparency in assessment contribute to improved student outcomes

Sesime Adanu, Associate Vice President for Institutional Effectiveness, Community College of Philadelphia
Karen White-Goyzueta, Southern New Hampshire University

Sharon Stoerger, Rutgers, The State University of New Jersey
Tawanda Paul, Northern Illinois University
Michelle Searles, Howard University
Lucia Santacruz, Bowie State University
During this session, attendees will learn about the newest chapter of the field guide (Chapter 5), which focuses on transparency in assessment. The panelists will share practical experiences and examples of how they have used transparency to improve teaching and the learning environment. They will also address the usefulness of rubrics in nurturing equitable outcomes for students of diverse backgrounds. The discussion will be guided by structured questions from the moderator on equity-minded assessment grounded in fairness and transparency. Participants will have the opportunity to ask practical questions that affect equity-minded assessment at their institutions and solicit solutions for addressing them. By the end of the session, participants will be equipped with the resources needed to promote equity-minded assessment at their respective institutions.

  • Target Audience: All Levels

Assessment That Matters: Faculty Engagement, Program Success, and the Reality of Doing the Work

Angie Miller, Director of Assessment, Kansas City Kansas Community College
Carlee Ranalli, Dean of Planning and Institutional Effectiveness, Hagerstown Community College

Nancy Parks, Associate Vice President of Student Services, Pierpont Community & Technical College
Terri Flateby, Higher Education and Assessment Consultant, TL Flateby and Associates
Laura De La Cruz, Professor and Department Chair, Business and Hospitality Services, Dona Ana Community College
This panel will explore practical strategies for developing assessment practices that enhance course quality and align with Quality Matters (QM) and OSCQR standards. Panelists will share valuable lessons, effective methods for fostering faculty engagement, and ways to integrate assessment into daily routines. Attendees will gain strategies, examples, and tools to improve learning outcomes and promote a culture of continuous improvement.

  • Target Audience: All Levels

The Secret Lives of Assessment Professionals, A Dialogue 

Erica Eckert, Assistant Professor, Kent State University
This 60-minute Dialogue session shares results from a qualitative study of assessment professionals conducted at the 2024 Assessment Institute and invites participants to collectively interpret the findings and generate implications for their own contexts. Through semi-structured interviews, participants described the realities of assessment work—often operating in small or solo units, navigating invisible labor, managing heavy workloads, influencing without authority, and serving as organizational translators and culture-change agents. Despite these challenges, practitioners articulated a deep commitment to integrity, student learning, and removing barriers for their communities. Many “fell into” assessment but stay because the work feels meaningful and intellectually engaging.

  • Target Audience: All Levels

4:30 PM EDT/ 3:30 PM CDT/ 1:30 PM PDT
NETWORKING BREAK: What’s Your Role

Sometimes you just need someone who truly gets it. This focused networking break connects you with professionals serving in similar roles at other institutions, whether you’re faculty, assessment lead, institutional research, academic affairs, student services, administration, or somewhere in between. Step into a breakout room aligned with your role and engage in candid conversation with peers who are navigating similar responsibilities, pressures, and opportunities. 

4:45 PM EDT/ 3:45 PM CDT/ 1:45 PM PDT
CONCURRENT SESSION #4:

AI in Action: How to Create and Use a Custom GPT for Program Assessment Review

Yan Cooksey, Director of Assessment, Southern Methodist University
This session demonstrates how a custom-designed GPT created in ChatGPT’s Create tool can support consistent program assessment report reviews that are aligned to institutional rubrics.

Participants will see a live demonstration of how to set up and train a Custom GPT (no coding experience is needed), including system instructions, rubric integration, formatting rules, and testing. The session highlights examples of Canvas-ready outputs, narrative feedback structures, and a built-in quality control checklist. Attendees will leave with customizable prompts, a setup guide, and strategies for applying AI-enhanced workflows to their own institutional assessment processes.

This session contributes to the growing field of AI and assessment by offering a practice-based example tested at a mid-sized institution. It also fills a gap in the current literature where most AI research focuses on course-level assessment rather than program or institutional assessment review.

  • Target Audience: All Levels

Building a Data-Enabled APR Cycle: Leveraging High-Impact Practices to Strengthen Academic Quality and Continuous Improvement

Saran Tucker, Manager, Data Enablement and Assessment, Capella University
Jacklyn Zacharias, Senior Assessment Specialist, Capella University

Nancy Ackerman, Assessment Specialist, Capella University
Rimi Bhowmik, Assessment Specialist, Capella University
Linh Dao, Assessment Specialist, Capella University
This session presents a scalable and practice-based approach to the Academic Program Review (APR) process using a High-Impact Practices (HIPs) framework to strengthen academic quality, data maturity, and institutional accountability. Participants will explore a three-year continuous improvement model that integrates assessment, curriculum review, accreditation alignment, and collaborative governance. The session highlights how coordinated workstreams, shared data models, and structured reporting cadences can transform APR from a compliance-oriented activity into an engine for evidence-based decision-making and student success. Attendees will leave with a transferable model for aligning disparate units, creating consistent data practices, and embedding APR findings into strategic institutional planning.

  • Target Audience: All Levels

Reducing Faculty Workload with Generative AI support for Learning Outcomes, Curriculum Mapping, and Rubric creation

Rand Ware, Psychology Instructor/Assessment Coordinator, Lane Community College
Kevin Steeves, Instructional Designer, Lane Community College

This practical workshop will provide assessment professionals and faculty with hands-on practice utilizing Generative AI (GenAI) to increase efficiency in common outcomes assessment processes that represent pinch points in the faculty assessment workload. Participants will learn how to leverage pre-built GenAI tools (GEMs) for three critical, time-consuming tasks: drafting measurable learning outcomes, generating preliminary curriculum maps, and creating initial rubric drafts. The session focuses on demonstrating how GenAI acts as a productivity tool, not a replacement, allowing faculty to spend less time on administration and more time on analysis and improvement. Faculty will be encouraged to bring their own draft outcomes, outcome sets, and rubric needs for experimentation.

  • Target Audience: All Levels

Empowering Institutional Assessment: Best Practices for Website Resource Development 

Rebecca Gibbons, Director of Disciplinary and Institutional Accreditation, Embry-Riddle Aeronautical University
Coral Bender, Assistant Professor, The University of Tampa

Kayla Waggoner, Student Affairs Impact & Success Specialist, Embry-Riddle Aeronautical University
Nancy Anderson, Assessment Analyst, Tulane University
Laurajean Holmgren, Lecturer and Deputy Academic Director, Sports Management, Columbia University
Looking to improve your website resources? Learn more in this hands-on workshop that highlights best practices in assessment resource allocation across institutional websites, based on an empirical study. Participants will collaborate to analyze their own websites utilizing the framework presented by the researchers, followed by meaningful discussions on effectively and efficiently updating websites to align with data-driven best practices.

  • Target Audience: Expert/Veteran Level

From Checkboxes to Crosswalks: Putting Institutional Outcomes to Work in General Education Assessment

Julie Morrison, Psychology Faculty Chair & College Assessment Director, Glendale Community College
Roxane Alexander-Arntson, Communication Faculty & College Assistant Assessment Director,  Glendale Community College

Genea Stephens, Administration of Justice Faculty & College Program Assessment Coach, Glendale Community College
General education redesign is often driven by statewide frameworks, but the real work happens where courses, outcomes, and data intersect in local communities. Aligned with the conference theme, Assessment in Action: Real Work, Real Community, From Checkboxes to Crosswalks presents a case study from a large community college that is using institutional learning outcomes to move General Education assessment beyond compliance checklists toward meaningful crosswalks with the Arizona General Education Curriculum (AGEC), a statewide transfer-oriented general education curriculum. The presenters will share a multi-year path from institutional learning outcome (ILO) adoption to an ILO–AGEC crosswalk, including an ILO Commitment Initiative that significantly expanded scored artifacts and illuminated 100/200-level performance patterns. Participants will see how this work is informing priorities for General Education learning and shaping emerging program assessment conversations, and will take away concrete design choices and tools they can adapt to their own contexts and statewide or system-level frameworks.

  • Target Audience: All Levels

New, Meaningful, & Powerful Ways of Engaging in Institutional Learning Outcomes (ILO) Assessments

Divya Bheda, Director of Educational Assessment, Santa Clara University
Esha Chatterjee, Associate Director, Curricular Assessment, Santa Clara University

Kyle Amore, Associate Director, Co-curricular Assessment, Santa Clara University
This session offers a framework to assess institutional learning outcomes such as quantitative reasoning, information literacy, and social justice in co-curricular and curricular spaces in more useful and meaningful ways than our current overreliance on artifact analyses and VALUE rubrics. Participants will explore how assessment practices can be standardized and participatory in their approach while also being holistic and comprehensive in their design. The models and strategies presented are adaptable across disciplines, student programs, and institutional types, making them highly transferable for campuses seeking to assess ILOs in more sustainable and meaningful ways.

  • Target Audience: All Levels

Changing Minds About Assessment: Using Conceptual Change Theory & Expectancy-Value Insights to Shift Assessment Culture 

Rocky Walker III, Director of Institutional Research & Assessment, Lee University
Assessment in higher education did not originate as compliance work — it began as a faculty-driven effort to better understand and improve student learning. Over time, expanding accountability pressures shifted assessment toward reporting and verification, leaving many faculty to associate it with oversight rather than inquiry. This session traces that historical shift and uses Brown’s (2017) accountability silos and guiding logics to explain why compliance-oriented conceptions persist and why improvement-oriented cultures struggle to take hold.

  • Target Audience: Novice/Entry Level

Assessing and Improving Higher Education: Enduring Principles, Emerging Opportunities

Steven P. Hundley, Ph.D., Professor of Organizational Leadership & Executive Director of the Center for Leading Improvements in Higher Education, Indiana University Indianapolis
Assessing and improving student learning in courses, programs, and other collegiate experiences continues to be a priority for faculty, staff, and instructional partners throughout the higher education ecosystem. Informed by the perspectives of the Assessment Institute in Indianapolis, Assessment Update, and other national and global resources, this interactive session offers important reminders about fundamental principles of assessment and improvement, including planning for learning, implementing evidence-informed interventions, assessing and evaluating outcomes, making improvements to instructional contexts, and fostering a shared culture of evidence reliant on distributed leadership for assessment and improvement. Finally, the session will identify emerging assessment trends and provide an opportunity for participants to take stock of current assessment strategies and activities in their respective contexts.

  • Target Audience: All Levels

5:45 PM EDT/ 4:45 PM CDT/ 2:45 PM PDT
NETWORKING OPTIONAL BREAK: Step Away

Take this opportunity to step away and let your eyes rest! A general networking room will be open for those who want to stay engaged. 

6:00 PM EDT/ 5:00 PM CDT/ 3:00 PM PDT
DAY ONE CLOSING SESSION: Special Presentation 

Real Work, Real Community: A Collaborative Mentorship Model for Assessment Practitioners

Lisa Bortman, Chief Executive Officer, Mentorship Collaborative
Fiona Chrystall, President, Association for the Assessment of Learning in Higher Education

Sarah Jacobs, Director of Assessment, Clark College
Assessment professionals play a critical role in promoting institutional effectiveness and student success. Yet many work as “teams of one,” experiencing limited collaboration opportunities, unclear career pathways, and a sense of isolation within their home institutions. These challenges negatively impact workforce retention and create barriers to ongoing professional growth. National organizations such as AALHE provide vital spaces for networking, leadership, and professional development across institutional boundaries. However, sustaining these relationships often falls to individuals already balancing demanding workloads. As a member-driven association, AALHE must also navigate the ongoing challenge of supporting its volunteers, nurturing leadership pipelines, and ensuring members feel connected to the broader work of the organization. In Fall 2025, AALHE partnered with the Mentorship Collaborative, an LLC specializing in mentoring for higher education professionals, to pilot a structured mentoring initiative. The program engaged committee leaders in a 12-week community of practice designed to strengthen professional networks and address assessment-related challenges through group activities using shared case studies grounded in organizational theory and leadership practice. Nine of ten original participants completed the series, and feedback from the group informed program reflection and refinement.

  • Target Audience: All Levels

7:00 PM EDT/ 6:00 PM CDT/ 4:00 PM PDT
HAPPY HOUR!

Grab your favorite beverage and join us after the closing session to talk about how things are going, what keeps us going, and look to the future. Whether you want to reflect on key takeaways, meet new colleagues, or simply enjoy a refreshing beverage, this end-of-day social is designed to foster authentic connections and recharge your energy for Day Two.


 

Day 2 Sessions (June 10th):

11:00 AM EDT/ 10:00 AM CDT/ 8:00 AM PDT
EARLY BIRD NETWORKING: Coffee Session

Start your conference day with connection and conversation! Join us for another lively Morning Coffee Networking Hour designed specifically for our online attendees. Grab your favorite morning beverage, turn on your camera (if you’re comfortable), and jump into a series of short, guided conversations that make it easy to connect. Bring one idea that’s working well at your institution or one challenge you’re still trying to solve!

12:00 PM EDT/ 11:00 AM CDT/ 9:00 AM PDT
CONCURRENT SESSION #5: 

Designing Human-Centered Assessment in an AI World: Strategies for Preserving Student Voice and Authentic Learning 

Laura De La Cruz, Professor/Dean, New Mexico State University - Dona Ana
This session focuses on how to design authentic, human-centered assessments that preserve student voice in an era of widespread AI use. Participants will learn a practical framework for identifying AI-vulnerable tasks and strengthening alignment between assignments and learning outcomes. The session will highlight strategies that elevate reflection, personal experience, and decision-making so assessments remain meaningful and valid. Attendees will leave with concrete redesign techniques and templates they can apply immediately within their own courses and programs.

  • Target Audience: All Levels

Designing for Success: Intentional Assessment of General Education Outcomes 

Willie Ho, Assistant Director of Academic Program Review and Assessment, University of Illinois Chicago
D. Scott Tharp, Director of Academic Program Review and Assessment, University of Illinois Chicago

General education programs often provide holistic and workforce-oriented development for students as they consider their futures. The assessment of General Education learning outcomes is important to ensure that courses are meeting the expected learning outcomes of the program. However, assessing a General Education program with multiple categories and outcomes poses many challenges and can yield uninformative data. This session will address the assessment of a multi-categorical General Education curriculum and how the University of Illinois Chicago (UIC) transitioned from a random sampling methodology to an intentionally designed assessment process that directly measures all stated General Education learning outcomes. Additionally, this session will address the benefits and challenges of undertaking a meta-analysis of all outcomes of a general education program and demonstrate how the findings from a meta-analysis at UIC will help inform teaching, courses, and the general education program.

  • Target Audience: All Levels

Beyond Outcomes: Assessing the Process of Problem Solving Using Process Education

Joshua Morrison, Director of Academic Retention Programs, University of Indianapolis
Many assessment strategies focus on learning outcomes based on the artifacts students produce. As generative artificial intelligence (GenAI) enables learners to produce text without the requisite learning, reasonable questions have been raised about how faculty can accurately assess student learning and growth. As a way to mitigate this concern, what if we assessed the learning process itself? This session introduces participants to a Process Education-based approach to assessing problem solving as it unfolds. Drawing on tools like performance rubrics, self-assessment protocols, and coaching prompts, this interactive session helps faculty design assessments that reveal how students reason, reflect, adapt, and grow. Attendees will practice applying structured assessment strategies to real-world student scenarios, with takeaways they can adapt to their own programs immediately. No GenAI required.

By utilizing this approach, faculty accrue at least two benefits. First, students obtain actionable feedback on their performance, which encourages a growth mindset and helps them become more self-directed and self-regulated learners. Second, faculty gain deeper insights into where students struggle, how to intervene earlier, and how to build instruction that develops critical thinking skills, not just task completion.

  • Target Audience: All Levels

From Compliance to Curiosity: Faculty Development that Builds a Culture of Assessment

Courtney Vengrin, Director of Assessment, Geisel School of Medicine, Dartmouth College
Jessica Myers, Associate Director of Assessment, Penn State University

This interactive workshop explores how institutions can build meaningful faculty development opportunities that strengthen assessment literacy and cultivate a sustainable culture of assessment. Drawing on evidence from across higher education, the session focuses on examining beliefs about assessment, addressing common skill gaps, and how assessment offices can work to equip faculty to engage in practices that support learning. Participants will gain tools to analyze their institutional contexts, engage in collaborative design activities, and leave with actionable strategies for creating faculty development programs that foster shared understanding, reflective practice, and long-term cultural change.

  • Target Audience: All Levels

The assessment hero's journey: Sharing and developing stories from our work

Sarah Jacobs, Director of Assessment, Clark College
Sarah Drummond, Director of Research and Assessment, PA program, Oregon Health & Science University
Using assessment data to provide insight, inspire change, or document history is best practice. Storytelling is a powerful tool which helps humans digest and understand complex information. In this session, we merge these two concepts by walking you through the parts of telling your assessment hero’s journey – to make meaning, help others understand the story behind the data, or to meet accreditation requirements. Participants will gain an understanding of the essential parts of a good story and be able to take home resources and information to help lead similar activities in their own workplaces.

  • Target Audience: All Levels

When AI Joins the Assessment Team: Frameworks, Use Cases, and Faculty-Friendly Strategies

Lindsey Brown, Director, Office of Assessment for Curricular Effectiveness, Washington State University
Sara Mahuron, Assessment Specialist, Washington State University

Armine Ghalachyan, Assessment Specialist, Washington State University
Generative artificial intelligence (AI) is evolving rapidly, providing new and powerful tools to enhance our work. Yet for many faculty and assessment professionals, the question remains: How can generative AI be used responsibly and effectively to support program-level assessment? This session will explore real-world strategies for integrating generative AI into program learning outcomes assessment work. Drawing on practical applications from our institution, we will demonstrate how generative AI can streamline assessment tasks to increase capacity for meaning-making and evidence-based decisions, as well as describe our framework for developing and sharing these strategies with our faculty partners. Attendees will engage in activities to consider use cases for generative AI and will leave with actionable ideas for responsibly incorporating generative AI into program learning outcomes assessment on their campuses.

  • Target Audience: All Levels

Avengers, Assemble! Strengthening Educational Assessment Through Shared Leadership

Briana Keith, Assistant Dean of Academic Quality, Saint Francis University
Gabriel Keney, Director of Institutional Effectiveness, Saint Francis University

This interactive workshop explores how a unique model for a center for teaching and learning serves as a strategic hub, transforming educational assessment from a compliance-driven task into a culture of collaborative inquiry and continuous improvement. After hearing about a collaborative approach to improving assessment, participants will discuss strategies for engaging diverse stakeholders, aligning assessment with institutional goals, and project management.

  • Target Audience: All Levels

Growing the Field: Preparing Manuscripts for Submission to Intersection

Sarah Wu, Director of Assessment, Georgia Institute of Technology
Fiorella Peñaloza, Assistant Professor, University of Hawaii-West Oahu
Kimberely Nettleton, Director of University Assessment, Planning, Performance, and Effectiveness, Associate Professor, Morehead State University
Amy Heston, Professor of Inorganic Chemistry, Walsh University
Rebecca Gibbons, Director of Disciplinary and Institutional Accreditation, Embry-Riddle Aeronautical University
This workshop will set the stage for everyone in attendance to find their own route to contributing to the assessment literature through AALHE’s official peer-reviewed research journal, Intersection: A Journal at the Intersection of Assessment and Learning! This workshop aims to guide attendees in initiating their publication journey with Intersection. Many assessment professionals have academic backgrounds in other fields, so the transition to conducting research and publishing in assessment can be intimidating. The presenters will break down barriers to publication by providing insights from experienced reviewers and editors, highlighting key components to consider when preparing and submitting a manuscript. The workshop will walk attendees through the lifecycle of an Intersection manuscript, from conceptualization and research design through drafting, submission, review, and publication. By the conclusion of the workshop, participants will gain confidence in their ability to submit a manuscript successfully and/or begin serving as a peer reviewer within the next year, walking away with a documented plan for success.

  • Target Audience: All Levels

1:00 PM EDT/ 12:00 PM CDT/ 10:00 AM PDT
NETWORKING SNACK BREAK: AALHE Committees

Curious about getting more involved with AALHE? This networking break is your opportunity to explore the association’s open subcommittees and discover where your interests and talents can make an impact. Hop from room to room to ask questions from other volunteers currently serving!

1:15 PM EDT/ 12:15 PM CDT/ 10:15 AM PDT
CONCURRENT SESSION #6: 

Beyond the Assessment Office: Career Trajectories for Assessment Professionals

Patti Gregg, Pseudo-retired, Independent Evaluation Consultant
Jennifer Ann Morrow, Associate Professor and Graduate Program Director, University of Tennessee
Kathleen Gorski, Associate Provost of Curriculum and Instruction, Harper College
Fiona Chrystall, Title III Director & Institutional Planner, Asheville-Buncombe Technical Community College
Timothy Melvin, Associate Professor & Director of Assessment, COEPD, Marshall University
Gina B. Polychonopoulos, Associate Professor & Department Chair, The Chicago School
What comes after Director of Assessment? At many, if not most, institutions, promotion into senior leadership roles means a portfolio of responsibilities well beyond (or even exclusive of) assessment practice. Returning to, or transitioning into, full-time faculty roles is another significant culture change. The panelists will discuss their experiences with moving from assessment leadership into broader institutional roles such as strategic planning, institutional effectiveness, or curriculum and instruction, as well as tenure-track faculty appointments and academic leadership roles. They will share the skills and dispositions needed for moving up the organizational hierarchy and discuss how their professional lives differ in these new contexts.

  • Target Audience: Expert/Veteran Level

Leading Assessment Committees: Strategies for Navigating Universal Mandates in Distinct Campus Contexts

Carla Strickland-Hughes, Assessment Associate, Colorado School of Mines
Colleen Flewelling, Associate Dean of Academic Assessment and Development, Cecil College
Tracey D. Frey, Assistant Vice President for Institutional Effectiveness and Academic Assessment, Loyola University of Maryland
Sandra Hiebert, Director of Institutional Assessment and Academic Compliance, McPherson College
Valerie F. McDaniel, Faculty Fellow of Assessment; Assistant Professor of Speech Language Pathology, University of the Pacific
Is your assessment committee a generic blueprint or a bespoke design tailored to your campus? Assessment committees are the engine of institutional continuous improvement, but “best practices” may seem limited to specific contexts. Join experts from five distinct private and public institutions—including R1 STEM, community college, liberal arts, and religious and non-denominational contexts—to explore how they navigated universal mandates for assessment and adapted their assessment committee's structure, charge, and leadership to fit their institutional cultures. We will dissect critical themes of institutional assessment committee leadership, including strategies for motivating faculty, navigating the tension between compliance and improvement, and the underutilized power of student partnerships. This moderated question-and-answer session prioritizes audience interaction, allowing you to ask questions and receive tailored advice on leading relationship-rich committees.

  • Target Audience: All Levels

E Ala: Awakening the Moʻolelo (Story) of Culturally Relevant Assessment

Alohilani Okamura, Assistant Professor, University of Hawaiʻi at Mānoa
Keahiahi Long, Assistant Professor, University of Hawaiʻi at Mānoa
Piilani Kaaloa, Interim Associate Dean, University of Hawaiʻi at Mānoa
This panel examines how the practice of moʻolelo (story), integrated with Pono Research principles from the Kūlana Noiʻi framework, can strengthen culturally responsive and equitable program assessment. Drawing on insights from a recent campus-wide assessment event series, panelists will discuss practical strategies, challenges, and faculty reflections on aligning assessment with institutional learning outcomes centered on ethical, responsible, and culturally respectful research. Brief audience engagement activities will support participants in identifying pono-grounded practices they can bring back to their own program assessment work.

  • Target Audience: All Levels

Exploring AALHE Member Pathways: Member Panel

Sarah Jacobs, Director of Assessment, Clark College
Kara Moloney, Senior Assessment Specialist, University of California, Davis (Moderator)
Laura Aboyan, Director, Curriculum Management and Assessment & Accreditation, Fox School of Business, Temple University
Meg Joseph, Associate Director, Student Learning Assessment, Fashion Institute of Technology, SUNY
Terri Flateby, Higher Education and Assessment Consultant, TL Flateby and Associates
Reem Jaafar, Dean for Institutional Effectiveness, Queensborough Community College, CUNY
Members of the Member Engagement Committee (MEC) have drafted their own journeys through the field of assessment. As a way to engage and retain our current members and to support the recruitment of new members, MEC would like to share our stories during a panel discussion at the 2026 AALHE Annual Conference. We will also provide an overview of the entire project to help explain how this fits into AALHE’s broader strategy and our next steps. At the conclusion of the panel, we will invite attendees to submit their own pathways so that we can begin to curate a collection of member journeys that highlights different professional backgrounds, institution types, career lengths, and more.

  • Target Audience: All Levels

The Power of Sitting Beside (Assidere): Integrating Staff Expertise into Student Leadership Assessment

Shiloh Lovette, Assessment Coordinator, University of Tennessee
Urmila Pandey, Data Analyst, University of Tennessee
Jalen Blue, Assistant Director for Assessment and Evaluation, University of Tennessee
Janelle Coleman, Ph.D., Executive Director of Assessment and Evaluation, University of Tennessee
Join us for a panel discussion on our team approach to assessment centering assidere (a Latin term meaning "to sit beside") as a transformative practice for building collaborative and empowering relationships among staff members. We'll explore how assidere principles fostered trust and shared ownership and helped us measure meaningful student leadership growth in our LEAD Scholars program. Participants will gain practical strategies to co-design outcomes, strengthen partnerships, and reimagine assessment as a process grounded in mutual responsibility and shared success.

  • Target Audience: All Levels

2:15 PM EDT/ 1:15 PM CDT/ 11:15 AM PDT
NETWORKING OPTIONAL LUNCH BREAK: Special Interests and Hobbies

Grab your lunch, take a breath, and if you're up for it, recharge with connection! This relaxed networking hour invites you to step away from the computer if needed or step into breakout rooms centered around shared interests and hobbies, the things that help keep us energized and grounded.

3:15 PM EDT/ 2:15 PM CDT/ 12:15 PM PDT
CONCURRENT SESSION #7: 

Considerations for creating and using AI prompts to develop learning assessment materials

D. Scott Tharp, Director of Academic Program Review and Assessment, University of Illinois Chicago
Willie Ho, Assistant Director of Academic Program Review and Assessment, University of Illinois Chicago
Conversational artificial intelligence platforms such as ChatGPT and Microsoft Copilot are often discussed in higher education in the context of instruction and academic integrity; however, their potential role in supporting learning assessment practitioners is relatively underdiscussed. This presentation examines how practitioners might use AI to support the development of assessment materials. Attendees will consider key assumptions and considerations when using AI, learn a process and associated strategies for creating useful AI prompts, and explore the assessment materials (e.g., learning outcomes, rubrics, curriculum maps) whose development AI might usefully support, with emphasis on the critical review needed to mitigate inaccuracies. Attendees will have time to discuss session content with their peers and consider how to apply it in their own campus practice.

  • Target Audience: All Levels

ACCELERATE in Action with Pecha Kucha: Showcasing the Ten Principles

Constance Tucker, Vice Provost, Educational Improvement and Innovation, Oregon Health & Science University
Divya Bheda, Director of Educational Assessment, Santa Clara University
Daniel Kaczmarek, Director of Assessment and Research, University at Buffalo
ACCELERATE: Assessment Principles for Best Practice was created as a contemporary update to the original Nine Principles of Good Practice for Assessing Student Learning (AAHE, 1992, 1996). In this PechaKucha, we bring the principles to life through the Intersection special issue, which features scholar-practitioners applying the ACCELERATE framework in diverse institutional contexts. Through a fast-paced, visually engaging format, we illustrate how the ten principles are enacted to advance learning, equity, and institutional effectiveness. This session highlights stories of collaboration, innovation, and transformation, demonstrating that assessment is not a compliance activity but a catalyst for meaningful change. Participants will leave inspired to identify how these principles show up in their own work and consider ways to contribute to the ongoing ACCELERATE scholarship.

  • Target Audience: All Levels

Beyond Barriers: Practical Strategies for Impact-Focused Assessment in Administrative Units

Naima Wells, Executive Director of Operational Impact and Success, Embry-Riddle Aeronautical University
This interactive workshop bridges empirical research with practical strategies for improving outcomes assessment in administrative and student service units. Drawing from a national mixed-methods study of over 100 higher education assessment practitioners, participants will explore evidence-based insights into the barriers and enablers of effective assessment practice. The session guides attendees through identifying systemic challenges (resources, collaboration, and communication), interpreting data for improvement, and applying adaptive frameworks such as Plan-Do-Check-Act to strengthen institutional assessment culture. Through collaborative activities and reflection, participants will leave with actionable tools to enhance the impact of assessment in their own contexts.

  • Target Audience: All Levels

Practical Strategies for Assessing High-Impact Practices Across Campus

Jun Fu, Michigan State University
Kari Thierer, Northeastern University
Ellen Vujasinović, South Puget Sound Community College
Bryant Hutson, University of North Carolina at Chapel Hill
High-Impact Practices (HIPs) are celebrated for transformative learning, yet their assessment often remains siloed and superficial. This interactive workshop brings together assessment practitioners from four distinct institution types to share concrete, transferable strategies for tackling common assessment challenges. We move beyond theory to present real-world approaches for aligning outcomes, gathering authentic evidence, and fostering collaboration between academic and co-curricular units. Participants will engage in problem-solving activities based on our case studies and leave with a customized set of practical next steps for their own HIP assessment projects.

  • Target Audience: All Levels

Designing academic assessment strategies for interdisciplinary programs

Melinda Lull, Assistant Vice Provost for Assessment, University of Rochester
Molly Ball, Assistant Professor of History, University of Rochester
Interdisciplinary programs are growing in number and popularity in higher education. While these programs benefit institutions and students alike, they often span several academic departments and have complicated curricula, which makes the design and implementation of a cohesive assessment plan difficult. This session will present approaches for creating an effective assessment plan for interdisciplinary programs by describing the creation of an assessment strategy for an undergraduate Latin American Studies program.

  • Target Audience: All Levels

Doing Double Duty: Program Assessment as Scholarship of Teaching and Learning (SoTL) and Faculty Development

Morgan Iommi, Director, Center for Teaching and Learning Excellence, Nevada State University
Although program assessment is often seen as a compliance task, it can also provide a strong foundation for the Scholarship of Teaching and Learning (SoTL) and faculty development, helping instructors turn routine assessment work into meaningful inquiry that supports their professional growth. During this session, we will discuss how to reframe assessment work and collaborate with campus support units to create structures that allow assessment to do double duty, meeting both institutional and faculty needs while building faculty buy-in. By the end of this interactive session, participants will understand how program assessment can support SoTL and faculty development across institutional contexts, gain strategies for effective collaboration among campus support units, and leave with institution-specific action plans for developing or adapting assessment-linked faculty development initiatives grounded in SoTL and collaborative support mechanisms.

  • Target Audience: All Levels

Reframing Career Readiness Through Reflective Assessment: The Mission-Gap Framework

Diondre Brown, Ph.D. Student, University of Tennessee, Knoxville
This interactive workshop introduces the Mission-Gap Framework (MGF) as a reflective assessment model that transforms student reflection into meaningful evidence of learning. Drawing from recently accepted AALHE scholarship, the session reframes career readiness and purpose development as assessable outcomes grounded in clarity, direction, and contribution. Participants will experience the framework firsthand by crafting and analyzing purpose statements, demonstrating how structured reflection can generate valid qualitative data. The session concludes with practical tools, rubrics, and implementation strategies adaptable to advising, courses, and co-curricular programs. Attendees will leave with a replicable model for capturing internal learning outcomes alongside traditional metrics.

  • Target Audience: All Levels

From Logic Model to Dashboard: A Skill-Building Workshop for Designing Full-Cycle Assessment in Online Programs

Lacy Hodges, Director, Learning Analytics & Assessment, Georgia Tech
This workshop offers a practical approach to building coherent, full-cycle assessment plans for online academic programs. Participants will learn a step-by-step method for constructing logic models, identifying measurable indicators from multiple data sources (including SIS, event attendance, and surveys), and translating those indicators into actionable Tableau dashboards. Through guided worksheets and collaborative activities, attendees will practice designing visualizations that clearly link program goals to evidence and stakeholder needs. The session emphasizes hands-on application, peer learning, and adaptable tools that participants can bring back to their institutions to strengthen transparency, data-informed decision-making, and continuous improvement.

  • Target Audience: All Levels

4:15 PM EDT/ 3:15 PM CDT/ 1:15 PM PDT
NETWORKING OPTIONAL BREAK: Step Away 

Take this opportunity to step away and let your eyes rest! A general networking room will be open for those who want to stay engaged. 

4:30 PM EDT/ 3:30 PM CDT/ 1:30 PM PDT
KEYNOTE PRESENTATION

Title: Coming Soon

Kate McConnell, Vice President for Curricular and Pedagogical Innovation and Executive Director of VALUE, AAC&U
Dr. McConnell serves as Vice President for Curricular and Pedagogical Innovation and Executive Director of VALUE, bringing nationally recognized expertise in assessment, teaching, and learning. An educational psychologist by training, Dr. McConnell has written extensively on the reliability and validity of the VALUE approach and consults with campuses across the country on improving teaching and learning while meeting accountability and accreditation expectations. Her work focuses on aligning pedagogy and assessment, reimagining general education curricula, faculty development, and applying the learning sciences to higher education practice. Before joining AAC&U, Dr. McConnell spent a decade at Virginia Tech in assessment and evaluation and served as affiliate faculty in educational psychology. She holds a BA from the University of Virginia, an MA from Providence College, and a PhD in educational psychology from Virginia Tech.

6:00 PM EDT/ 5:00 PM CDT/ 3:00 PM PDT
CLOSING SESSION: AALHE Board of Directors & President 

Join the Board of Directors as they bring the conference to a close with reflections on the conversations, ideas, and connections that emerged throughout the event. This session will highlight key themes, celebrate the collective work of the community, and acknowledge the contributions of presenters, participants, and volunteers. The Board will also share brief remarks on the future of the organization and opportunities to stay engaged moving forward.