EMERGING DIALOGUES IN ASSESSMENT

Impact Narratives: Transforming Assessment from Data Graveyard to Strategic Compass

March 24, 2026

  • Naima Wells, PhD, Executive Director, Operational Impact and Success, Embry-Riddle Aeronautical University

Abstract

Assessment in higher education is often experienced as compliance-driven reporting rather than strategic guidance. This article introduces Impact Narratives and Intentional Design for Institutional Effectiveness, practitioner-developed frameworks that reposition assessment as a relational and strategically aligned practice. By translating local evidence into institution-wide insight, the approach supports evidence-informed decision-making, attention to equity, and sustained institutional improvement.


For many in higher education, the word assessment signals compliance rather than direction. Reports are completed, templates are submitted, and accreditation expectations are met, yet institutional decisions often unfold with little visible connection to the evidence collected. Despite decades of scholarship advocating cultures of evidence and improvement (Schuh, 2013; Walvoord, 2023), assessment is frequently described as disconnected from everyday operational realities (Schuh, 2013; Elkins, 2015). The problem is not a lack of data. It is a lack of shared meaning across institutional levels.

Part of this disconnect stems from an assumption that often goes unexamined in assessment work: if assessment professionals understand the process, others must as well. When this assumption persists, a structural gap emerges between assessment design and operational context. This gap produces institutional “noise,” a misalignment between technical expertise and lived practice. In that space, assessment can feel imposed rather than collaborative and procedural rather than purposeful. Over time, even thoughtfully designed processes may be experienced as compliance exercises when units cannot see how evidence informs decisions.

What might shift if assessment were framed differently? If assessment functioned less as documentation of past performance and more as a compass for institutional direction, it could guide choices rather than merely record them.

This article introduces Impact Narratives and Intentional Design for Institutional Effectiveness, practitioner-developed frameworks designed to reposition assessment as a relational, action-oriented, and strategically aligned practice. Although illustrated primarily through administrative and student support contexts, the frameworks are equally applicable to academic programs seeking to translate learning outcomes into institutional strategy. Grounded in scholarship on assessment culture, institutional effectiveness, and change management (Elkins, 2015; Henning & Roberts, 2024; Hersey et al., 2022), the terminology and integrated structure presented here represent an original synthesis intended to advance dialogue within the field.

Recent scholarship on holistic assessment similarly calls for integrated approaches that connect institutional units and unlock organizational potential (Wells, 2026). Unlike traditional models that emphasize documenting improvement cycles, Impact Narratives offer a practitioner-centered structure that focuses on translation across institutional levels, moving from operational evidence to strategic consequences. The distinction lies not in collecting different data, but in reframing how data circulate and influence governance.

Impact Narratives connect data to decisions and decisions to mission. Rather than focusing solely on outputs, they prioritize articulating impact in ways that resonate across institutional roles and decision levels. Intentional Design for Institutional Effectiveness serves as the philosophical foundation for this work. Drawing on principles of clear assessment practice (Walvoord, 2023), community-based professional development (Biddix et al., 2020), and adaptive leadership (Hersey et al., 2022), it foregrounds empathy, clarity, alignment, and sustainability in assessment systems.

As institutions navigate enrollment volatility, accountability pressures, and heightened demands for transparency, the question is not whether assessment occurs but how it informs improvement and communicates value. By strengthening translation between evidence and action, the frameworks presented here position assessment not as a reporting cycle to complete but as a strategic instrument capable of guiding institutional coherence in complex environments.

Intentional Design for Institutional Effectiveness

Intentional Design for Institutional Effectiveness is a practitioner-developed framework grounded in the operational realities of assessment beyond traditional academic programs. While academic program assessment often follows established models centered on student learning outcomes, assessment in administrative and student support areas unfolds in more varied and less structured environments. This framework positions assessment as integral to institutional performance rather than adjacent to it.

Administrative and student support units, including admissions, financial aid, information technology, student life, and facilities, are central to institutional success. They shape the student experience and influence retention and completion, as documented in student success scholarship (Henning & Roberts, 2024; Tinto, 2012). However, their work is frequently assessed through operational metrics rather than formal learning outcomes.

This distinction introduces significant challenges. Unlike academic assessment, which centers on student learning outcomes that may also apply in some co-curricular settings (Duque & Weeks, 2010), assessment in administrative contexts often centers on efficiency, service utilization, stakeholder satisfaction, or unmet need. These measures are essential, yet they often require deliberate interpretation to communicate institutional contribution.

Additional barriers further complicate the work. Institutional politics, competing priorities, limited resources, fragmented data systems, and resistance to change constrain progress (Elkins, 2015). Practitioners are left to navigate ambiguity as they attempt to translate operational activity into institutional significance.

Intentional Design responds by centering clarity, alignment, and usability. It calls for assessment processes that are sufficiently structured to guide action, including defined feedback loops and explicit connections to planning cycles, yet flexible enough to accommodate diverse missions. It advances this work through three interrelated practices:

  • Address resistance with empathy. Acknowledging prior experiences with assessment and emphasizing formative improvement builds trust and shifts assessment from evaluation to partnership.
  • Simplify complexity through intentional design. Structured, user-centered processes and clear guidance (Walvoord, 2023), paired with capacity-building opportunities (Biddix et al., 2020), enable units to interpret and use their data strategically.
  • Elevate value through institutional alignment. When assessment findings are explicitly connected to strategic commitments and resource decisions, assessment becomes a driver of institutional action rather than a retrospective account of activity.

In addition, intentional design must incorporate attention to differential impact. Assessment processes should encourage disaggregation of data and explicit examination of how services affect different student populations. When evidence surfaces disparities, assessment moves beyond aggregate reporting and contributes to equity-informed strategy. By translating holistic aspirations into structured practice, Intentional Design advances contemporary calls for integrative institutional assessment (Wells, 2026).

The Power of Impact Narratives: A Foundational Philosophy

An Impact Narrative is more than a report. It is an evidence-informed account that clarifies contribution, guides decision-making, and communicates institutional relevance. Assessing administrative and student support units is fundamental to institutional effectiveness (Henning & Roberts, 2024; Tinto, 2012), yet consistent translation of operational value into institutional insight has been uneven (Elkins, 2015).

Impact Narratives respond through three commitments:

  • Radical empathy. Understanding colleagues’ constraints and priorities fosters partnership and increases engagement.
  • Intentional design. Transparent, usable processes ensure that evidence leads to action.
  • Institutional effectiveness. Aligning unit-level evidence with institutional priorities transforms documentation into institutional intelligence.

Together, these commitments position Impact Narratives as structured translation, moving evidence from local context to institutional consequence.

Crafting the Impact Narrative: A Practitioner’s Playbook

Assessment practitioners function as institutional translators and culture shapers. Through the framing of assessment processes, they influence how evidence circulates and informs strategy.

Embrace Radical Empathy: Uncover the Why

Impact Narratives begin with relational groundwork. Trust increases the likelihood that assessment conversations surface meaningful challenges rather than superficial compliance responses, a dynamic frequently noted in organizational change literature (Hersey et al., 2022).

  • Move beyond the data request. Clarify unit goals before introducing metrics.
  • Acknowledge constraints. Naming resource realities signals partnership rather than oversight.
  • Frame assessment as a solution. Position evidence as a tool for decision-making and resource alignment.
  • Create low-stakes engagement opportunities. Operational concerns, such as reducing student wait times, can serve as the foundation for an Impact Narrative that links service improvement to institutional outcomes.

When empathy anchors engagement, evidence is more likely to shape decisions rather than remain static documentation.

Design for Clarity and Action

Clear design ensures that assessment moves beyond documentation and directly informs decisions. When processes are overly complex or ambiguous, engagement may decline, and evidence may go unused. Intentional design emphasizes accessibility, structure, and actionable outcomes.

  • Simplify and scaffold expectations. Provide structured guidance that supports equitable participation.
  • Develop user-centered tools. Align templates with workflows and demonstrate how existing data can be interpreted strategically.
  • Offer modular training. Focused professional development builds analytic fluency (Biddix et al., 2020).
  • Prioritize action planning. Specific, measurable steps embed accountability and connect evidence directly to institutional decisions.

Connect Local Stories to Institutional Success

Assessment gains influence only when unit-level improvements are explicitly connected to institutional priorities. Practitioners play a critical role in translating unit-level evidence into institution-wide insight for decision-making.

  • Elevate the narrative. Reframe outputs as institutional contributions.
  • Foster collaboration. Structured cross-unit review reveals interdependencies.
  • Strengthen visibility and alignment. Institutional summaries and dashboards clarify how local improvements support strategic goals.

Imagine a registrar’s office identifying prolonged wait times during add/drop periods. Service data and student feedback reveal that delays disproportionately affect first-generation and transfer students. After redesigning appointment scheduling and reallocating staffing during peak weeks, wait times decrease. The resulting Impact Narrative frames the change not only as improved service efficiency but as reduced enrollment disruption and enhanced student persistence. Institutional leadership incorporates peak-period staffing into future planning discussions. What begins as an operational concern becomes a strategic retention intervention.

When local evidence is consistently connected to institutional direction, assessment informs planning, resource allocation, and strategic decisions. This translation work reflects broader movements in assessment scholarship advocating for integrative frameworks that align operational insight with institutional mission (Wells, 2026). When narratives clarify contributions across units, institutions gain not only documentation but visibility into systemic opportunity.

The Practitioner as the Narrative Architect

Assessment practitioners shape how evidence circulates and is interpreted within institutions. Through adaptive leadership (Hersey et al., 2022) and sustained cultivation of assessment culture (Schuh, 2013), they embed assessment within governance.

Practitioners contribute by:

  • Serving as strategic champions. Connecting operational improvement to strategic commitments.
  • Equipping units. Building sustained assessment capability (Biddix et al., 2020; Walvoord, 2023).
  • Modeling evidence-informed practice. Making the connection between data and institutional decisions visible.

When practitioners lead in these ways, assessment becomes embedded in governance rather than relegated to reporting cycles.

Conclusion: The Ongoing Story of Impact

The movement from compliance-driven assessment toward embedded, evidence-informed improvement continues to define institutional assessment work. The grand challenges of innovation, equity, and effective communication (Singer-Freeman & Robinson, 2020) are especially pronounced in administrative and student support contexts.

Impact Narratives provide a practitioner-level mechanism for realizing contemporary calls for holistic and integrated assessment (Wells, 2026). By reframing outputs as contributions and data as direction, practitioners reposition assessment as a strategic instrument rather than a reporting requirement. Because Impact Narratives connect operational evidence to planning cycles and resource allocation, they create conditions under which institutions can better anticipate capacity constraints and strategic risks rather than merely document past performance.

As institutions confront increasing complexity, assessment must illuminate impact, surface inequities, and guide adaptive decision-making. When evidence is translated into shared understanding, assessment becomes not merely a cycle to complete, but a compass guiding institutional direction.


References

Biddix, J. P., Collom, G. D., & Roberts, D. M. (2020). Scholarship, professional development, and community of practice in student affairs assessment. College Student Affairs Journal, 38(2), 157–171.

Duque, L. C., & Weeks, J. R. (2010). Towards a model and methodology for assessing student learning outcomes and satisfaction. Quality Assurance in Education, 18(2), 84–105. https://doi.org/10.1108/09684881011035321

Elkins, B. (2015). Looking back and ahead: What we must learn from 30 years of student affairs assessment. New Directions for Student Services, 2015(151), 39–48. https://doi.org/10.1002/ss.20136

Henning, G. W., & Roberts, D. M. (2024). Student affairs assessment: Theory to practice (2nd ed.). Routledge.

Hersey, P., Blanchard, K. H., & Johnson, D. E. (2022). Management of organizational behavior: Leading human resources (12th ed.). Pearson.

Schuh, J. H. (2013). Developing a culture of assessment in student affairs. New Directions for Student Services, 2013(142), 89–98. https://doi.org/10.1002/ss.20052

Singer-Freeman, K. E., & Robinson, C. (2020). Grand challenges in assessment: Collective issues in need of solutions (Occasional Paper No. 47). National Institute for Learning Outcomes Assessment. https://www.learningoutcomesassessment.org/wp-content/uploads/2020/11/GrandChallenges.pdf

Tinto, V. (2012). Completing college: Rethinking institutional action. University of Chicago Press.

Walvoord, B. E. (2023). Assessment clear and simple: A practical guide for institutions, departments, and general education (3rd ed.). Wiley.

Wells, N. (2026). The integrated advantage: Unlocking institutional potential through holistic assessment. Intersection: A Journal at the Intersection of Assessment and Learning. https://doi.org/10.61669/001c.157706