EMERGING DIALOGUES IN ASSESSMENT
Maps and the Search for the Buried Treasure of Assessment
April 18, 2016
Ahoy Me Hearties!
The survey asked three open-ended questions:
Blimey! What the Data Indicate
The survey did not ask for any information about the type or size of the institutions associated with the responses. (This is important: in assessment activities, size may matter. The size of your department and the resources available to department chairs and directors can have a direct impact on the time available to devote to these tasks.)
The second question addressed purposes, and the responses did not divide cleanly across the results of the first question (in other words, respondents who did not use the tool still responded to the second question, for instance if they had used the tool at another institution). Respondents reported using the tool for a variety of purposes, both internal (examining linkages between course and program goals, and program review) and external (accreditation).
Example quotes include:
“Accreditation and assessment.”
“Being forced to look at the linkage between course objectives and program goals brought a lot of discussion into what was actually happening with the course to measure the specific objectives thought to be mapped to a program’s goals. The mapping process resulted in some programs adding/changing program goals, some courses adding/changing course objectives, and cases where the assessment plan for course objectives were changed. It can be easy to say a course meets a program goal but it becomes more accurate to have to show how.”
“Program Review mandated by state board every 5 years; departmental revision of course learning objectives to meet General Education Objectives; faculty Curriculum Committee review of new course and program learning objectives.”
The third and final question addressed the actual cost/benefit aspects of one tool versus another. Here, there appeared to be a divide between institutions that actually use curriculum architecture and those that do not.
Generally, respondents who were not using the tool could see where it might reap benefits, but had concerns about the ultimate purpose or payoff, and the potential workload.
Example quotes include:
“Architectures would need updating with every course and/or program objective/goal change. And if there are no changes to curriculum, it would need review/revision during the program review process. It depends on where the architecture is archived, also. On the web? In a digital folder? Those areas would need updating when the architectures are changed – which would be true for a curriculum map, too. Architectures seem to be one level deeper than the traditional curriculum mapping process. So there will be more work. The question is in #2, what are they used for? Why do you need this information? How will it be used to improve teaching and learning?”
“We do not go to this level of detail, so I can’t speak to it from experience; but thinking about the time it would take to not only garner faculty agreement but also to maintain the architecture, I would question its rate of return.”
“Updating architectures would be very time consuming, not sure I would get much buy in from faculty if required to do them.”
Respondents who were using the tool were in general more favorably disposed to the work/benefit balance of the tool, though some did voice concerns about the challenges of keeping such a document updated.
Example quotes include:
“More specific course level changes have to be tracked, but to us it is worth the time. However, it is challenging to keep updated.”
“Yes, updating architecture is more frequent, but it’s driven more by departmental faculty initiatives bottom up than by top down mandates and it enables much more flexibility in making changes across the curriculum through college-wide assessment and curriculum committees.”
“Everyone involved in creation of the course objective level mapping had very positive feedback and actions for improvement were taken. Sadly, as we currently have no tool other than Excel for the mapping, the maps have not been maintained. Course objectives are often tweaked and without a tool that is tied to a “master” data source of course objectives, no one has time to maintain the course objectives within the Excel maps. If we had a tool where the mapping was tied to a master data source, so changes in outcomes or goals could prompt the need to review the map, it would be well worth the time to continue this level of mapping. “
What Size Is Your Ship and Your Crew? Where Are You Sailing? And What Provisions on Board?
No X Marks the Spot
It’s at times like this, when I do not find the definitive answer, that I try to remember the most important thing: the treasure is not the map. Planning and organizing the process of learning and assessment is immensely important, but the real treasure is in the classroom.
Marie Miknavich is the full-time Director of Academic Assessment at Utica College in Utica, NY, and part-time assessment treasure-seeking pirate. Her previous professional ports-of-call include Director of Institutional Research at Herkimer County Community College and Education Data Analyst at a regional information center handling NYS 3-8 test data and analysis.