
Maps and the Search for the Buried Treasure of Assessment

April 18, 2016

Marie Miknavich

Ahoy Me Hearties!
In some respects, working in the assessment field is like a form of piracy: an endless search for a mythical “buried treasure”. If we could just find the right tool, the right approach, the right formula, then our assessment life would become instantly better and easier to manage, and all coherence/alignment problems would fix themselves. A good example of this is a discussion emerging inside my own institution regarding the costs and benefits of using curriculum maps vs. curriculum architectures. A curriculum architecture is simply a curriculum map that also lists course objectives for mapped courses so that those may be associated with specific programmatic goals. Our thinking at Utica College is that if we increase the specificity of our curriculum mapping, we’ll find the “buried treasure,” allowing academic programs to instantly see “wholes” and “holes” in their programs and address them accordingly. Since this is new ground for us, we gathered perspectives from outside our institution by inviting the members of the AALHE ASSESS listserv to participate in a short survey about the respective values of these tools.

The survey asked three open-ended questions:

  • Do you use “curriculum architectures” (like a curriculum map, but also listing course objectives under each mapped course)?
  • If you do use curriculum architectures, for what specific purposes (beyond those of a basic curriculum map) do you use them?
  • Compared to maps, is the frequency/work of updating architectures a lot more? If yes, are the benefits worth the time costs?

Blimey! What the Data Indicate
A total of 40 members responded. The majority of respondents (65%) indicated they did not use curriculum architectures for any purpose; the remaining 35% had used the tool in some capacity.


The survey did not ask for any information about the type or size of the responding institutions. (This is a notable limitation because, in assessment activities, size may matter: the size of a department and the resources available to department chairs and directors have a direct impact on the time available to devote to tasks like these.)

The second question addressed purposes, and the responses did not divide cleanly along the lines of the first question (in other words, some respondents who did not currently use the tool still answered the second question, for instance because they had used the tool at another institution). Respondents reported using the tool for a variety of both internal purposes (examining linkages between course and program goals; program review) and external purposes (accreditation).

Example quotes include:

“Accreditation and assessment.”

“Being forced to look at the linkage between course objectives and program goals brought a lot of discussion into what was actually happening with the course to measure the specific objectives thought to be mapped to a program’s goals. The mapping process resulted in some programs adding/changing program goals, some courses adding/changing course objectives, and cases where the assessment plan for course objectives were changed. It can be easy to say a course meets a program goal but it becomes more accurate to have to show how.”

“Program Review mandated by state board every 5 years; departmental revision of course learning objectives to meet General Education Objectives; faculty Curriculum Committee review of new course and program learning objectives.”

The third and final question addressed the actual cost/benefit aspects of one tool vs. the other. Here, there appeared to be a divide between institutions that actually use curriculum architectures and those that don’t.

Generally, respondents who were not using the tool could see where it might reap benefits, but had concerns about the ultimate purpose or payoff, and the potential workload.

Example quotes include:

“Architectures would need updating with every course and/or program objective/goal change. And if there are no changes to curriculum, it would need review/revision during the program review process. It depends on where the architecture is archived, also. On the web? In a digital folder? Those areas would need updating when the architectures are changed – which would be true for a curriculum map, too. Architectures seem to be one level deeper than the traditional curriculum mapping process. So there will be more work. The question is in #2, what are they used for? Why do you need this information? How will it be used to improve teaching and learning?”

“We do not go to this level of detail, so I can’t speak to it from experience; but thinking about the time it would take to not only garner faculty agreement but also to maintain the architecture, I would question it’s rate of return.”

“Updating architectures would be very time consuming, not sure I would get much buy in from faculty if required to do them.”

Respondents who were using the tool were in general more favorably disposed to the work/benefit balance of the tool, though some did voice concerns about the challenges of keeping such a document updated.

Example quotes include:

“More specific course level changes have to be tracked, but to us it is worth the time. However, it is challenging to keep updated.”

“Yes, updating architecture is more frequent, but it’s driven more by departmental faculty initiatives bottom up than by top down mandates and it enables much more flexibility in making changes across the curriculum through college-wide assessment and curriculum committees.”

“Everyone involved in creation of the course objective level mapping had very positive feedback and actions for improvement were taken. Sadly, as we currently have no tool other than Excel for the mapping, the maps have not been maintained. Course objectives are often tweaked and without a tool that is tied to a “master” data source of course objectives, no one has time to maintain the course objectives within the Excel maps. If we had a tool where the mapping was tied to a master data source, so changes in outcomes or goals could prompt the need to review the map, it would be well worth the time to continue this level of mapping.”

What Size Is Your Ship and Your Crew? Where Are You Sailing? And What Provisions Are on Board?
Reviewed as a whole, the comments raise questions that are endemic to the field of assessment and to the challenge of doing assessment well. These include:

  • What is your real purpose in using this tool? Is it just an exercise on paper or do you have a real plan for using the architecture to improve teaching and learning?
  • Are there appropriate resources available to maintain this tool so that it is useful to those who actually do the teaching?
  • Do you have technical solutions in place that can help foster the upkeep and use?
  • Do you have faculty buy-in and are faculty ready for this step?

No X Marks the Spot
While curriculum architectures do appear to have the potential to be useful, once again the buried treasure remains elusive, shaped by the unique mix of intent, resources, technology, and faculty at any given institution.

It’s at times like this, when I do not find the definitive answer, that I try to remember the most important thing: the treasure is not the map. Planning and organizing the process of learning and assessment is immensely important, but the real treasure is in the classroom.

Marie Miknavich is the full-time Director of Academic Assessment at Utica College in Utica, NY, and part-time assessment treasure-seeking pirate. Her previous professional ports-of-call include Director of Institutional Research at Herkimer County Community College and Education Data Analyst at a regional information center handling NYS 3-8 test data and analysis.

