Emerging Dialogues in Assessment
Showing What Students Have Learned to Do

Posted By Jamie Wigand, Tuesday, December 6, 2016

By Dr. David Dirlam

In the preceding Emerging Dialogue, Jean Downs raises the transformative possibility of higher education using more real-life, authentic learning experiences to “show what students know.” I read her posting right after reading the January issue of the Observer, the popular membership magazine of the Association for Psychological Science (APS). The last article in that issue was “Activate Active Learning,” written by Genevieve Henricks-Lepp, a fifth-year Ph.D. student. I grew up in a Dewey school noted for its active learning and carved out active learning programs for myself in college and graduate school. Teaching psychological science, for me, meant doing psychological science. Authentic learning and active learning are mutually enabling.

[Graphic: Authentic and Active Learning]

After I gained some experience as a teacher of psychological science, my students raised research questions every week in every pre-experimental class. They would ultimately use the literature to identify possible answers, design observations to decide between the possibilities, conduct the observations, analyze the results, write a careful description of them, and finally communicate the results both to other scientists and to the general public. Henricks-Lepp cites a Bruce and Bishop (2002) article on “using the web to support inquiry-based literacy development.” The article presents a “cycle of inquiry” that looks much like both my teaching approach and the ubiquitous assessment cycle drawn for accreditation agencies. The nodes in the article’s cycle are ask, investigate, create, discuss, and reflect. In their presentation at the 2015 Southern Association of Colleges and Schools–Commission on Colleges (SACSCOC) annual meeting, Drs. Rodriguez and Frederick from the University of North Carolina at Charlotte identified the steps of the assessment cycle as 1) identify outcomes, 2) select and design measures, 3) plan for data collection and implement measures, 4) analyze data, and 5) use results to improve student learning.

[Graphic: Assessment cycle (Rodriguez and Frederick)]

Henricks-Lepp identifies a few techniques for stimulating active learning. Think-pair-share has students reflect on an instructor’s question for a minute and then share their thinking with another student for two minutes. 3-2-1 has students pick three ideas from the day’s class, create two examples of them, and pose one question about the material. A third activity has students draw pictures related to the topic and organize their group’s drawings into a meaningful collage. My passion for active learning drove my choices to take my students to the Carolinas Psychology Conference and to professional conferences like APS, EPS, and SSPP, to become a charter member of APS and the very first paying member of AALHE, and above all to be involved in creating science. But Jean Downs’ question points to two grand flaws in assessment cycles that interfere with such active learning.

The first grand flaw in the (usually annual) assessment cycle is that its timing has almost nothing to do with the activities Henricks-Lepp identifies. No matter how good an annual assessment design is, it is so far removed from such learning activities that it is extremely unlikely to identify their impact. Even though active learning is the most enduring sort, annual assessment is therefore bound to fail in its goal of improving the learning process. Peggy Maki has identified the solution to this flaw: a real-time assessment model. To affect instruction, instructors must assess student learning while they interact with students.

The second grand flaw in the assessment cycle is that it focuses on individual accomplishments while education needs to benefit a collaborative society. My students usually had two or more courses with me that involved weekly class sessions in which they posed their individual research questions to the rest of the class. The active learning techniques described by Henricks-Lepp might have helped my students converge on a good research project more quickly than they did, but for such techniques to be most effective, there must be extensive multi-course follow-up. My students from a tiny college in an impoverished area of southern Appalachia received national recognition for their research productivity. The question-posing classes that inaugurated that productivity worked because the students knew that one of their questions would eventually grow into a year-long project that they would present at a conference. There was not only activity, but also a personal vision that they were proud to share with other students. These visions took us across the region to academic libraries, mental health hospitals, wildlife sanctuaries, wilderness hikes, schools, physicians’ offices, and homes. Every student’s project was their own creation and unique to them. The projects were much more fun than anything I could have made up for them to do.

[Graphic: Annual Assessment Bound to Fail]

No student in these classes was allowed to simply replicate work from the literature or simply collaborate on another student’s project. But this very requirement set up a whole new dynamic in the program. Students helped each other create and implement one another’s visions. The field trips were seldom solo and often involved the whole class. Students presented their work to whole classes on numerous occasions at every step of the way, and, just like professional scientists, they helped each other make improvements. The collective success of these students’ individual projects is the solution to the second grand flaw. There was no need to “show what students know.” Every student who was accepted and presented at a regional or national conference earned an “A,” and in my last experimental class at that college, 15 of 16 students achieved an A. Moreover, nobody from the college or an accreditation agency ever questioned whether they had earned it. Their accomplishments showed what they could do.

Such projects are not unique to psychology. I received a Facebook message from a student who, more than four decades ago, helped found a laboratory nursery school with other students, another professor, and myself. The school is still going, and the student went on to a diverse career in social welfare. Former colleagues at Virginia Wesleyan, Kathy Stolley and Diane Hotaling, have collaborated with students to create one of the few on-campus homeless shelters in the country, which has operated every January for nearly a decade. Tom Gattis, now at Columbus College of Art and Design, collaborated with students to design and build a fiberglass family power boat. As soon as students graduate or otherwise leave higher education, authentic projects are everywhere. If we bring such projects back in, the key assessment question will become “how should we show what students have learned to do?”

Finally, even if we drop the requirement of a unique vision for each student, we can still evaluate by “showing what students have learned to do.” That is exactly what developmental rubrics are designed to accomplish. But that is a topic for another posting.

David K. Dirlam, PhD

The author is the first paying member of AALHE and has served on the member services committee since its inception. He is currently under contract with Routledge (Taylor & Francis Group) to complete, by September of this year, a book called Teachers, Learners, Modes of Practice: Theory and Methodology for Identifying Knowledge Development. The book provides usable results from 50 years of work on the topic.

References and Further Reading

Bruce, B. C., & Bishop, A. P. (2002). Using the web to support inquiry-based literacy development. Journal of Adolescent & Adult Literacy, 45(8), 706–714.

Halpern, D. F., Graesser, A., & Hakel, M. (2007). 25 Learning principles to guide pedagogy and the design of learning environments. Washington, DC: Association for Psychological Science. Retrieved from https://louisville.edu/ideastoaction/-/files/featured/halpern/25-principles.pdf

Maki, P. (forthcoming). Assessing Your 21st-Century Students in Real-Time. Stylus Publishing, LLC.

*Assessment cycle graphic created by J. Frederick and B. Rodriguez for Miami Dade College. Drs. Frederick and Rodriguez are currently affiliated with Broward College.
