Sunday, February 4, 2018

Unit 2 Reflection: Finding the Assessment Sweet Spot

Much of the material over the past two weeks challenges traditional scientific methods for assessing student learning. Trends in educational assessment are moving away from evidence-based objectivism and toward shared learning between students and teachers, supported by a cognitive model in which students learn through meaningful engagement with activities that challenge them to apply their learning to solve situational problems.

I see this trend in medical education as well, where medical students now begin clinical training in year 1 of medical school. At this point they are mostly shadowing providers and writing papers, but by modeling practitioners they learn how the knowledge they are acquiring is applied in the practice of medicine.



http://phoenixmed.arizona.edu/education/md-admissions/md-program/curriculum/pre-clerkship-curriculum-years-1-and-2

Accreditation bodies for medical and other programs now require elements of interprofessional education and practice, meaning students must have exposure to, and exhibit competencies in, working with other health professionals as team members. The assessments used to measure such competencies are left to the individual schools, but from my experience with a few projects, one local college uses an assessment from the medical preceptor as a summative assessment post-rotation. To me, this misses a significant learning opportunity: the student is never engaged in the assessment process and cannot modify behaviors to meet the benchmarks that rate the competencies.

Sticking to the areas of influence I have over the education programs I run, I am challenged by the movement away from scientific models of assessment as a means of gathering generalizable data to report on education trends and effectiveness. As an educator, my primary responsibility is to ensure students are not only learning, but also gaining skills in how to apply knowledge in the work environment. I have to employ a battery of assessments, as discussed in the exploratory activities, to address multiple needs, including:

  • authentic assessment: to ensure students meet measurable standards of knowledge on the foundational blocks necessary for selection into job training and then into clinical training
  • dynamic assessment: to ensure students master the medical documentation skills to be applied in clinical training. Timely, personalized feedback is provided to students for process improvement, in both classroom and simulation environments.
  • competency assessment: students must demonstrate the ability to perform essential tasks in the clinical environment to be eligible for employment. Students receive a peer assessment at the end of each training shift and are given support to improve weak areas. Often the relevant knowledge was covered in the theory portion of the education, and students are told to "freshen up" on the materials.

I would love to explore ways to add more playful assessments as a way to introduce friendly competition, giving students an incentive to learn the drier aspects of the training, like medical terminology and medications. We have added Blackboard badges and certificates as summative awards for successfully passing certain lessons, but having intermediate challenges along the way would add additional benchmarks for motivation. I need to create a plan and seek approval from leadership, especially if it adds more time to an already lengthy training program.



4 comments:

  1. Great post Jeff! I too would love to incorporate more playful assessments as a fun way to test my students' knowledge. I enjoyed reading about the similarities in medical education that you have witnessed. It is always interesting to hear another perspective, thanks for sharing! :)

  2. Good reflections Jeff. Historically, medical and military education folks have led many of the important innovations in education. Problem-based learning, case-based learning, simulation-based learning, etc. were developed by medical education groups and underlie your instructional practice. That makes sense, because your field has a very clear instructional goal, i.e., training students to become professionals with the knowledge, attitudes, and behaviors that professionals actually use. Thus, providing a realistic/authentic learning environment is of critical importance to such professional education. However, I always question the authenticity of assessment in such professional programs. Such questions include: to what extent does the assessment practice involve such authenticity; how do the results of the assessments provide evidence of student competencies in a real-world context; and are the assessments really designed to capture the extent to which candidates apply their knowledge in a realistic context? I know many medical programs use simulation to measure students' real-world competencies, much like the flight simulators of aviation training programs. In the teacher education field, many institutions also develop and use such simulation tools for pre-service teachers' practicum evaluation. Those are good examples of establishing the authenticity of an assessment, but there are many other ways to do so. Thus, the big question we need to think about as instructional/assessment designers is this: how can we provide realistic assessment environments?

  3. Hi @bkworld. I just saw this reply. Your challenge about realistic environments is so true. We created patient/provider interview case studies that were technically accurate but did not reflect day-to-day workflow. Just recently we developed new videos adding nuances that reflect real-life encounters, such as a patient holding a crying baby. So far, we have found that such updates are better preparing the students.
