About Me

I'm a radiologist and writing helps me make sense of the world.

"My method is to take the utmost trouble to find the right thing to say, and then to say it with the utmost levity." - George Bernard Shaw

Saturday 8 May 2010

Heterodoxy and Hegemony Part 2

Hegemonic Assumption #1 - More Structure in Education is Better

Although we understand much about how adults learn, we are all unique learners. There are a multitude of educational opportunities and media now available for radiologists to learn or stay up to date. Each of us has preferences in how we actually go about learning. Similarly, the factors that motivate us to learn are complex, tacit and often deeply embedded in unique social and personal contexts. We have different pre-existing knowledge and skills. Some of us learn certain notions or techniques more rapidly than others. Furthermore, we are adaptive learners; we will adopt whatever learning style necessary to achieve knowledge or competency in the fastest possible time.

We also know that good postgraduate education should aim to enrich, stimulate & enthuse. It should be flexible enough to encompass disparate motivations, personalities, learning styles and rates of learning.

However, the more structure there is in education, the less flexible it becomes. Hence, few find that a heavily guided or prescribed course suits them; this applies to trainees and trainers alike. Thus educational rigidity promotes mediocrity, as it is difficult to excel when shoehorned into an inflexible “one-size-fits-all” structure.

I am not advocating abandoning high educational standards. Quite the reverse. But I am saying that it is the end product of a competent or expert practitioner that is crucial. We should be very clear about the standards we expect to be reached or maintained. And we should assess trainees very carefully to see that they have done so. I am also saying that the method used to arrive at such competencies should not be mandated. Sure, suggest a framework of how one might become competent but recognize that the route taken to get there varies hugely between individuals.

Like the sound of this? Well, like it or not, these are the principles of competency-based training. Radiologists should embrace them to take radiology training into the next century.

Hegemonic Assumption #2 – Counting Cases is Valid

Another assumption is that competency comes with repetition. With a largely motor task, there is some validity in this. Perform a physical task a number of times and you get slicker. For interventional techniques consisting largely of psychomotor tasks, such repetition is important. However, complex non-motor tasks such as the visuo-perceptive skills of a radiologist don’t necessarily follow this.

There is also the concept of getting your “film miles”. This is the ill-explained, non-foveal visual expertise, acquired after seeing tens of thousands of chest radiographs, that enables you to spot left lower lobe collapse in 0.01 seconds. The concept also encompasses the million and one radiographic factors, normal variants and common abnormalities that become second nature.

So, yes, experience counts for a lot. Without experience, book-based knowledge remains just that. But experience must be varied, broad ranging, of varying difficulty or complexity and reinforced by feedback that comes from opinions of other experts, clinico-pathological correlation or follow-up. This well-established cycle of experience, learning and reflection ensures that the trainee moves linearly and safely from conscious incompetence to conscious competence.

Thus, simple repetition alone is potentially harmful. It induces confidence but basic errors may be perpetuated and there is no guarantee of expertise developing. Conscious incompetence may become unconscious incompetence. Not knowing the boundaries of your expertise is a route to medico-legal hell and damnation.

Much of radiology training focuses on producing a logbook detailing sufficiently high numbers of films reported, procedures performed and so on. It ignores and hence sidelines other sources of learning such as consultant-led teaching, film and web-based archives, textbooks and journals, courses and conferences.

Why does training focus so much on numbers? Let me introduce you to the McNamara fallacy, which can be described as follows:

"The first step is to measure whatever can easily be measured. This is OK as far as it goes. The second step is to disregard that which can't be easily measured or to give it an arbitrary quantitative value. This is artificial and misleading. The third step is to presume that what can't be measured easily really isn't important. This is blindness. The fourth step is to say that what can't be easily measured really doesn't exist. This is suicide."

So, counting numbers is easy but crude, as it captures only one aspect of radiology training. It lacks validity: it ignores the quality of the experience, does not assure competence and overlooks other methods of learning. Some have said that such “educational bean-counting” demeans professional training.

However, I am much, much more worried about the Law of Educational Cause & Effect. This states:

“For every evaluative action, there is an equal (or greater) (and sometimes opposite) educational reaction”

Sometimes called consequential validity, it explains the occasional bizarre and/or maladaptive behaviour that trainees exhibit in the run-up to some form of assessment. Phrased metaphorically, “the assessment tail wags the curriculum dog” or more crudely, “grab the students by the tests and their hearts and minds will follow”.

Radiology training becomes a primary quest for achieving logbook numbers. The totality of experience is short-circuited by this lust for padding out the columns. Less scrupulous trainees guesstimate their numbers, leading to some pretty inaccurate figures. Some frankly dishonest trainees may “join the dots”, implying they have done far more than they really have. Conversely, scrupulous trainees who have recorded everything in meticulous detail may have logbooks that look a bit patchy. Yet independently verifying logbook numbers is virtually impossible. So, as an assessment method, it is not only crude and educationally invalid but easily fooled, and therefore neither sensitive nor specific.

What next then?

What is required in radiology now is bravery to question orthodox modern radiological training and its hegemonic assumptions. Be warned that the medico-political climate is against deregulation of any form. A degree of regulation is sensible and prudent but we must recognize and fight professionally demeaning regulation. First though, we need to get our own house in order. The supremacy of logbook numbers has had its day. Witness the dawn of the era of competency-based training; let us welcome it to our hearts.

1 comment:

  1. Hi, Paul. Have just read your post about the invalidity of "counting cases". I love the phrase:
    “grab the students by the tests and their hearts and minds will follow”.
    I just want to say that this applies to the whole education system in the UK as far as I can see. Thank you for providing such good food for thought. Nan P.
