Wednesday, May 16, 2012

Now You Argue It

I am writing this blog entry in my hotel room in Covington, Georgia, getting ready for the two-day workshop I will be leading tomorrow and Friday at the Institute for Pedagogy in the Liberal Arts at Oxford College of Emory University. I am grateful to Jeff Galle for inviting me down here and giving me the opportunity to spend two days with a couple of dozen faculty members talking about Mind-Based Teaching and Learning.

I became interested in this topic—how research from cognitive theory can help us better understand the teaching and learning transaction—last summer, as I was doing research for my current book project, and first began writing about it in The Chronicle of Higher Education late last year. My research on and fascination with this topic have continued to grow, and I’m looking forward to sharing it with others and inviting them into the conversation.

On the plane ride down here I finished a book that I have been meaning to read for a while now, and finally got around to this past week: Cathy N. Davidson’s Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn (Viking, 2011). Davidson uses the famous video of the basketball players and the "invisible" gorilla to advance the idea that attention blindness—the fact that when you focus intently on one thing (such as counting basketball passes), you miss lots of other things (such as a gorilla walking across the court)—is the fundamental structuring principle of the human brain, and that it provides us with an exciting opportunity to build collaborative knowledge networks. You count the basketball passes, she suggests, and I will watch for gorillas. Together we’ll overcome our individual attention blindness and see the bigger picture.

Davidson seems intent on critiquing the many studies that have been conducted and published on the question of multitasking. The vast majority of those studies suggest that our brains are not very good at it: we tend to perform any given task more effectively when we focus our attention on it alone than when we are switching among multiple tasks at once.

Nonsense, says Davidson. If we focus on one task, we miss the gorillas. And, she points out in the final pages of the book, the human brain never really concentrates on one task. Close your eyes for five minutes in a dark room and try to concentrate on one thing, and you will see how easily and continually your mind wanders from one thing to the next. Instead of criticizing this habit of mind, and condemning multitasking, she suggests we should embrace multitasking as the modus operandi of the digital age, and figure out how to do it as effectively and collaboratively as possible.

This book, I will now confess, drove me a little crazy. On the one hand, I find her argument an innovative one, and I love the positive attitude she has toward the digital age. She is an excellent writer, and can weave together seamlessly a wide range of nonfiction forms: literary and cultural critique, scientific research and reporting, personal profiles, personal narratives, and more. The book could serve as an excellent primer for aspiring nonfiction writers on how to think creatively about the nonfiction book form.

And she brings to light many innovative thinkers and teachers who are doing outstanding work in the digital age. I was blown away by her portrait of a Danish entrepreneur who has hired members of the Autistic and Asperger’s communities to work on complex computer tasks, and has found an excellent fit between these individuals and a working environment conducive to their particular intellectual strengths. The book features a dozen or more profiles of such innovative thinkers, and Davidson deserves high commendation for showcasing their work to a broader public.

And yet the book’s overall recommendation seems to me—and to multiple reviewers from the fields associated with cognitive theory—to stem from some fast and loose playing with brain science. Cathy Davidson’s original field, like mine, is English literature, so it doesn’t strike me as a great idea for me to detail the flaws of her brain science for you. Much of what she said, however, conflicted with most of what I had read in this area, and I was startled enough to jump online and read some reviews. Those reviews confirmed my own suspicions, and so I will let the cognitive theorists tell you why Davidson seems to, as one brain science reviewer puts it, cherry-pick ideas from the “mall of brain science.” You can read another version of this same critique of the book from Christopher Chabris, one of the creators of the invisible gorilla experiment, in the New York Times.

I will add two critiques to what you can find in those reviews. First, Davidson argues that education needs to focus more on developing the skills of imagination, collaboration, creativity, communication, and so on, and that the best way to develop those skills is to construct classroom environments that resemble video games. Students who interact and learn skills in complex, multi-player gaming environments, the argument goes, will carry those skills into the complex digital age in which they will be living and working after school.

Except, unfortunately, plenty of studies show that one of the most challenging features of learning—another one, like multitasking, that we don’t seem to be very good at—is transferring knowledge and skills from one environment to another. As Susan Ambrose and her colleagues write in How Learning Works, “most research has found that (a) transfer [from one learning context to another] occurs neither often nor automatically, and (b) the more dissimilar the learning and transfer contexts, the less likely successful transfer will occur. In other words, much as we would like them to, students often do not successfully apply relevant skills or knowledge in novel contexts” (108).

So the argument that students will transfer the skills they are learning in World of Warcraft or any other video-game environment into the business world they will be inhabiting in a few years just does not seem to be well-supported by the research on human learning. Davidson gives blithe descriptions of the transfer she imagines will happen from video games to the real world: “game playing makes gamers more able to respond to unexpected stimuli: a new antagonist appearing on your digital visual field, a child running out into the road in front of your car, or a sudden drop in your company’s capitalization” (149). Research on learning and transfer suggests that game playing makes you better at game playing—so yes, it will help you identify a new antagonist in the game. But to imagine that that skill will transfer to a better ability to see a drop in your company’s capitalization is a massive leap, one unsupported by the research on human learning.

Finally, I am concerned that Davidson is engaging in some ladder-pulling here. Davidson earned a Ph.D. in the pre-digital age, has written many books of traditional scholarship, and published her argument in a 300-page book of dense (but always lively and readable) prose. She clearly has the ability to focus and pay attention when she needs to. In fact she notes in her acknowledgments the time she spent out walking in the woods and the “divine month of quiet and beauty” she spent in Italy doing research. The multitasking students to whom these arguments are addressed, it seems to me, are less and less likely to have that kind of training in their background, and hence may end up much less able than Davidson would be to perform the kinds of sustained-attention tasks that still are required for many contemporary jobs.

So what concerns me is that when we (i.e., faculty) tell our students to go ahead and multitask, we are doing so from the perspective of people who spent our careers learning to focus our attention for sustained periods of mental effort. And while we may now see multitasking as an interesting alternative to that sustained attention, I wonder about the extent to which we continue to rely on that capacity for focus in ways that may be less and less accessible to our students.

With those reservations in place, I want to conclude by saying that I am glad I read the book, and that I would recommend it to you as well, dear reader. I will concur with what other reviewers have said: I wish she had simply made the case for a bold new vision of education without trying to layer all of the brain science on top of it. I think she does an excellent job of promoting the work of innovative educators and schools, and that would have been good enough for me.

Although I am usually pretty terrible at participating in conversations started by my blog posts or Chronicle columns, I promise to join in any conversation that starts up below. I’d be especially interested to hear from other readers of Davidson’s work, or other researchers in this area.


  1. I had the same reaction to Davidson's book, actually. The impression I had was that she began with the conclusion that technological innovations and other kinds of "disruptions" (although that term is quickly becoming jargon) are good for higher education, and then went out to find the cognitive neuroscience that might back up the assertion. The trouble is, as you say, that she seems to pick and choose the science, and shies away from anything that might contradict her fundamental argument. I appreciate the book as a contribution to the discussion about education today, and (like you) will recommend it to others, but with a definite caveat. Great post!

  2. I think you're right about the "disruptions" as the conceptual heart of the book--and I think you are right that the term is becoming jargon. But as you say, and I agree, I think it provides a good contribution to the discussion, even if I don't agree with much of it.