Alissa Quart offers up a nicely balanced take on the increasing prevalence of neuroscience in the humanities, more specifically in the literary field.

“The forces driving this phenomenon are many. Sure, it’s the result of scientific advancement. It’s also part of an interdisciplinary push into what is broadly termed the digital humanities, and it can be seen as offering an end run around intensifying funding challenges in the humanities. As Columbia University historian Alan Brinkley wrote in 2009, the historic gulf between funding for science and engineering on the one hand and the humanities on the other is “neither new nor surprising. What is troubling is that the humanities, in fact, are falling farther and farther behind other areas of scholarship.”

“Neurohumanities offers a way to tap the popular enthusiasm for science and, in part, gin up more funding for humanities. It may also be a bid to give more authority to disciplines that are more qualitative and thus are construed, in today’s scientized and digitalized world, as less desirable or powerful. Deena Skolnick Weisberg, a Temple University postdoctoral fellow in psychology, wrote a 2008 paper titled “The Seductive Allure of Neuroscience Explanations,” in which she argued that the language of neuroscience affected nonexperts’ judgment, impressing them so much that they became convinced that illogical explanations actually made sense. Similarly, combining neuroscience with, say, the study of art nowadays can seem to offer an instant sheen of credibility.”

This jibes with some of what I’ve written on the topic before. Here’s an excerpt from a Facebook discussion I had with a friend on this issue:

“As with the importation of continental critical theory in the late 60s and early 70s, literary scholars have persistently responded to some perceived lack in their discipline (at least in my opinion they have – others would most certainly disagree) by importing more rigorous/technical/”scientific” methodologies from other fields and disciplines. The current trend, sparked largely by Franco Moretti’s work (_Graphs, Maps, Trees: Abstract Models for a Literary History_), attempts to use techniques developed for Big Data and other computational experiments to tease out new conclusions about literary history (e.g. tracking the post-Darwin crisis of faith through the increasing prevalence of certain terms, mapping the spread of new concepts as they hop linguistic divides and enter new domains, etc.). As with the Big Data projects you are undoubtedly familiar with, there is a lot of work to be done before Big Conclusions can be drawn, or before we decide whether the whole endeavour was even worthwhile.

Dan Cohen, a digital humanities historian hired by Google to work with them on their Ngram project, spoke at UVA when I was there a few years back. It was exciting stuff. For many people in the room, he was demonstrating techniques that would save enormous amounts of time and effort (in some cases, years of labour). On the other hand, many in the room were instinctively skeptical that this offered anything of great value. Many literary scholars have become conditioned to expect occasional forays into their field from people promising to “fill the gap” that is perceived to exist there. This ranges from neurologists, to behavioral psychologists, to evolutionary biologists, to – in this instance – computer scientists. Everyone seems to think they have the cure for what ails literary criticism. For what it’s worth, I don’t think there’s a crisis in need of a cure.
Literary criticism can stand on its own merits and offers incredible value across a range of activities (a former prof of mine blogged about this here). I think the only thing that can be said to be “lacking” is the force of its own convictions, which is why I don’t apologize for the work I do or look to defend it from those who feel it lacks rigour or gravitas. But I don’t think I need to explain this to you, do I? You seem to instinctively understand the cognitive value of narrative structure. It permeates our entire way of understanding life (we are a storytelling/story-consuming creature). I would go so far as to say that a solid semiotic understanding of language is almost as important to a computer programmer as to an English professor. But, then, your mileage may vary.”