Thursday, 14 August 2014

Disgusting Distaste

What! A blog! Never!

Okay I know it's been a while but it's not you, it's me, I just needed some space and now I'm here to beg forgiveness. Now that's out of the way, let's crack on.

My research, simply put, is into the moment a passion underwent a mutation: when disgust came into being from its parents – abomination, aversion, and horror. It wasn't so much a new idea or concept as the creation of a separate category from within an earlier one.

Let me clarify. Abomination and aversion were wide passions. They covered everything that was the opposite of desire. The need to avoid rotting meat lay alongside the need to avoid a crack in the pavement in the emotional lexicon of the time. Both, it seems, were understood as part of the same passion. What I am working out is how and why the first part, the icky rotting-meat-avoidance part, got its own lexicon around the mid-eighteenth century and became the premier league passion disgust, while abomination, aversion, horror etc. were relegated to the championship.

Interestingly, a similar movement may now be taking place. Ever since John Florio introduced the English to the word 'disgust' in 1597, it has been associated with 'distaste', although the word distaste is not much older, part of the canon of 'inkhorn terms' introduced into an English struggling to express itself adequately. Nevertheless, 'distast[e]' remained the staple definition of 'disgust' in dictionaries for around one hundred and twenty years, alongside 'dislike'. Seemingly, the dislike part became linked to the distaste part, creating a useful word that related to abominations of taste.

This idea of the orally horrible is still found in modern ideas of disgust: Paul Rozin and his colleagues claim 'oral fixation' as part of a universal 'core disgust'. The problem is, this oral component does not have to be present. Moral and visual disgust often exist outside this masticatory framework. Many thinkers in the field, from disgust pioneer Aurel Kolnai to the oft-cited William Ian Miller, see other senses as primary to disgust, claiming that the oral fixation is more a hangover of the word's etymology and initial use in aesthetic theories of taste.

It appears that others are starting to agree with the anti-mouth heretics. I have noticed the increasing use of the word 'distaste' as an emotion separate from 'disgust' in the psychological literature. The difference is often spelled out explicitly, with distaste taking the mantle of the orally disgusting, and disgust covering the rest. I can't help but wonder: is another mutation taking place?

Science, and so perhaps the culture it is situated within, appears to be starting to differentiate a subset of disgust into its own entity, and this is exciting. Not only because I may be able to examine a process I am studying in the past happening in real time, but because it asks all sorts of questions of the assumed universality of disgust. Is distaste universal but disgust not? Are both? Are neither? Is distaste an emotion, as some maintain, or simply a physical sensation like pain? If so, what is disgust? Is it feelings of horror, abomination, and aversion that can be brought about by distaste-like sensations? Were the early moderns right all along? What does this mean for the study of phobias etc.?

To answer these questions, I may need to read shiny new papers just as closely as dusty old books and archive documents, once again showing how the understanding of lost emotions can, sometimes, be assisted by understanding the found.

I,  for one,  think it's rather exciting. But then, I'm a disgust nerd.

Saturday, 22 February 2014

Renaissance 2.0: Digitising the Pursuit of History

I’ve just been reading Timothy J. Reiss’s book Knowledge, Discovery and Imagination in Early Modern Europe: the Rise of Aesthetic Rationalism (CUP 1997), as I’m currently delving into the murky world of whatever passed for aesthetic thinking prior to the eighteenth century. This is an interesting book, suggesting that one of the reasons for the big intellectual and social changes that took place in the Early Modern Period was a change in focus from the Bachelors’ studies of the liberal arts of the Trivium (grammar, logic, and rhetoric/dialectic) to the Masters’ studies of the Quadrivium (arithmetic, geometry, music, and astronomy). He notes that a new skill began to be applied out of these latter studies that would become central to them: mathematics. Reiss focuses on the 1520s, claiming that the majority of university education before this point prioritised grammatica as the foundation of the others, placing language on a lofty pedestal from which the word was with everything and was everything. The cosmos and the word (and so God) were one and the same. This obsession with the word played no small part in the rise of Humanism, fuelled as it was by a desire to find the perfect word, the most pure Latin, the closest translations, the oldest manuscripts and, soon after, the best use of vernaculars, in order to better read the books of God (bible and nature).

The problem began when language reached its limits. Language may be the way in which a culture understands its cosmos, but it also has constraints built into itself. Anything it cannot describe it, well, cannot describe. What was called res – a Latin word meaning ‘stuff’ or ‘things’, but also understood as ‘non-linguistic entities’ – lay beyond the edges of grammatica. As a result, language became increasingly complex and convoluted in its attempts to grapple with the big questions until, as Reiss puts it, “older theological and ontological knots combined with sociopolitical and epistemological ones to create an impasse” (Reiss, p. 2). This weighed down the pursuit of knowledge in the Trivium, while the introduction of new forms of mathematics into the Quadrivium enthused them and those practical arts, such as medicine and natural philosophy, which were studied after them.

Initially, Petrus Ramus used method: a form of mathematics that he and the Ramist movement, so influential on later thinkers such as Descartes, deployed to restructure the pursuit of wisdom into organisational trees. Ramus’s contribution was one of methodological organisation rather than discovery. As natural philosophy became more prominent in the Quadrivium, and the grammatical base’s ability to use natural language as a means of discovery fell away, mathematics became the new means of discovering knowledge. What is important is that this was not simply a rational or reasoned use of mathematics, but one that could explain music through theory, poetry through metre, and even understandings of related passions and actions (which is where I get more interested).

The idea that mathematics was the final cutting of the Gordian knot of discovery is one of a series of suggestions as to why the Aristotelian Organon had begun to be rejected in favour of new ideas. Some suggest a deep change in mentality from a world focused on hearing and understanding to one focused on sight and discovery. Others suggest that the shock of discovering both the New World and that your head didn’t explode from the heat if you kept going south, followed by the discoveries of Copernicus, Kepler, and Galileo, began a process of questioning other assumed ideas; a realisation that quite often the ancients were wrong. All of these things, coupled with economic, environmental, and population changes and the resultant geopolitics are, I suspect, part of it. Not only is it foolish to assume a single cause, in my opinion, but also we just don’t have the information. Yet.

But this is not what I am interested in. I am interested in the idea that natural language reached an impasse. That a time came when no more discoveries could be made by word alone, and it turned out that the word was not God. Works became convoluted and overcomplicated, and the description of the cosmos as text was almost impossible. No matter how perfect the Latin, or vernacular, there remained the undiscovered raw signified.

I am getting a little ahead of myself.

One of the interesting things about history is that it is not a grand narrative, or a teleological series of causes and effects marching on in progression to now. There are continuities, sure, but why things might continue is as curious as why things discontinue, and both of these involve overlap, loss, discovery, rediscovery, and reconstruction. If history gives the illusion of teleology, it is because often comparisons between now and then, or between discontinuities and continuities, provide us with a tool to examine history in some detail. Then I wondered: what if something like a reconstruction or rediscovery is happening now?

TEXTUAL HEALINGS

Without going too far into the tedious and complex (oh so tedious and oh so complex) details of poststructuralism, the very basic premise is this.

Everything we describe has a referent (the ‘thing’), a signifier (the label or word used to describe the ‘thing’), and a signified (the concept we attach to the thing). Take a door. That wooden thing behind me (referent) is called a "door" (signifier) and is conceptually something that I use to get in and out of my room (signified). The signifier and signified come together to form a ‘sign’, because without both, you just have an extra-linguistic referent that cannot be understood. A word without a concept or meaning is just noise; a concept without a signifier is something unknown.

Signifiers need not be words. Road signs are a classic example of signifiers that are not words. They still communicate something, but not through a natural language. Mathematics could also be included as signs. Indeed, maths is the harshest type. An incorrect equation is almost by definition a signifier without a signified.

Here is the problem identified by poststructuralists, as simply as I can put it. In order to describe the signified – in the case of the door that is ‘something that I use to get in and out of my room’ – you need another set of signifiers. ‘Something' can be signified as ‘an object’, ‘I use’ as ‘an action by the self for practical purposes’, ‘out’ as ‘not from within’, and so on. This forms another set of signs, all of which have their own signifiers and signifieds and on it goes. It ultimately all becomes text; how else do you describe the signified of a road sign? 

According to Derrida, you can take these layers of signifiers, deconstruct them, and then reconstruct a different but just as valid sign. So the signified of a door could be reconstructed as  ‘an action utilized by the self that is not from within an object’, altering the overall sign. This is a very, and I mean very, simple example of reconstruction (but at least you can see where the indecipherable language of poststructuralists comes from!).
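The regress of signifiers is easy enough to mechanise. Below is a toy sketch in Python, with an entirely invented mini-lexicon (nothing here comes from Derrida): expanding any signified only ever produces more signifiers, never a final extra-linguistic bedrock.

```python
# A toy illustration of the regress of signifiers: every definition
# is itself made of signifiers that have their own definitions.
# The vocabulary below is invented for the example.

lexicon = {
    "door": "object used to pass between spaces",
    "object": "thing that exists",
    "used": "employed for a purpose",
    "thing": "entity",
}

def expand(word, depth):
    """Replace a word with its definition, recursively, up to `depth` layers."""
    if depth == 0 or word not in lexicon:
        return word
    return " ".join(expand(w, depth - 1) for w in lexicon[word].split())

print(expand("door", 1))  # object used to pass between spaces
print(expand("door", 2))  # thing that exists employed for a purpose to pass between spaces
```

However deep you expand, you only ever land on more words; the recursion has no non-linguistic base case, which is the poststructuralist point in miniature.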

One of the most pressing problems with poststructuralists is that, by their own logic, they really should not write anything at all, because if there is no ultimate signified, there is no foundation to any sort of knowledge. Epistemology is just shifting sands on which nothing is real, and everything is subjective, as with postmodernism. Oddly, the postmodernism and poststructuralism that have buried themselves deep in the skull of the social sciences and humanities seem to have bypassed the physical sciences. This may be because the physical sciences already faced this crisis back in the 16th and 17th centuries, when natural philosophy realised the shortcomings of natural languages for many of the same reasons. It may also be because poststructuralism’s problems do not apply in the same way to mathematics, because you can unpack equations into their foundational elements. This can even be done if the mathematical signifiers lack a signified.

The problem is not that everything is text, but that some things are not. Extreme postmodernism may be right in one sense: it seems that you cannot discover real things about the complexities of human social and cultural interaction, in the past or the present, using natural language alone. There will be an impasse where many of the ontological, epistemological, sociocultural, and political – not to mention psychological, ecological, and economic – knots of human behaviour become wrapped up in a language of non-discovery.

RENAISSANCE 2.0

The social sciences and humanities need a Renaissance. Mathematics has, thus far, not been all that fruitful to the pursuit of history. Richard Carrier has attempted to use Bayes' theorem to study history, but his laudable ambition falls short when you realise that the figures to be plugged into his calculations are little more than educated guesswork, prone as that is to subjective influences. However, there is a new tool at the disposal of the humanities and social sciences that might bring hope: computers.
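For the curious, Bayes' theorem itself is trivial to compute; the trouble Carrier runs into is where the numbers come from. A minimal sketch follows, in which all the probabilities are invented placeholders, which is precisely the objection to applying it to history:

```python
def bayes(prior_h, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) from a prior P(H) and two likelihoods."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Invented figures: a 30% prior that some event happened, and evidence
# twice as likely if it did. The arithmetic is exact; the inputs are guesswork.
posterior = bayes(0.3, 0.8, 0.4)
print(round(posterior, 3))  # 0.462
```

The calculation is mechanical; the subjectivity lives entirely in choosing 0.3, 0.8, and 0.4, which is why the outputs are only as good as the historian's estimates.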


From a slideshow on the Digital Humanities.
The Digital Humanities (and Social Sciences) open up opportunities to examine patterns free from the constraints of natural languages. They allow huge amounts of texts and other data to be collated, dissected, and compared. They allow for the possibility of context-free patterns to be merged within a series of contexts, such as that of the work, the author, their background, the readership, and the person to whom the work is responding. They allow us to move past the unending raft of social theories that seem no better to me than the often discounted observations of early modern philosophers, and to examine the data inductively for hypothesis-forming patterns, then test those hypotheses, possibly through Bayes' theorem and/or by making predictions about future data.
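As a very small illustration of the sort of collation I mean, here is a sketch that counts competing terms across a dated corpus – say, 'disgust' against 'distaste'. The snippets and dates are invented stand-ins for a real digitised archive:

```python
import re
from collections import Counter

# Invented placeholder snippets standing in for a large digitised corpus.
corpus = {
    1700: "a distaste and abomination arose ... distaste of the meat",
    1750: "the disgust he felt ... disgust and distaste alike",
    1800: "disgust, disgust everywhere, and a lone distaste",
}

def term_counts(text, terms):
    """Count occurrences of each term in a lowercased, tokenised text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(words)
    return {t: counts[t] for t in terms}

for year, text in sorted(corpus.items()):
    print(year, term_counts(text, ["disgust", "distaste"]))
```

Scaled up to millions of pages, this is the kind of inductive pattern-hunting that no close reader could do by hand, and the resulting frequencies are exactly the sort of data a hypothesis could then be tested against.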

Now, I know this will freak a lot of social sciences and humanities people out, but it need not. I think that the tools now at our disposal take us past the examination of at most a few hundred sources and potentially into the examination of millions. Combined with the very best of current methodology, including a close look at what turn out to be the most representative, and least representative, texts, this can only be a good thing. It means we have to be ready to adapt. It means we have to ask questions and not assume answers, and it means, in the case of history, that those answers are more likely to come from the past than the present.


I actually think it is not only good, but also very probably the future.

Wednesday, 19 February 2014

A Wellcome Guest.

For my latest blog, I teamed up with the fine folk at the Wellcome Collection to talk about their latest Twitter #CuriousConversation on irrational fears.

It's a tiny bit disgusting. Read it here.

Thursday, 13 February 2014

Awkward: A call to arms for a found emotion.

Recently, the word awkward has started dominating our media worlds. Adverts, programmes, news articles, and novels have all given space to this word. There's a movie about it, songs about it, Facebook, Tumblr, and Reddit pages dedicated to it, and much more. But what is it?

I asked this question on my Facebook page a while ago and a friend of mine and fellow ex-Cambridge inmate - Dawn Jackson Williams (her blog can be found here)  - suggested that this might be a new emotion, perhaps even a 'found' emotion. So I thought I'd give it a closer look.

Could Awkward be an emotion?

Let’s start by asking if it actually is an emotion, by in turn asking ‘what is an emotion anyway?’ I'll have a look at the research and find a universally agreed definition of emotion by which we can evaluate it.

Well, this is awkward. There doesn't seem to be one.

Richard Shweder's suggestion that it is 'an essentially contested concept' is pretty much on the money. It could be a neural impulse that leads to survival behaviour, or sexual behaviour. Neuroscience says it's just types of pleasant or unpleasant sensation centred around the limbic system; a neurochemical 'step-up' of the lizard brain in our mammalian brain. Paul Ekman has given us seven characteristics of emotions, while some say emotions are sociocultural constructions, some that they are linguistic constructions, and some that they are a feedback loop to alert people to their state in the world. The bottom line is that there is little researchers agree on other than that these things are feelings that we have. They then get more confused when concepts like affect and mood and sensation and passion get added to the mix. It's all a bit of a mess, really.

So how can we unknot this coital cluttering? Let’s try a few definitions and see if awkward fits.

Awkward as Ekman’s characteristics

1. Automatic appraisal: we realise we are feeling awkward without prompt, so that’s okay.
2. Commonalities in antecedent events: only certain things can cause the sensation of ’awkward’.
3. Presence in other Primates: Hard to say; the usual response is ‘needs testing’, so I’ll go with that.
4. Quick onset: pretty much instantaneous, I’d say.
5. Brief Duration: well, it doesn’t linger.
6. Unbidden occurrence: yep, it happens suddenly alright.
7. Distinctive physiology: now this one is interesting, I’ll come back to it later.


I’d say it passes the Ekman test. I’d also say it is a sociocultural construct, and it is extremely linguistic in nature. While any evolutionary benefit is difficult to find, I’m sure a canny evolutionary psychologist out there can think of one. I would, then, say it is an emotion.

What is Awkward, then?

Linda Davidoff says it’s all about physiology: heartbeats, facial expressions, crying, shaking and so on. Like I mentioned above, awkward has its physiological element. It seems to involve sweating and feeling uncomfortable, with that bit of a squirm. When used in humour we see the pupils of the person who triggers the emotion dilate, the palms become sweaty, all those involved desperately trying to avoid each other’s gaze, and a sudden aversion to physical contact. It also has elements similar to fear; a flight-or-freeze mechanism (sometimes fight?) kicks in for all who experience it, and it HAS to be a group experience, even if the cause is just one person. It is an inherently social emotion.

This is a social emotion, as it is never felt by just one person, even though it is strongest in the person who causes it. If it happens when you are alone, it has to be shared or it remains simple embarrassment. It may be brought about, as Dawn suggested, by a social networking generation, unsure of the boundaries of interpersonal contact; triggered when someone is thought to overstep the line. Sort of a mass version of short-term Asperger’s Syndrome, in which those only used to online interaction misread or simply misunderstand the boundaries. As soon as those boundaries have been crossed, the emotion is felt, first by those around the person who crosses the barrier and soon after by that person, with a little more intensity. The negative aspects seem to die down quite quickly, as the amusement takes over; this is one reason it is not embarrassment. Firstly, you can be embarrassed about something no one else notices. This isn't the case with awkward. And secondly, to be embarrassed is rarely to be part of a funny moment. Sometimes hours, months, years, or even decades can pass, and remembering the embarrassing thing you did will still cause you to take a deep breath and bite down on your knuckle (come on, we’ve all got one of those memories!). Awkward, however, seems to become funny almost straight away and remain funny thereafter. Nine times out of ten, anyway.

It appears, therefore, to be a mix of group embarrassment and humour, but it is also tinged with feelings of both belongingness and separation. The belongingness comes from the communal nature of that ‘awkward moment’, shared by those in it, and the loneliness is found in the individual who feels the awkwardness. As it becomes funny, it draws the individual and the group together, ultimately acting as a source of in-group bonding; some may even suggest a rite of passage.

And that’s all I have so far, and that is somewhat phenomenological.


Towards an Emotion of Belonging
 
So what, you may ask? Well, Ute Frevert, in Emotions, Lost and Found (Central European University Press, 2011), suggested the idea of lost and found emotions: emotions that are lost from cultural view either in intensity or altogether – such as ‘honour’ – and some that change into another emotion with some shared characteristics but ultimately quite different specifics – such as the movement from ‘acedia’ to ‘melancholia’ to ‘depression’, or from ‘shell shock’ to ‘PTSD’. Awkward is a ‘found’ emotion, a new emotion, brought about by changes in the way people live and, as Monty Python said in the Life of Brian: ‘Reg, it’s perfectly simple, all you’ve got to do is go out of that door now ... It's happening, Reg! Something's actually happening, Reg! Can't you understand?!’

This is a call to arms to Anthropologists, Sociologists, Psychologists, Media Theorists, Modern Historians, those involved in cultural studies, and any other interested parties. Right now, there is the opportunity to study the birth of what may well be a brand new emotion. What this can tell us about emotions, across the disciplines, may be huge, or may be little, but it will certainly be interesting.

Let's do this, someone. Let's grasp the nettle, hook the fish from the barrel, and strap it across whatever strappy clichéd metaphor you can think of.

Or would that just be a little too awkward? 

Tuesday, 11 February 2014

Problematising Problematisation, or, Blunting Occam's Razor with a magnet.

[This was written on a train; as such, it's a ramble]

This may be controversial. That it may be so is a testament to the inability of the academé to properly treat the rash of Foucauldian postmodernism that infected parts of it in the 90s. By now we should have a logic-cream or something.

Or at least according to my current epistémè, anyway.

Nuance is not invention

I'm for nuance. I'm all for observing the complicated nature of things. But I'm also a big fan of Occam's razor: don't overcomplicate things unnecessarily. Whenever I see that awful word 'problematise' my teeth hurt and my spidey-senses tingle. So I thought I'd give it a taste of its own medicine.

Let's begin with a question: what does 'to problematise' mean? Well, what it should mean, in my opinion, is to leave no stone unturned and make sure we have not missed anything that might have a bearing on the outcome of our investigation. Sadly, what it tends to mean is a post-structural, postmodern notion of 'let's see if we can stretch definitions to breaking point and then, once we have injected enough poison to murder the subject, let's claim the subject was sick all along.'


Problematic scientific problems

Let me think of an example. How about certain theorists' rather odd opinions of science? 

This is a paraphrased amalgamation of a few things I have read 'problematising' science. Most of them said all this. Honestly, they did [if you want sources, let me know, but I'm on the train right now!].

'Science', they say, 'is a white, middle-class, bourgeois, European construction that, through its use of transcendental deductive constructs, bypasses materiality to make disembodied knowledge claims of absolute truth that are inherently masculine in their cold, emotionless logic, tied only to reason.' They do this, of course, using a computer, publish on the internet, and fail to see the irony, or even defend the irony, of that.

I wish I was making it up.

Time to problematise the problematisation.

White, middle-class, male: in the west, sadly, but this is not inherent in the process; it is part of a historical problem that fantastic women scientists, like Athene Donald, as well as black and working-class scientists, are trying to address. Science isn't the way it is because the protagonists are mostly white middle-class men; rather, science is dominated by white middle-class men for the same reasons much of the rest of the western world is, and this must end.

Does this mean scientists like Athene Donald are masculinised by the process of science? Well, I suspect she would have something to say about that. Where does this idea that to be cold, logical, and rational is to be masculine, while to be otherwise is feminine, come from anyway? Well, from the same patriarchy that the people who describe science as masculine seek, rightly, to destroy. Shall we not pick up and use gender stereotypes for our own ends when it suits us, people?

Is science cold and logical? Mostly. Why? Because it endeavours towards the goal of objectivity. To add supposedly feminine 'warmth' (I'm surprised they don't include 'dry' and 'wet' in their descriptions of masculinity and femininity. After all, what's the 21st century without some humoural medical theory?) would reduce the chances of objectivity still further. It is not cold and logical because it is masculine, but because it has to be so.

Transcendental deductive constructs that bypass materiality (again, this is a paraphrase of real arguments): well, the constructs, such as maths, are no more immaterial than any other language. Their job is to describe reality, or to invent possible realities. The difference between mathematics and most other languages is that it is bound by very strict rules regarding what is permissible and what is not. ‘But what about Frege and logical calculus?' you ask. Well, I would argue that this view of standard language is wrong. Language is not just a set of rules, it is also an action, something you do. Its logic can be very fuzzy at times. The language used in science is more tightly bound to rules and is less susceptible to such fuzziness. Nevertheless it is predicated on descriptions of the material world, and even the most ingenious and elegant theoretical physics remains hypothetical until we find a way to observe it; the Higgs boson is an example of this. Science is far from immaterial.

How about disembodied knowledge claims?

Well, it is true that science attempts to see the world from above. It tries to separate its subject from the self in the pursuit of objectivity. But isn't this a contradiction of the claim that it is masculine and white? You betcha! I don't see how it is possible to be disembodied AND masculine and white. Nevertheless, some try to make that claim.

In short, there is a problem with this problematisation.

How do I view science? Being outside the dominant Foucauldian paradigm within the philosophy of science and much of the social sciences, I see it not as a philosophy but as a tool. It's not perfect, a little bent at the corners, but it gets the job of observing and understanding nature done better than anything before. It's a combination of induction and deduction that, through falsification, becomes a self-correcting mechanism: Kuhn's paradigm shifts happen not because of cultural change, but because the level of falsification of an old idea becomes great enough to dismiss it. More than that, it corrects itself at the methodological level, and assisting that self-correction is the job of good philosophers of science – not generating mumbo jumbo about disembodiment, or strawmen in which all scientists are still positivists. And male. And European.

And that, would you believe, is a controversial opinion within the academy. It is also as complex as I think it needs to be. All the rest just blunts the pudding and over-eggs the razor. Nuance is important, but it must be based in reality, not in some rationally contrived situation in which people sit by the fire like Descartes with a ball of wax proclaiming 'Je pense de façon complexe, donc je suis!'

Historical Problems

History can be guilty of such terrible bouts of problematisation. I see the pursuit of history as occupying a unique place within academic endeavour. It is both a humanity and a social science, and it is neither. It, and its sister archaeology, can happily borrow from the physical sciences when needed – tree-ring data to find out about past climates, radiometric data to discover the ages of objects, x-raying paintings to discover their construction, and so on. It can borrow techniques and approaches from a variety of intellectual disciplines: psychology, sociology, anthropology, linguistics, even biology. In order for us truly to know ourselves we have to know our history, but in order to do that, we have to use every tool at our disposal. And yes, I can hear Sir Geoffrey Elton turning in his grave; don't worry, I'm sending Marc Bloch to sort him out; he didn't turn and run at the first sign of trouble; a proper hero. But once more, I digress.

History has so many tools at its disposal for discovering the past to a reasonable degree of accuracy, or at least one thread of the complex and nuanced beast that is history, that we don't need to add to it with wild speculation. Far too often I encounter historical papers that are big on 'look, I read Goffman, Derrida, and Judith Butler' and short on telling me what has actually been deduced from the sources. This doesn't make me an anti-theorist, far from it, but I am an anti-pet-theorist. I am one who thinks history needs a combination of induction – look at the sources and say what you think you see – and deduction – test whether what you think you see holds up with more source analysis. If what you see fits a theory then great, there may be something in that theory. If it doesn't, take a deep breath, maybe even stick your bottom lip out for a time, and then suck it up and think again.

Please, whatever you do, don't use the magnet of problematisation so that the data fits the theoretical dreams of your academic heroes. Look through the iron filings that fall by the wayside, for in those lies a real source of nuance.

Monday, 10 February 2014

Neuroplasticity through the looking glass (Part 2: and Emotions)

In the last post, I mentioned Guy Deutscher’s Through the Language Glass, which suggested that elements of language have a direct effect on the way we think about the world; not in the strict, untranslatable conceptual schemes of Sapir-Whorf-hypothesis-style linguistic relativism, but in what he called the Boas-Jakobson principle: worldviews are formed by the way our native language forces us to think by its construction, so if you do not have a gendered language, for example, it is easier to be ambiguous about genders when speaking or reading. This post will move on from this idea and look at the second book I read recently, Bruce E. Wexler’s Brain and Culture: Neurobiology, Ideology, and Social Change (MIT Press, 2008), then I’ll try to tie the two ideas together.

Brain and Culture turns on a simple premise: the human brain is plastic, able to mold itself not only around evolved innate patterns and DNA, but also through external input. This is particularly the case during childhood and adolescence, as it is in this period that our neuronal structures are particularly plastic, and so the external influences of our youths shape the ways we view the world for the rest of our lives. Each generation has a different developmental experience, as influences come not only from parenting but also from beyond – extended family, friends, and so on. Each generation then attempts to shape the environment based upon the cosmology it developed during its youth. As Wexler explains, it is the ‘ability to shape the environment that in turn shapes our brains that has allowed human adaptability and capability to develop at a much faster rate than is possible through alteration of the genetic code itself’ (pp. 3-4). In short, brains have evolved to be shaped by society and culture; each generation's changes from the previous one cause sociocultural changes, which then affect the brains of the next generation, and so on. It’s a biocultural approach to the brain and cultural differences; a material explanation for what cultural psychologists call ‘mutual constitution’: the feedback and influence on brain and behavior caused instantaneously and equally by psychology, society, and culture.

So how does it work?

Well, firstly, as children, our neuronal structures appear to be plastic – not in the sense that they are made from oil and you can wrap a sandwich in them, but in the sense that they can be molded and remolded. In the adult brain the plasticity continues but slows down; this is why in Latin class I still bang my head on the table over simple words like mox, while my younger cohorts take to it like a plastic duck to bath water.

The important point is that this molding is caused in no small part by the outside world when we are young, and two processes are particularly important: ‘imitation’, and ‘internalization and identification’. Imitation is just that: copying others, either by imitating actions that produce perceived end goals – imitation of ends – or by imitating actions that produce an evaluation of an object or person – imitation of values.

The best way I can think to explain this is with a common-or-garden adolescent. Let’s call her ‘Julie’ and her friend ‘Helen’ to protect the innocent.

Julie comes home one day and says, “Dad, I want to be a vegetarian.”

Dad looks at her with a mixture of confusion and respect, and replies, “Okay, why?”

“Because Helen became a vegetarian,” says Julie, “and she’s dropped two dress sizes [imitation of ends].”

“I see,” answers Dad, “is that really the best reason to become a vegetarian?”

“Well, I don’t want to be fat, cus Helen says that fat people are disgusting [imitation of values].”

“If Helen told you to jump off a cliff, would you do it?” asks Dad.

(and the conversation, in which Dad tells Julie her ambition is laudable but she really ought to think harder about it, continues…)

The other way the brain is shaped is through ‘internalization and identification’. This is when the child, through imitation, acquires feelings, attitudes, beliefs, worldviews, and so on from those around them. This is the structuring of a child’s cosmology and, through the action of what you might call cultural evolution, or memetics, or whatever, it evolves and alters slightly from one generation to the next. What is particularly striking to Wexler is how divergent schools of psychology have all hit upon similar ideas. I find this striking, too. Freud’s ‘assimilation of one ego to another one’, behaviorists’ observations of the greater susceptibility of children to conditioning, the accelerated development of schemata (patterns of thought and behavior) in cognitive psychology, the anthropology of learning through play, and even observations on the workings of memes from people such as Susan Blackmore have all led down similar paths. Indeed, the idea that the adult is created by the child is nothing new: recall the supposedly seventeenth-century Jesuit maxim, “Give me the child for seven years, and I will give you the man.” Wexler has simply added an observed neurological framework to it.


The second half of Wexler’s book discusses the troubles caused when these now less-plastic adult brains interact in the real world, not often for the best. This part of the book is very interesting, but it is not what concerns me here.

Wexler points out that ‘language itself is only realized through imitation’ [p. 117]. Areas of the brain are linked to understanding and speech, such as Broca’s area and Wernicke’s area, but if these areas are damaged in children, the power of neuroplasticity simply constructs language in another area. This is almost impossible in adults. Something similar happens in the young who become blind: the occipital lobe, now redundant, often appears to begin processing and enhancing the other senses. It is absolutely untrue that we only use a part of our brain’s capacity; we use every wonderful bit of it in stunningly inventive ways.

But I digress.

Bringing these two books together, we have a scenario in which language develops alongside our propositional attitudes (beliefs, desires, doubts – in short, the building blocks of a worldview) and is, in fact, the chief means by which propositional attitudes can be articulated from one person to another. If our language does not have an expression for blue, it is difficult for us to develop a belief in the concept of blue, just as we English speakers do not habitually treat light and dark blue as separate entities the way Russian speakers do, or think of apples as masculine and oranges as feminine the way German speakers and others can. Language is the tool by which our parents shaped our plastic neurons, and while we can understand other linguistic schemes, our own primary language will remain embedded in our sociocultural cores.



And now, some emotions.

This is a ‘what if’. What if emotions are similar? What if the neurochemicals within our bodies are innate, so that they are universal and certain sensations are felt by everybody, just as every healthy eye will see light with a wavelength of 462 nm and a frequency of around 649 THz as blue. These neurochemicals, just like the colours, are evolved responses, helping us to mate, remain in groups, avoid danger, and so on. These feelings, which I’ll call affect, are universal and felt by all. But we can only understand them according to the labels our cultures provide us with during development. Suppose one culture gives the neurochemical release that creates a need to run away its own discrete term, say, flight, and a separate term for the need to defend yourself caused by similar sensations, say, fight, while another culture ties both of these similar chemical releases to a single word covering both fleeing and preparedness for fighting, like the English fear. The emotions are then different because the understandings are different. The English would have the emotional equivalent of grue, while our hypothetical culture would have green and blue. This doesn’t mean that grue, green, or blue don’t exist, nor does it mean that any one scheme is universal. Each culture understands the underlying feelings in the only way it can: through the neural pathways created by developmental language.
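For the programmatically minded, the carving-up above can be put as a toy sketch (my own illustration, not Wexler’s or Deutscher’s; the names, thresholds, and categories are all invented for the example): the same underlying affect signal gets different labels depending on which lexicon does the labelling.

```python
from dataclasses import dataclass

@dataclass
class Affect:
    """A raw, hypothetically universal affect signal."""
    arousal: float   # 0.0 (calm) to 1.0 (maximal)
    tendency: str    # evolved action tendency: "flee" or "confront"

def label_english(a: Affect) -> str:
    # A 'grue'-style lexicon: one word covers both action tendencies.
    return "fear" if a.arousal > 0.5 else "unease"

def label_split(a: Affect) -> str:
    # A lexicon with discrete terms for each tendency.
    if a.arousal <= 0.5:
        return "unease"
    return "flight" if a.tendency == "flee" else "fight"

signal = Affect(arousal=0.9, tendency="confront")
print(label_english(signal))  # fear
print(label_split(signal))    # fight
```

The signal is identical in both cases; only the culturally supplied labelling function differs, which is the whole point of the analogy.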

This is a way past the nature/nurture and psychology/culture dichotomies facing emotion research, and it in fact suggests an anthropological approach to studying emotions in history: using comparative and ethnographic techniques to see whether, say, fifteenth-century love and modern love occupy the same place in the emotional spectrum and, if not, what that can tell us about people and emotions, then and now.

Well, I enjoyed that. I’ll try to remain regular, but for now I’ll just tease you with the title of my next post: Problematising Problematisation.