Do you see what I see?

Last week I watched an episode of Horizon that I’d recorded, “Do You See What I See?”. Beyond my general interest in factual television, I wondered whether it could shed light on the way I think I see things differently, and on the problems I have had in translating a digital file from a camera into the way I saw the scene “in real life”.

The programme trundled along for its first half before it really started to get interesting.  First there was an optical illusion that I followed but from which I failed to get the expected result.  Then there was the section that shook me.

There appears to be a direct link between language and colour perception that is hard-coded into the human brain.  Researchers at the University of Surrey had run experiments comparing the perception of colour categories in children of varying ages.  Before the children learnt language, colour processing was done by the right hemisphere; it shifted to the left hemisphere as words were learnt – the left hemisphere being the dominant one for language.  This hard-wiring was illustrated by looking at the Himba people of Northern Namibia, who have a much more restricted vocabulary for colour categories – half of our (western) system of eleven.  They could easily distinguish between two different but very similar shades of green because those shades had different names, but found it hard to separate green from blue, which share a name (something we would find easy) – see attached image.

My head was buzzing when I went to bed, but I picked up the book I’m currently reading (Look Me in the Eye by John Elder Robison) and started the last few pages, where I came across this…

“The answer came to me last winter, during a visit with some brilliant researchers from Beth Israel Deaconess Medical Center, a part of Harvard Medical School. They’d been studying autism and the workings of the brain, and they gave me some startling insights.

It turns out that sentences are not formed in a single area of the brain. It’s far more complex than that. We form the concept of a sentence in one spot. Then we choose the verbs in another area and nouns in yet a third spot. The sentence is built in pieces throughout the brain, and then assembled into finished form.

For some reason, Aspergians like me experience “delays” in the transmission of those sentence fragments within the brain. That gives a slightly ragged cadence to our speech that’s quite distinct from that of normal speech. Once you begin listening for it, it’s quite recognizable.”

In the space of an hour or so I had gone from being somebody troubled by the seemingly simple problem of not knowing how to process a digital camera file in Photoshop to somebody knowing that the problem lay deeper – and was potentially one common to many people with Autism.  I suddenly saw myself the previous afternoon in an Autism Hub meeting, attempting to explain my thoughts with eyes closed, imagining hands reaching out into my mind to find the words to fit the ideas (much as an old-fashioned printer would pick pieces of type to set the words ready for printing).

I knew that Autism may affect the early learning of language, but I wasn’t aware of the different way that speech is processed later in life – a difference that (probably) affects our view of the world.  While I also knew that the Autistic mind is predominantly visual, I had not considered that it might actually see things differently. And I certainly wasn’t aware of the way that language and vision are connected, or disconnected.