This post highlights the concept of ‘Green Language’, wherein words can be decoded through phonetic interpretation. For example, the word ‘Incorporate’ can be broken down as In-Corp-Orate = In (inner) Corp (Corporal – of the real or in density) Orate (to speak); the Green Language meaning is therefore: to make real the inner world by speaking it into reality. There are many, many such words woven into the English language. While some may argue these readings lack a valid connection to common definitions, the relationships exist regardless of one’s willingness to acknowledge them.
For more on this hidden language see the post Syncretism With Santos Bonacci – 15 December 2013 – Consciousness Expanding (Video).
Scientists at the University of California, San Francisco (UCSF) believe they have identified an area of the brain that interprets spoken language.
Understanding how the brain processes speech is expected to improve artificial speech devices such as Apple’s Siri and to assist in treating dyslexia and other language and communication disorders.
The ears take in sound vibrations, which are converted into electrical impulses; the brain decodes these impulses and matches them to known words to understand what is being said.
This complex procedure happens effortlessly over and over without fail – barring a disease, accident or deformity that causes a person to become deaf.
This area of the brain makes sense of the sounds we hear, with cells that respond to dozens of the units of sound that make up the words we use.
Six volunteers who were admitted to the hospital for epilepsy surgery agreed to allow researchers from UCSF to implant a recording device on the surface of their brains.
The participants listened to recorded voices speaking phrases, giving the team a record of how the participants’ brains react to the sounds used in the English language.
This specialized device monitored activity from groups of brain cells to locate Brodmann area 22, an area previously identified as interpreting words for the brain.
Edward Chang, brain surgeon and co-author of the study, explained: “We were shocked to see this kind of selectivity. The results help explain how we can process speech so quickly and accurately, even in a noisy place, or when the speaker has an unfamiliar accent. It’s the starting point of thinking about how to build up some better understanding of how language occurs in the brain.”
These cells were observed responding to phonetic features rather than chunks of sound.
The study suggests that these phonetic features are the actual building blocks of speech and language.
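To make the idea of features-below-phonemes concrete, here is a toy sketch. The feature table and word spellings below are simplified assumptions for illustration, not data from the study; the point is only that two different speech sounds (like ‘b’ and ‘p’) can share most features and differ in just one, which is the level of description the cells appeared to respond to.

```python
# Toy model: each phoneme is described by a set of articulatory
# features. The table is a hypothetical simplification.
FEATURES = {
    "b": {"plosive", "voiced"},
    "p": {"plosive", "voiceless"},
    "a": {"vowel", "low"},
    "d": {"plosive", "voiced"},
    "t": {"plosive", "voiceless"},
}

def features_for(phonemes):
    """Return the feature set for each phoneme in a word."""
    return [FEATURES[p] for p in phonemes]

# "bad" and "pat" use different phonemes, yet their consonants
# differ only in the voiced/voiceless feature:
print(features_for(["b", "a", "d"]))
print(features_for(["p", "a", "t"]))
```

A cell tuned to the feature “plosive” would respond to b, p, d and t alike, whereas a cell tuned to whole sounds would not; that distinction is what the recordings let the researchers test.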
David Poeppel, psychology professor at New York University, commented on the study’s findings: “This reveals the mechanics behind one of the first steps in processing language in the brain.”
Poeppel said: “Imagine how many different things have to happen for you just to understand the sentence: ‘I need a cup of coffee.’ First of all you have to identify all the different sounds in the background that you don’t want. You have to break [the stream of sound] into units. You have to look up the words. You have to combine the words and generate the correct meaning. And each of those parts has its own subroutines.”