(John Stuart Reid)
Dolphins and Cymatics
Animal communication has been studied for decades, although reliably identifying animal sounds has proven extraordinarily difficult. Nevertheless, some good work has been achieved in this area, and recent research in the field of dolphin communication, using the CymaScope, is beginning to show promise, as the following article describes.
Source – Aether Force
by John Stuart Reid, June 2nd, 2017
An Image of a Submerged Man via Cymatic-Holographic Imaging Technique
A dolphin’s echolocation beam was directed at a submerged man and the echo was captured by a hydrophone system. The echo signal was sent to CymaScope.com, which created the first-ever ‘what-the-dolphin-saw’ image of the submerged man using a cymatic-holographic imaging technique.
CymaScope.com, in collaboration with SpeakDolphin.com, has made a significant breakthrough in imaging a submerged man from the echolocation beam transmitted by a dolphin. The resulting image is faint, but after enhancement key features of the man and the background are revealed. The research took place at the Dolphin Discovery Centre in Puerto Aventuras, Mexico. The submerged man was Jim McDonough; the research dolphin was a female named Amaya.
John Stuart Reid, who captured the image in CymaScope video mode, said, “What is most exciting about the video is that it contains two consecutive frames in which Jim’s arm is seen in two different positions, implying that if we had a sound file containing a longer series of dolphin clicks we might be able to capture more frames. In a sense we would be sharing in the real-time ‘movie’ of what the dolphin saw, a very exciting prospect.”
Team leader Jack Kassewitz of SpeakDolphin.com was delighted with the result: “This is the first time we have captured a what-the-dolphin-saw image of a submerged man. We employed a similar technique in 2012 to capture a dolphin’s echolocation picture of a flowerpot and several other submerged plastic objects, but the present research has confirmed that result and so much more.”
Jim McDonough wore a weight belt and exhaled most of the air in his lungs to overcome his natural buoyancy, then arranged himself against a shelf in the research pool. It was decided not to use breathing apparatus, to ensure that no bubbles would adversely affect the results of the experiment; the whole event therefore had to be accomplished within a single breath.
With Jim in position, Amaya was tasked to echolocate upon him, to ‘see’ Jim with her sound-vision sense. Most of the resulting echo from his body was reflected back to Amaya, but one of water’s sonic properties is similar to that of air: sounds diffract in many directions simultaneously. One theoretical implication of this property of water is that when a dolphin sees an object with its sound-vision sense, all other dolphins in the near vicinity may also receive the image, an effect that could have profound benefits for a pod of dolphins.
In experimental set-ups it has been found that the hydrophone (a high-frequency microphone) can be positioned almost anywhere in the vicinity of the target object or dolphin. A further finding is that it is not necessary to collect the whole of the dolphin’s reflected echo beam: all parts of the echo contain quasi-holographic sonic data that represents the object and from which an image of the object can be created. Light provides a good analogy to this basic principle: when light reflects from an object it carries data that represents the object, and any small part of the reflected light contains an image of the object as viewed from a particular angle.
The dolphin’s echo signal was recorded with high-specification audio equipment by Alex Green and Toni Saul and sent via email to the CymaScope laboratory in the UK. Acoustic physics researcher John Stuart Reid heads the CymaScope team: “When I received the recording, Jack had told me only that it might contain an echolocation reflection from someone’s face. I noticed the file name ‘Jim’ so I assumed that the image, if it existed within the file, would be that of a man’s face. I was somewhat dubious whether this could be achieved because the imaging we had carried out in 2012 was of simple plastic objects that had no inherent detail, whereas a face is a highly detailed form. I listened to the file and heard an interesting structure of clicks. The basic principle of the CymaScope instrument is that it transcribes sonic periodicities to water wavelet periodicities; in other words, the sound sample is imprinted onto a water membrane. The ability of the CymaScope to capture what-the-dolphin-saw images relates to the quasi-holographic properties of sound and its relationship with water, which will be described in a forthcoming science paper on this subject. When I injected the click train into the CymaScope, while running the camera in video mode, I saw a fleeting shape on the water’s surface that did not resemble a face. I replayed the video frame by frame and saw something entirely unexpected: the faint outline of a man. At this point I sent the image to Jack, with a note that simply read, ‘this frame has what appears to be the fuzzy silhouette of almost a full man. No face.’”
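Working with a click train like the one Reid describes begins with locating the individual click-pulses in the hydrophone recording. The article does not describe the CymaScope's processing, so the following is only a minimal, hypothetical sketch of one common approach, gating on short-term signal energy; the function name, window size, threshold, and synthetic test signal are all invented for illustration.

```python
import math

def detect_clicks(samples, rate, window_ms=1.0, threshold=0.1):
    """Return start times (seconds) of click events: runs of
    short windows whose RMS energy exceeds `threshold`."""
    win = max(1, int(rate * window_ms / 1000))
    events, in_click = [], False
    for i in range(0, len(samples) - win + 1, win):
        chunk = samples[i:i + win]
        rms = (sum(x * x for x in chunk) / win) ** 0.5
        if rms > threshold and not in_click:
            events.append(i / rate)   # start of a new click event
            in_click = True
        elif rms <= threshold:
            in_click = False
    return events

# Synthetic demonstration: one second of silence with two short
# 8 kHz bursts standing in for echolocation clicks.
rate = 48000
sig = [0.0] * rate
for start in (0.2, 0.6):
    s = int(start * rate)
    for n in range(96):               # 2 ms burst
        sig[s + n] = 0.8 * math.sin(2 * math.pi * 8000 * n / rate)

print(detect_clicks(sig, rate))       # two event times, near 0.2 s and 0.6 s
```

Real dolphin clicks are broadband pulses lasting tens of microseconds, so a practical detector would band-pass the audio and adapt the threshold to the noise floor, but the energy-gating idea is the same.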
When Jack Kassewitz received the still image he was just as surprised as Reid, because he had been unaware that Amaya had been echolocating on Jim as she approached him from several feet away. Kassewitz commented, “I was astonished when I received the faint image of Jim as I had no idea that Amaya had been echolocating during her in-run. I called John in the UK and we discussed the image in detail. Later, he sent me a computer-enhanced version that revealed several details not easily seen in the raw image, such as the weight belt worn by Jim. Having demonstrated that the CymaScope can capture what-the-dolphin-saw images, our research implies that dolphins can at least see the full silhouette of an object with their echolocation sound sense, but the fact that we can just make out the weight belt worn by Jim in our what-the-dolphin-saw image suggests that dolphins can see surface features too. The dolphin has had around fifty million years to evolve its echolocation sense, whereas marine biologists have studied the physiology of cetaceans for only around five decades, and I have worked with John Stuart Reid for barely five years. Even so, our recent success has left us all speechless. We now think it is safe to speculate that dolphins may employ a “sono-pictorial” form of language, a language of pictures that they share with each other. If that proves to be true, an exciting future lies ahead for interspecies communication.”
A video of “what the dolphin saw” can be viewed on the CymaScope YouTube channel.
The faint image of Jim McDonough, as imaged on the CymaScope, can be likened to early experiments in photography, such as those of Louis Daguerre. Future improvements in the cymatic-holographic imaging technique will bring us ever closer to what-the-dolphin-sees.
Researchers in the United States and Great Britain have made a significant breakthrough in deciphering dolphin language in which a series of eight objects have been sonically identified by dolphins.
Team leader Jack Kassewitz of SpeakDolphin.com ‘spoke’ to dolphins in the dolphins’ own sound-picture words. Dolphins in two separate research centers understood the words, presenting convincing evidence that dolphins employ a universal “sono-pictorial” language of communication.
The team was able to teach the dolphins simple and complex sentences involving nouns and verbs, revealing that dolphins comprehend elements of human language, as well as having a complex visual language of their own. Kassewitz commented, “We are beginning to understand the visual aspects of their language, for example in the identification of eight dolphin visual sounds for nouns, recorded by hydrophone as the dolphins echolocated on a range of submerged plastic objects.”
The British member of the research team, John Stuart Reid, used a CymaScope instrument, a device that makes sound visible, to gain a better understanding of how dolphins see with sound. He imaged a series of the test objects as sono-pictorially created by one of the research dolphins.
In his bid to “speak dolphin,” Jack Kassewitz of SpeakDolphin.com, based in Miami, Florida, designed an experiment in which he recorded dolphin echolocation sounds as they reflected off a range of eight submerged objects, including a plastic cube, a toy duck and a flowerpot. He discovered that the reflected sounds actually contain sound pictures, and when these were replayed to the dolphin in the form of a game, the dolphin was able to identify the objects with 86% accuracy, providing evidence that dolphins understand echolocation sounds as pictures. Kassewitz then drove to a different facility and replayed the sound pictures to a dolphin that had not previously experienced them. The second dolphin identified the objects with a similarly high success rate, confirming that dolphins possess a sono-pictorial form of communication. Some researchers have suspected that dolphins employ a sono-visual sense to ‘photograph’ (in sound) a predator approaching their family pod, in order to beam the picture to other members of the pod, alerting them to danger. In this scenario it is assumed that the picture of the predator is perceived in the mind’s eye of the other dolphins.
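The 86% figure above is a straightforward identification accuracy: correct matches divided by total trials. As a minimal sketch only, with wholly invented trial data (the actual trial records are not given in this article):

```python
def identification_accuracy(trials):
    """trials: list of (object_presented, object_chosen) pairs."""
    correct = sum(1 for shown, chosen in trials if shown == chosen)
    return correct / len(trials)

# Hypothetical session: 7 trials over 3 of the 8 objects, 6 correct.
trials = [
    ("cube", "cube"), ("duck", "duck"), ("flowerpot", "flowerpot"),
    ("cube", "duck"), ("duck", "duck"), ("flowerpot", "flowerpot"),
    ("cube", "cube"),
]
print(f"{identification_accuracy(trials):.0%}")  # 6/7 rounds to 86%
```

A real study would also report chance level (12.5% for eight equally likely objects) so that the accuracy can be judged against random guessing.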
When Reid imaged the reflected echolocation sounds on the CymaScope it became possible for the first time to see the sono-pictorial images that the dolphin created. The resulting pictures resemble typical ultrasound images seen in hospitals. Reid explained: “When a dolphin scans an object with its high frequency sound beam, emitted in the form of short clicks, each click captures a still image, similar to a camera taking photographs. Each dolphin click is a pulse of pure sound that becomes modulated by the shape of the object. In other words, the pulse of reflected sound contains a semi-holographic representation of the object. A portion of the reflected sound is collected by the dolphin’s lower jaw, its mandible, where it travels through twin fat-filled ‘acoustic horns’ to the dolphin’s inner ears to create the sono-pictorial image.”
The precise mechanism by which the sonic image is ‘read’ by the cochleae is still unknown, but the team’s present hypothesis is that each click-pulse causes the image to momentarily manifest on the basilar and tectorial membranes, thin sheets of tissue situated in the heart of each cochlea. Microscopic cilia connect with the tectorial membrane and ‘read’ the shape of the imprint, creating a composite electrical signal representing the object’s shape. This electrical signal travels to the brain via the cochlear nerve and is interpreted as an image. (The example in the graphic shows a flowerpot.) The team postulates that dolphins are able to perceive stereoscopically with their sound imaging sense. Since the dolphin emits long trains of click-pulses, it is believed to have persistence of sono-pictorial perception, analogous to video playback in which a series of still frames is viewed as moving images.
Reid said, “The CymaScope imaging technique substitutes a circular water membrane for the dolphin’s tectorial, gel-like membrane and a camera for the dolphin’s brain. We image the sono-picture as it imprints on the surface tension of water, a technique we call ‘bio-cymatic imaging,’ capturing the picture before it expands to the boundary. We think that something similar happens in the dolphin’s cochleae where the sonic image, contained in the reflected click-pulse, travels as a surface acoustic wave along the basilar and tectorial membranes and imprints in an area that relates to the carrier frequency of the click-pulse. With our bio-cymatic imaging technique we believe we see a similar image to that which the dolphin sees when it scans an object with sound. In the flowerpot image the hand of the person holding it can even be seen. The images are rather fuzzy at present but we hope to enhance the technique in future.”
Dr Horace Dobbs is Director of International Dolphin Watch and a leading authority on dolphin-assisted therapy. He said, “I find the dolphin mechanism for sonic imaging proposed by Jack Kassewitz and John Stuart Reid plausible from a scientific standpoint. I have long maintained that dolphins have a sono-visual language, so I am naturally gratified that this latest research has produced a rational explanation and experimental data to verify my conjectures. As early as 1994, in a book I wrote for children, Dilo and the Call of the Deep, I referred to Dilo’s ‘Magic Sound’ as the method by which Dilo and his mother pass information between each other using sonic imaging, not just of external visual appearances, but also of internal structures and organs.”
As a result of Reid’s bio-cymatic imaging technique, Kassewitz, in collaboration with research intern Christopher Brown of the University of Central Florida, is beginning to develop a new model of dolphin language that they are calling Sono-Pictorial Exo-holographic Language (SPEL). Kassewitz explained, “The ‘exo-holographic’ part of the acronym derives from the fact that the dolphin pictorial language is actually propagated all around the dolphin whenever one or more dolphins in the pod send or receive sono-pictures. John Stuart Reid has found that any small part of the dolphin’s reflected echolocation beam contains all the data needed to recreate the image cymatically in the laboratory or, he postulates, in the dolphin’s brain. Our new model of dolphin language is one in which dolphins can not only send and receive pictures of objects around them but can create entirely new sono-pictures simply by imagining what they want to communicate. It is perhaps challenging for us as humans to step outside our symbolic thought processes to truly appreciate the dolphin’s world in which, we believe, pictorial rather than symbolic thoughts are king. Our personal biases, beliefs, ideologies, and memories penetrate and encompass all of our communication, including our description and understanding of something devoid of symbols, such as SPEL. Dolphins appear to have leap-frogged human symbolic language and instead have evolved a form of communication outside the human evolutionary path. In a sense we now have a ‘Rosetta Stone’ that will allow us to tap into their world in a way we could not have even conceived just a year ago. The old adage ‘a picture speaks a thousand words’ suddenly takes on a whole new meaning.”
David M. Cole, founder of The AquaThought Foundation, a research organization that studied human-dolphin interaction for more than a decade, said, “Kassewitz and Reid have contributed a novel model for dolphins’ sonic perception, which almost certainly evolved out of the creature’s need to perceive its underwater world when vision was inhibited. Several conventional linguistic approaches to understanding dolphin communication have dead-ended in the last 20 years, so it is refreshing to see this new and highly nuanced paradigm being explored.”
The human capacity for language involves the acquisition and use of a complex system of vocal sounds to which we attribute specific meanings. Language, the relationship between sounds and meanings, evolved differently for each tribe of humans and for each nation. It is generally believed that the human language faculty is fundamentally different from that of other species and of a much higher complexity.
The development of vocal language is believed to have coincided with an increase in brain volume. Many researchers have wondered why dolphins have brains comparable in size with those of humans, considering that Nature creates organs according to need. The Kassewitz team’s findings suggest the large dolphin brain is necessary for the acquisition and utilization of a sono-pictorial language that requires significant brain mass.
Dolphins enjoy constant auditory and visual stimulation throughout their lives, a fact that may contribute to their hemispheric brain coordination. The dolphin’s auditory neocortical fields extend far into the midbrain, influencing the motor areas in such a way as to allow the smooth regulation of sound-induced motor activity as well as the sophisticated phonation needed for production of signature whistles and sono-pictures. These advantages are powered not only by a brain that is comparable in size to that of a human but also by a brain stem transmission time that is considerably faster than that of the human brain.
Kassewitz said, “Our research has provided an answer to an age-old question highlighted by Dr Jill Tarter of the SETI Institute, ‘Are we alone?’ We can now unequivocally answer, ‘no.’ SETI’s search for non-human intelligence in outer space has been found right here on earth in the graceful form of dolphins.”
Full results of this research are available on request from Jack Kassewitz.
Jack Kassewitz: [email protected]
305-807-5812 – Miami, Florida