In my previous posts I discussed using tactile displays (the tongue display and spatial-orientation vest) in order to sense the world in new ways. In both cases my immediate reaction was to close my eyes, which may seem foolish: I was cutting out visual information that might have been useful. However, there was method in my madness.
Vision dominates the other senses for spatial tasks, and the information provided by the eyes can be misleading. At the Royal Air Force base in Henlow, for instance, pilots take courses on why you shouldn’t trust what you see through your night vision goggles (NVGs) too much. To bring the message home, the pilots are shown a film of an accident caused by one pilot misunderstanding what his NVGs were telling him: that the other plane was feet away, not hundreds of feet. This problem is anything but theoretical.
Tactile displays like the one being tested at the Operator Performance Laboratory are intended to help pilots get around problems with information from their other senses, particularly vision. However, there can be a problem if the senses conflict. Where the eyes are giving little information (just blue or black sky for a pilot, say) there is no problem. But what about when the two disagree? Even if you know that the tactile information is more likely to be correct, will visual dominance override your reason?
This is a particularly difficult problem because, with some kinds of information (particularly the tongue display attached to a camera), one could argue that the tactile information is visual. The fact that it’s coming in through the tongue is irrelevant. In this case, does the sense that the information comes in through matter more than its accuracy or relevance?
These are hard questions. Right now we don’t even know for sure where the brain decodes the new kinds of information from the tongue (or skin, for that matter), although studies in this area are planned. There is plenty of speculation, however. Some believe that it may depend on the type of information coming in through the display: information with visual characteristics (i.e. in the form of images rather than some other sort of signal) might somehow be routed through the visual cortex. That could be why the information ‘feels’ so visual.
Another possibility is that the visual cortex is not involved: the part of the brain that usually processes signals from the tongue adapts to do the job. In this case the experience may feel visual because the bit of the brain that deals with visual imagination is harnessed as a way of ‘displaying’ the information acquired through the tongue: it ends up being incorporated into our world view as if it came through the eyes.
Either of these speculations (there may be many more) implies that the information is visual, which brings us back to the original question, but in a slightly different way. Now we’re not worried about whether the visual will dominate the tactile, but how easy the brain will find it to pay attention to two completely different visual signals… More on this in the next post.
Originally posted on Books on Brains and Machines.