Not sure if the link will work, but I just received a link to this article via a multisensory research group mailing list. As I understand it, these researchers are exploring the way that multisensory information is used in legal systems, making the interesting point that the prevailing legal paradigm is verbocentric and recognises visual, but not other sensory, phenomena. Only visual sensory stimuli tend to be researched (e.g. visual aspects of eyewitness testimony), but visual, auditory, tactile and kinaesthetic information are all becoming more relevant as legal phenomena. The paper proposes that multisensory legal phenomena should be explored alongside a concept of multisensory law, and that this should not be limited to the context of ICT-based sensor information (biometric data, facial recognition, digital enhancements, etc.).
Not sure what to think of it, but it is interesting.