Electrotactile maps
Electrotactile Maps is the third in a series of audiovisual works.
The inspiration for this piece came from a podcast about a technology that allows profoundly blind people to “see” through sensory substitution: a pair of sunglasses carries a camera, and the camera’s signal drives an array of 144 electrodes placed on the tongue. Details can be found at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5523951/
In Electrotactile Maps, I tried to embody the process of adapting to “the tongue display unit (TDU) for electrotactile spatiotemporal pattern presentation,” as it is called in the paper above. The piece is constructed from different iterations of a pattern unfolding in time. Each iteration changes the parameters of a system in a different way, and/or the system itself is augmented.
In the first section, I used a recursive process fed with visual noise as input. The feedback of this visual noise creates a flowing field that drifts across the screen. Eventually, the feedback is resampled (quantized) into larger segments, those larger segments are resampled (quantized) in turn, and so on until the noise floods the screen. This all occurs in two-dimensional space, informing the pixels on the screen directly (i.e., video). As the piece moves forward, it applies the same processes to three-dimensional space, which is then rendered to the screen. This level of indirection creates an interesting mapping of the time-based pattern I mentioned earlier. In a narrative sense, the metaphor of sensory substitution is enacted in this process, as a two-dimensional grid begins to take on three-dimensional shapes.
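The progressive quantization feedback described above can be sketched roughly as follows. This is a hypothetical NumPy illustration of the general idea, not the actual system used in the piece; the block sizes, blending weight, and averaging scheme are all assumptions for the sake of the example.

```python
import numpy as np

def quantize(field, block):
    """Resample a 2-D field into larger segments by averaging each
    block x block region, then re-expanding to the original size."""
    h, w = field.shape
    small = field.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    # re-expand each averaged value back into a block x block segment
    return np.kron(small, np.ones((block, block)))

rng = np.random.default_rng(0)
field = rng.random((64, 64))  # visual noise input

# feedback loop: each pass blends the field with a coarser quantized
# copy of itself, so progressively larger segments emerge over time
for block in (2, 4, 8, 16):
    field = 0.5 * field + 0.5 * quantize(field, block)
```

Each pass of the loop corresponds to one iteration of the pattern: the same field is fed back through an ever-coarser quantization until the large segments dominate the frame.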
I was fascinated by this idea of sensory substitution, as it seemed to share something with my goals in creating audiovisual works: I look for deep, synaesthetic links between the audio and visual elements. In this piece, that was accomplished intuitively. The pattern that shapes the timing of events in the visual system is also the driving force of the musical flow. The visuals were generally created first; to achieve good resolution, this was done in non-real time. I began by assembling the rendered videos in a video editor. For the sound, I primarily used a Tempest analog drum machine patched through a Sherman Filter Bank or an OTO Biscuit (a bit mangler, filter, distortion, and other sound transformations) hardware device. SuperCollider provided some of the sonic material, but most of it comes from the Tempest, which gave me good real-time control of the audio signal and let me work fluidly. The constraints of the device, and the fact that I wasn’t using it as a drum machine at all, helped focus my efforts.