Research Data Leeds Repository

Color is the Keyboard

Citation

Schedel, Margaret (2016) Color is the Keyboard. University of Leeds. [Dataset] https://doi.org/10.5518/160/26

This item is part of the Alternative Histories of Electronic Music collection.

Dataset description

In this talk, I will present a series of diverse case studies covering the spectrum of techniques used to transcode visual data into auditory signals. Some of these algorithms are simple analog electro-mechanical devices, while others are complex programs that perform calculations, process data and make logical (or even illogical!) decisions. At its most basic, an algorithm is a set of instructions; some of these heuristic techniques are included for the historical context of translating image to sound.

Talking about transcoding necessitates the use of analogous language. The writer, historian and philosopher François-Marie Arouet, more famously known by his nom de plume Voltaire, fully accepted the similarity between tones and colours, writing “this secret analogy between light and sound leads one to suspect that all things in nature have their hidden rapports, which perhaps some day will be discovered.” While preparing his book on the Newtonian world view, Élémens de la philosophie de Neuton, Voltaire corresponded with the inventor of the Ocular Organ, Louis Bertrand Castel. Although it was never built, the Ocular Organ can be seen as a prototypical synesthetic algorithmic instrument, meant to generate simultaneous visual and sonic material by changing the mechanism of a harpsichord so that “the pressing of the keys would bring out the colours with their combinations and their chords; in one word, with all their harmony, which would correspond exactly to that of any kind of music.”

Transcoding is a sort of extreme analogy, in which we establish complete correspondence based on transformations between entities. Often authors speak of “mapping” features from one domain to another. In these case studies, when possible, I will indicate how features in the visual arena control aspects of the resultant sound. Unfortunately, for some of the more complex systems or older machines, there is not enough data to fully describe the algorithmic process of transcoding from the visual to the sonic.

While synesthesia is an extreme form of stimuli becoming interconnected, human thought is quite generally founded on the concept of connectivity and comparison. Language overflows with metaphors and analogies precisely because humans learn best by comparing new concepts with established ones, integrating new thoughts as reformulations of older ones. Algorithmic transcoding is thus a potent method for illuminating both inputs and outputs. As Nietzsche describes it: “Everything which distinguishes man from the animals depends upon this ability to volatilize perceptual metaphors in a schema, and thus to dissolve an image into a concept.” The musicians, artists and inventors in these case studies conceived of metaphors of expression, created algorithms to transcode data and thus dissolved images into sound.

This talk will cover 1) Light Bulbs, Then and Now: The Rhythmicon and Thermal Image; 2) Sound-on-Film: Fischinger, McLaren, Whitney Brothers, Spinello, and Sholpo; 3) Drawing Sound: Oramics, UPIC, and Metasynth; 4) Glitch: Similacra and Pixel Player; 5) Slit Scanning: Phonopaper, ANS; 6) Image as Control: Graphic Converter, Augur and Light Pattern; and 7) Live Video and Design: Hearing Red, Ocusonics, and Giant Theremin.
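The idea of “mapping” visual features onto sonic ones can be made concrete with a small sketch. The Python fragment below is a minimal, illustrative transcoder in the spirit of the drawn-sound and slit-scanning instruments named in the outline (ANS, Metasynth); it is not the method used in any of the case studies, and the file names, frequency range and frame length are assumptions chosen for demonstration. Each image column is treated as a spectral frame: vertical position maps to pitch, pixel brightness maps to partial amplitude.

# Minimal sketch of image-to-sound transcoding (illustrative only).
# Parameters and file names below are assumptions, not part of the dataset.
import numpy as np
from PIL import Image
import wave

SAMPLE_RATE = 44100
FRAME_SECONDS = 0.05           # sound generated per image column
F_LOW, F_HIGH = 110.0, 3520.0  # frequency range mapped onto image height

def transcode(image_path, wav_path):
    # Load the image as grayscale; rows index frequency, columns index time.
    img = np.asarray(Image.open(image_path).convert("L"), dtype=np.float64) / 255.0
    height, width = img.shape

    # Top of the image -> high frequencies, bottom -> low (log-spaced).
    freqs = np.geomspace(F_HIGH, F_LOW, height)

    n = int(SAMPLE_RATE * FRAME_SECONDS)
    t = np.arange(n) / SAMPLE_RATE
    frames = []
    for col in range(width):
        amps = img[:, col]            # brightness = partial amplitude
        active = amps > 0.05          # ignore near-black pixels
        if not active.any():
            frames.append(np.zeros(n))
            continue
        # Additive synthesis: one sine per bright pixel in this column.
        phases = 2 * np.pi * np.outer(freqs[active], t)
        frame = (amps[active, None] * np.sin(phases)).sum(axis=0)
        frames.append(frame / max(active.sum(), 1))

    signal = np.concatenate(frames)
    signal /= max(np.abs(signal).max(), 1e-9)   # normalise to [-1, 1]

    with wave.open(wav_path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        w.writeframes((signal * 32767).astype(np.int16).tobytes())

# Example usage (hypothetical file names):
# transcode("drawn_score.png", "drawn_score.wav")

The design choice here, scanning columns left to right so that the horizontal axis becomes time, is one of many possible mappings; the case studies in the talk differ precisely in which visual features are chosen and how they control the resulting sound.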

Subjects: W000 - Creative arts & design > W300 - Music
W000 - Creative arts & design > W300 - Music > W310 - Musicianship/performance studies > W316 - Electronic/electro-acoustic music performance
Divisions: Faculty of Arts, Humanities and Cultures > School of Music
Related resources:
https://eprints.whiterose.ac.uk/119074/ (Publication)
https://doi.org/10.1017/S135577181700005X (Publication)
https://hughdaviesproject.wordpress.com/ (Website)
Date deposited: 27 Jul 2017 19:28
URI: https://archive.researchdata.leeds.ac.uk/id/eprint/208
