
Peter Schaefer – Sonic Neurality

As Hollywood screenwriters threatened to strike back in April of this year, director Oscar Sharp and creative technologist Ross Goodwin were releasing a short film for the Sci-Fi London Film Festival’s 48 Hour Film Challenge. In a surreal marriage of fact and fiction, and starring David Hasselhoff to boot, It’s No Game was written entirely by an Artificial Intelligence trained on film subtitle files. Goodwin has argued that in teaching AIs to write, “the computers don’t replace us any more than pianos replace pianists – in a certain way they become our pens, and we become more than writers. We become writers of writers” (Ross Goodwin, 2016).

In the same vein as Goodwin’s other experiments with human-machine perception and learning, Peter Schaefer’s MSc Interactive Digital Media project Sonic Neurality, though technically ambitious and conceptually rigorous, is also an artwork-generating artwork. The prototype of Sonic Neurality is a head-mounted device which creates an immersive narrated experience of the wearer’s surroundings, as interpreted by neural networks embedded within the headwear. Fed visual information as the wearer moves through their environment, the device produces narration that is largely descriptive, yet the machine’s efforts to communicate this information to the wearer are surprisingly poetic.
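Schaefer’s internals aren’t documented here, but a comparable see-then-narrate loop can be sketched with off-the-shelf components. The camera index, captioning model and speech engine below are illustrative assumptions, not Schaefer’s actual implementation: a frame is captured, described by a pretrained image-captioning network, and the description is spoken aloud to the wearer.

```python
# Hypothetical sketch of a "see, interpret, narrate" loop (not Schaefer's code):
# capture a frame, caption it with a pretrained network, speak the caption.
import time

import cv2                      # camera capture
import pyttsx3                  # offline text-to-speech
from PIL import Image
from transformers import pipeline

# Assumed off-the-shelf captioning model; any image-to-text model would do.
captioner = pipeline("image-to-text",
                     model="nlpconnect/vit-gpt2-image-captioning")
speaker = pyttsx3.init()
camera = cv2.VideoCapture(0)    # head-mounted camera assumed at device index 0

try:
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        # Convert OpenCV's BGR frame to an RGB PIL image for the captioner.
        image = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        caption = captioner(image)[0]["generated_text"]
        speaker.say(caption)    # narrate the machine's description of the scene
        speaker.runAndWait()
        time.sleep(2)           # pause between observations
finally:
    camera.release()
```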

Schaefer drew broadly from theories of postphenomenology, mediation, embodiment and augmented perception in order to research how our experience of reality is shaped by technology and how intelligent technologies might therefore be used to enhance our sense of presence in the world.

In Sharp & Goodwin’s It’s No Game, the final scene sees the emotional disintegration of the “Hoffbot” (the ‘real’ David Hasselhoff acting out lines generated by an algorithm trained on dialogue from the series Knight Rider and Baywatch). Beneath a glitzy smoking jacket, he’s still wearing those infamous red Baywatch lifeguard shorts, and sobbing “I wanna be a man”. The knowledge that these words were scripted by a machine makes this bizarre scene no less touching.

At an exciting juncture where recent leaps in AI research urge us to consider what really makes us human, Sonic Neurality foregrounds creative potential as a defining feature. Like Goodwin, Schaefer sees cause for optimism, rather than fear, in these new tools.

Schaefer describes how Sonic Neurality gives the user some “insight into what it is like to perceive the world from the perspective of a machine”, and how the complexity of that processing makes it impossible for wearers to understand exactly how it works. This, he says, “reinforces the impression that the machine has its own experience of the world”.

It is, then, the user’s human imagination that remains fundamental in the circuitry of human-machine perception.