Interrogating Entangled Instruments

Musical instruments are typically played by one person. There are practical and ergonomic reasons for this in traditional instruments; however, the development of digital musical instruments has called this relationship into question. As well as creating opportunities to inter-connect performers’ systems, technology has opened a field of entangled instruments, in which the players are enmeshed enough to be considered as forming a single, mutually played instrument.

Although a developed discourse around post-human thought and theories of entanglement has emerged within HCI, there remains little exploration of its practical application to digital instrument design, and even less research into applying entanglement to multi-player instruments. This workshop explores the outcomes of practice research into creating such entangled instruments.

The workshop starts with playing and reflecting on a series of entangled instruments, for example the Stickatron and Elastiphone (see linked webpage and media). Comparing instruments, and participants’ responses to them, is arguably a key element in an instrument’s development; however, this can be problematic when the relative nature of experience and situated knowledges are taken into account. This workshop will, in part, explore how diffractive analysis (reading data through the lenses of two frameworks) can alleviate this impasse and lead to useful insights that support the co-construction of meaning. Highlighting that the entanglement observed in data for a specific instrument’s arrangement is part of a wider spectrum of possibilities, the workshop explores the interplay between the construction of specific entangled instruments and the experience of playing them, through three diffractions: connection, engagement and mapping.

We will overlay observations against individuality and collectiveness to interrogate connection, and use interactive (literally, between) and intra-active (literally, within) frameworks to explore engagement. Finally, we will explore how these two diffractions interplay with representational and interpretive mapping strategies. Here, interpretive mapping is taken to mean systems that leave the creation of meaning between user action and synthesis control ambiguous, unlike representational correlations such as a real or virtual keyboard, or a vertical movement increasing pitch.
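As an illustration of the representational end of this spectrum, the sketch below maps a normalised vertical position linearly onto pitch. The note range and the MIDI-to-frequency conversion are illustrative assumptions for the sketch, not a mapping prescribed by the workshop instruments.

```python
def vertical_to_pitch(height: float, low_midi: int = 48, high_midi: int = 84) -> float:
    """Representational mapping: raising the controller raises pitch.

    `height` is a normalised vertical position in [0, 1]; the three-octave
    MIDI note range is an arbitrary choice for illustration.
    """
    height = max(0.0, min(1.0, height))
    midi_note = low_midi + height * (high_midi - low_midi)
    # Convert MIDI note number to frequency in Hz (A4 = 440 Hz = note 69).
    return 440.0 * 2 ** ((midi_note - 69) / 12)
```

An interpretive mapping, by contrast, would deliberately avoid such a one-to-one, legible correlation between gesture and synthesis parameter.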

The workshop will take a praxis approach, based around playing and making synthesis systems for given interfaces of entanglement. A reasonable level of technical expertise in audio programming is therefore required. Participants will work in pairs, with each pair having their own GameTrak controller. A Raspberry Pi will take raw HID data from the controller and broadcast real-world data, based on specific entanglements, as Open Sound Control (OSC) messages via its ethernet port. Starting examples will be provided for Pure Data, Max/MSP, VCV Rack 2 (including Cardinal) and SuperCollider.

The Entangled Instruments at the heart of the workshop can be scaled to suit many levels of accessibility given advance notification. Participants are requested to bring headphones to minimise the noise chaos within the space (splitters will be provided); although this will help some participants manage overwhelm, it might not be appropriate for all. We will all become entangled within the workshop.
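For participants working outside the listed environments, a float-carrying OSC message of the kind the Pi broadcasts can be decoded with only the Python standard library. The sketch below follows the OSC 1.0 message layout (padded address string, padded type-tag string, big-endian float32 arguments); the actual address namespace and port will be given in the workshop's starting examples, so any specific values used with it are assumptions.

```python
import struct


def _read_padded_string(data: bytes, offset: int) -> tuple[str, int]:
    """Read a null-terminated OSC string, then skip to the next 4-byte boundary."""
    end = data.index(b"\x00", offset)
    text = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4  # OSC pads strings to a multiple of four bytes
    return text, offset


def decode_osc_message(packet: bytes) -> tuple[str, list[float]]:
    """Decode a single OSC message whose arguments are all float32."""
    address, offset = _read_padded_string(packet, 0)
    typetags, offset = _read_padded_string(packet, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag != "f":
            raise ValueError(f"unsupported OSC type tag: {tag}")
        (value,) = struct.unpack_from(">f", packet, offset)  # big-endian float32
        args.append(value)
        offset += 4
    return address, args
```

In use, each UDP datagram received from the Pi (via `socket.recvfrom` on the broadcast port) would be passed to `decode_osc_message`, yielding the address pattern and the entangled controller values.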