Bottom Line
There will only be 10 places available for this in-person workshop.
This workshop is about exploring digital musical instruments that are intended to be played by more than one person. Working in pairs, you will create synthesis systems for specific two-player interfaces that will be provided for you. These interfaces have emerged from a period of practice research into applying theories of post-human entanglement to multi-player musical instruments.
Each pair will have a GameTrak interface connected to a Raspberry Pi Zero. The Raspberry Pi will generate the entangled player data for you to use in your synthesis. You will receive this Open Sound Control (OSC) formatted data via an ethernet cable.
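For orientation, an OSC message on the wire is a null-padded address string, a type-tag string, and big-endian arguments. The sketch below is a minimal, stdlib-only Python decoder for float/int messages; the address `/gt/left` is an invented example, not the actual address scheme the Pi will use, and in practice the provided patches (or a library such as python-osc) handle this for you.

```python
import struct

def _read_padded_string(data: bytes, offset: int):
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    length = end - offset + 1
    offset += (length + 3) // 4 * 4  # round up to a multiple of 4
    return s, offset

def decode_osc_message(data: bytes):
    """Decode a single OSC message (float32/int32 arguments) from raw bytes."""
    address, offset = _read_padded_string(data, 0)
    typetags, offset = _read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":
            (value,) = struct.unpack_from(">f", data, offset)
            args.append(value)
            offset += 4
        elif tag == "i":
            (value,) = struct.unpack_from(">i", data, offset)
            args.append(value)
            offset += 4
    return address, args
```

Reading a UDP datagram from the ethernet link and passing its bytes to `decode_osc_message` yields the address plus the entangled control values as a Python list.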
You will need:
– a working knowledge of Max/MSP, PureData, SuperCollider or VCV Rack (you can choose which to use), or be paired with someone who does
– a laptop with your chosen software, a 3.5mm audio output and ethernet (either built in or via an adapter)
– headphones (I will bring splitter cables so you can share the audio output)
– ideas or pre-made noise-making systems to experiment with (typically with 6 or more control inputs ranging from 0 to 1.0)
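If you are sketching such a system in advance, it helps to keep its control surface explicit and defensive about range. The Python stub below is a hypothetical stand-in for a patch's six control inputs (the parameter names are invented), clamping incoming data to 0–1.0 so out-of-range values cannot break the patch:

```python
class NoiseSystemControls:
    """Hypothetical control surface: six named parameters,
    each expected in the 0.0-1.0 range (names are invented)."""

    NAMES = ("pitch", "timbre", "density", "feedback", "spread", "level")

    def __init__(self):
        self.values = {name: 0.0 for name in self.NAMES}

    def set(self, name: str, value: float) -> None:
        # Clamp so noisy or out-of-range input stays within 0.0-1.0.
        self.values[name] = max(0.0, min(1.0, value))
```

The same pattern (named parameters, clamped to a unit range) translates directly to receive objects in PureData/Max or control buses in SuperCollider.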
I will provide:
– a GameTrak and interface data controller (running on a Raspberry Pi Zero)
– a number of entangled interfaces on the Pi Zero to choose from, with an OSC decoding patch or SClang code for each
Resources
– GitHub for downloadable examples (under development)
NIME Webpage text.
Musical instruments are typically played by one person only. There are practical and ergonomic reasons for this in traditional instruments; however, the development of digital musical instruments has questioned this relationship. As well as creating opportunities to inter-connect performers’ systems, technology has opened a field of entangled instruments, where the players are enmeshed enough to be considered as forming a single, mutually played instrument.
Although a developed discourse around post-human thought and theories of entanglement within HCI has emerged, there remains little exploration of its practical application to digital instrument design. There is even less research into applying entanglement to multi-player instruments. This workshop is an exploration of the outcomes of practice research into creating such entangled instruments.
The workshop starts with playing and reflecting on a series of entangled instruments, for example the Stickatron and Elastiphone (see linked webpage and media). Comparing instruments and participants’ responses to them is arguably a key element in an instrument’s development; however, this can be problematic when the relative nature of experience and situated knowledges are taken into account. This workshop will, in part, explore how diffractive analysis (reading data through the lenses of two frameworks) can alleviate this impasse and lead to useful insights that support the co-construction of meaning. Highlighting that the entanglement of observed data for a specific instrument’s arrangement is part of a wider spectrum of possibilities, the workshop explores the interplay between the construction of specific entangled instruments and the experience of playing them, through three diffractions (identified as connection, engagement and mapping).
We will overlay observations against individuality and collectiveness to interrogate connection, and use interactive (literally between) and intra-active (literally within) frameworks to explore engagement. Finally, we will explore how these two diffractions interplay with representational and interpretive mapping strategies. Here, interpretive mapping is taken to mean systems that leave the creation of meaning between user action and synthesis control ambiguous, unlike representational correlations such as a real or virtual keyboard, or a vertical movement increasing pitch, for example.
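To make the mapping distinction concrete, here is a hedged Python sketch (all function names and scalings are invented for illustration): a representational mapping ties pitch directly to one player's vertical position, while a more interpretive, entangled mapping derives pitch from the relation between two players' positions, so neither player controls it alone.

```python
def representational_pitch(height: float) -> float:
    """Representational mapping: one player's vertical position (0-1)
    maps directly to pitch, here over three octaves above 110 Hz."""
    return 110.0 * 2 ** (height * 3)

def entangled_pitch(height_a: float, height_b: float) -> float:
    """An interpretive-leaning mapping: pitch emerges from the relation
    between two players' positions rather than either one's alone."""
    centre = (height_a + height_b) / 2.0   # shared centre of movement
    spread = abs(height_a - height_b)      # how far apart the players are
    return 110.0 * 2 ** (centre * 3) * (1.0 + spread)
```

In the entangled version the same gesture by one player produces different pitches depending on what the other player is doing, which is one (deliberately simple) way the creation of meaning between action and sound becomes ambiguous.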
Note that the workshop will take a praxis approach, based around playing and making synthesis systems for given interfaces of entanglement; a reasonable level of technical expertise in audio programming is therefore required. Participants will work in pairs, with each pair having their own GameTrak controller. A Raspberry Pi will take raw HID data and broadcast real-world data, based on specific entanglements, as Open Sound Control formatted messages via its ethernet port. Starting examples will be provided for PureData, Max/MSP, VCV Rack 2 (including Cardinal) and SuperCollider. The Entangled Instruments, at the heart of the workshop, can be scaled to suit many levels of accessibility, given advance notification. Participants are requested to bring headphones to minimise the noise chaos within the space (splitters will be provided). Although this will help control overwhelm for some, it might not be appropriate for all participants. We will all become entangled within the workshop.