The Black Box project is a four-way collaboration involving Pierre Alexandre Tremblay on bass/electronics, Patrick Saint-Denis on robotics/electronics, Sylvain Pohu on guitar/electronics, and myself on drums/electronics. It has gone through two residencies (in Montreal and Huddersfield) to work out the finer details of putting a large-scale show together.
It looks and sounds something like this:
This project represents several firsts for me. It’s my first project working with so many hybrid (something + electronics) performers, all of whom are programmers, each in a different environment (Max, SuperCollider, openFrameworks), and it’s the first project that involves extended residencies, which I generally avoid.
I knew most of the people involved, having worked with PA on many projects over the years, and having shared a stage with Sylvain many years ago, but did not know Patrick, whose contribution to the project (robotic accordions!) wasn’t insignificant. I was excited to work with Patrick as I had seen some of his work with Myo controllers and knew he worked extensively with robotics, plus he works as a visual artist, something I have also done.
By the time I got involved in the project there were already some discussions about what/how/when. Nothing had congealed yet, but some of the ideas we eventually went with were already there, like robotics, overall large-scale-ness, and projectors/cameras. We quickly discovered that since we are all spread across the world (Manchester, Huddersfield, Montreal), discussing things via email, offset by different time zones, wasn’t the best way to foster a creative discussion. To solve this, PA set up a Trello board for us to post ideas to. I had never used Trello before, but it proved to be a useful tool, at the start, for brainstorming creative ideas in a threaded/forum-like manner.
Here are some of the discussion points and threads from our Trello board:
And a glimpse into one of the individual topics:
These posts got some initial discussions going, but things really came together when we all met up in person in Montreal in June 2016.
During the initial residency we brainstormed and jammed together, a lot. We had pretty much two weeks of 10-5 days to work on stuff, so we explored all sorts of musical and technical ideas, some of which I’ll discuss in more detail below. This exploratory freedom was nice, especially since the other musicians were not only musically and technically talented, but all worked very differently from me (and each other). That kind of ‘soft’ friction proved to be very fruitful to the creative process, in that ideas were bounced around quite different brains the whole time.
In addition to all the musical ideas, we had access to multiple projectors, TVs, and lights (no RGB DMX lights yet). There were many possibilities to explore so we spent much of this initial residency brainstorming the kind of visuals we would like, and how they would interact with the music. One of the initial ideas we had in our Trello brainstorms was using webcams like microscopes on the different instruments. In the end we arrived at a more zoomed out version of that, but here is an early test using a webcam lying on the snare drum, being projected onto a TV:
This worked well: with all the cables and flashing lights, it created an abstract magnification of what’s going on with the feedback performance technique, which is often based around tiny movements. For reasons that I discuss further down, this approach was abandoned, but it is something I will look to revisit in the future.
One idea that was discussed at great length early on was the function of projection surfaces, specifically whether they should be used representationally or not; that is, projecting us, as performers, onto other surfaces. I was initially opposed to this use of the projectors as I felt that the deliberate mediation of us as performers (in a performance context) would create a barrier, distancing the performer rather than having the intended effect of bringing the audience closer in. (I feel like a music stand, or even a microphone, can have the same detrimental effect in some cases.) In the end we do use some representational projections, but with severe lighting and the images projected over top of us, so there isn’t a rectangle on the wall at all.
Once the musical and visual/reactive ideas started coming together, Sylvain proposed that we keep track of things on a gigantic/long piece of brown paper. For all my hi-tech tomfoolery, I often work on paper as well, though it’s a bit different with a huge sheet of paper covered in 4 different handwritings in 2 languages!
From this we were able to define and refine some of the ideas we were developing. These had some early (and unfortunate!) labels like:
- ambient (sines + feedback)
- scratchy dirty gestural whoosh bang
- dislocated groove + autotune
- concat resynthesis + resynthesis + resynthesis
- video on/off gesture (amplify)
- lights + 3 video sources
So at the end of the Montreal residency we had about a dozen ideas/scenes/interactions to think about. And since we were quite organized with audio/video recording what we did, we were able to further distill the ideas down by the time we got to our second residency in Huddersfield in September 2016.
The second residency was focused on further refining the ideas we had developed in Montreal, creating an overall structure for the piece, and laying the technical foundation for how we would interface with all the aspects of the show.
Most of the ideas listed above, along with several new ones, made their way onto sheets of paper which we laid across the floor. Over the course of the residency we reordered these sheets until we came up with a structure we were happy with (or at least, disagreed minimally on!).
Alongside the micro (material detail) and macro (formal/structural) work we were doing, we spent a long time defining the overall “instrument” we were playing, which was made up of multiple acoustic and electronic instruments, processes, and visuals.
Underpinning all of this was a dedicated server machine (a cheesegrater Mac Pro) and a network protocol that we used to send messages between every performer’s machine and the server. The purpose of setting up a dedicated server was to simplify the whole process: although communication was bidirectional for the purposes of audio analysis feedback, everything was fed to the server machine. (An important exception to this was the robotic accordion control, which was routed back through Patrick’s machine so he could have a ‘kill switch’ in case something went wrong.)
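The routing idea behind the server can be sketched in a few lines. The actual system runs across Max/SuperCollider/openFrameworks over the network, so this is only an illustrative Python rendering of the logic; all names here (addresses, handlers, the `route_accordion` function) are invented for illustration, not the real protocol.

```python
# Minimal sketch of the server-side message routing: each performer's
# machine sends address/value messages (OSC-style) to one central hub,
# which dispatches them to handlers. All addresses/handlers are invented.

class BlackBoxServer:
    """Routes address-pattern messages from client machines to handlers."""

    def __init__(self):
        self.handlers = {}   # address pattern -> handler function
        self.log = []        # record of everything received, for debugging

    def bind(self, address, handler):
        self.handlers[address] = handler

    def receive(self, address, *args):
        """Dispatch one incoming message; unknown addresses are logged, not fatal."""
        self.log.append((address, args))
        handler = self.handlers.get(address)
        return handler(*args) if handler is not None else None


server = BlackBoxServer()
accordion_enabled = True  # stand-in for Patrick's kill switch state

def route_accordion(note, velocity):
    # Accordion control is forwarded through Patrick's machine,
    # so it can be cut off there if something goes wrong.
    if accordion_enabled:
        return ("to_patrick_machine", note, velocity)
    return None  # killed

server.bind("/accordion/play", route_accordion)
server.bind("/dmx/headlights", lambda *levels: ("to_dmx_interface", levels))

print(server.receive("/accordion/play", 60, 100))   # -> ('to_patrick_machine', 60, 100)
print(server.receive("/dmx/headlights", 255, 0, 128))
```

The useful property of this shape is that adding a new piece of equipment is just one more `bind`, which is part of what made the rig feel stable enough to improvise on top of.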
This server/client model additionally helped to streamline the compositional and structural thinking when working with so much equipment. And there’s so much equipment. How much equipment? This much equipment:
And aside from each of our own (formidable!) setups, we have the following scattered around/above us:
- 4 robotic accordions
- 3 HD projectors
- 3 HD webcams
- 4 RGB LED DMX lamps at floor level
- 4 DMX spot lights above
- 24 DMX controlled LED car headlights
Making musical sense of all this equipment was one of the most exciting, interesting, and challenging aspects of the whole project. And having such a robust and stable system/framework underpinning everything allowed us to be more fluid with that creative thinking.
At the end of the Huddersfield residency we had around 40 minutes of music that flowed and worked relatively well. Things aren’t finished yet, but there is a shape emerging: an overall arc and container within which we still have a lot of local-level freedom to improvise.
One of my favorite aspects of this whole project has been the ability to sit down and experiment with ideas without having to worry whether they worked well, or fit with the project at all; we were basically able to just explore wild tangents in an in-depth manner. This section details some of the ideas I(/we) worked on during these initial residencies, some of which worked and some of which didn’t.
One of the first things we played with was the idea of creating a massive hybrid instrument using the accordions as extensions of our instruments, using resynthesis. This is something that both PA and I have worked with extensively in our own music, with PA’s work being an inspiration when I built C-C-Combine, a corpus-based audio mosaicking patch. For the residency in Montreal I also brought my Grassi Box and a ciat-lonbarde Fourses to experiment with hardware-based resynthesis.
Although we abandoned the hardware-based resynthesis early on, here is an early experiment in which both software and hardware resynthesis is taking place based on audio analysis of the drums. The Grassi Box is controlling the Fourses using a pre-analyzed set of connections on the synths (something I discuss in more detail on the Grassi Box page), and PA’s computer is running a concatenative synthesis patch which is using a corpus we made of accordion noises (key presses, squeaks, air, and a small amount of sustained notes). (click here to download the accordion corpus and analysis file that can be loaded into The Party Van or C-C-Combine)
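The matching step at the heart of this kind of concatenative/corpus-based approach can be sketched very simply: every segment in the corpus is described by a few analysis features, and each incoming frame is matched to the nearest segment. This is only a toy Python illustration of the general technique, not C-C-Combine's actual analysis; the feature set, segment names, and values are all invented.

```python
# Toy sketch of corpus-based matching for concatenative resynthesis:
# describe each corpus segment by analysis features, then find the
# segment nearest to the features of the incoming (drum) frame.
# Features, names, and numbers are invented for illustration.

import math

# Pre-analyzed "accordion corpus": segment -> (loudness_dB, centroid_Hz)
corpus = {
    "key_press_03": (-30.0, 1800.0),
    "squeak_11":    (-18.0, 4200.0),
    "air_noise_07": (-40.0, 900.0),
    "sustain_C4":   (-12.0, 600.0),
}

def nearest_segment(loudness, centroid):
    """Return the corpus segment whose features are closest to the input frame."""
    def dist(feat):
        # Scale the centroid axis so both features contribute comparably.
        return math.hypot(feat[0] - loudness, (feat[1] - centroid) / 100.0)
    return min(corpus, key=lambda name: dist(corpus[name]))

# A loud, bright drum hit pulls up a noisy, bright accordion segment.
print(nearest_segment(-17.0, 4000.0))   # -> squeak_11
```

In a real patch the corpus would hold thousands of short segments with many more descriptors, and the matched segment's audio would be played back (possibly transposed/stretched) in place of, or layered with, the input.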
As mentioned, hardware-based resynthesis didn’t make it into this project, but it’s something I’m excited about developing further by finally dipping my toes into the world of modular synths, using Expert-Sleepers modules (ES-3/ES-6 or ES-8) to control a few noisy Ieaskul F. Mobenthey modules along with assorted supporting modules (my modular grid wishlist). In addition to using direct audio analysis I plan on incorporating some drum triggers/sensors into the setup (KMI BopPad and Sunhouse Sensory Percussion), so this will be a whole new area to explore in the future!
Since Patrick works extensively with Myo controllers, I got to test his out in the context of my setup, and ended up buying two used ones while in Montreal! At the time I was looking for a way to bridge the binary start/stop world of The Party Van with the fluid/wiggly world of Cut Glove, and thought the Myos would allow me to have continuous control over sampling and processing without needing to let go of the sticks, which I find to be one of the trickiest things when designing a software instrument for use while playing drums. This is something I will develop further, but I am wary of introducing movement/sensor technology in a manner which is inorganic as a performer. For example, I wouldn’t want to make superficial use of gesture or, even worse, detract from my ability to play drums by thrashing my arms about!
Another early idea that got scrapped, or rather, severely adapted, was the use of the DMX-controlled headlights on the physical drum (along with a few webcams as seen in the Montreal residency). Although this looks amazing, and is something I plan on exploring further (possibly in an Everything. Everything at once. Once. context) it significantly hindered my ability to play feedback snare, as in my .com pieces. As you can see in the video, the surface of the drum is a mess! This not only makes it difficult to physically move around and play the drum, it also muffles the drum, which drastically limits the ability to create feedback using the drum.
Although I was aware that the drum/head had a big influence on the feedback I was able to get, primarily when I play with lower frequencies, I wasn’t so acutely aware that muffling the drum would limit me to only open-air microphone feedback tones. This still allows for expressive playing, and thinking about it now, my early experiments with this playing technique relied primarily on that kind of feedback. That being said, being able to use different resonant nodes of the snare gives me access to a whole other set of partials, in the same way that trumpet valves expand the range of the trumpet beyond that of a bugle.
This was especially evident in the Montreal residency when I discovered that by using crotales, which are surprisingly heavy for their size, I was able to play with resonant nodes of the drum by dampening the head in certain ways. Many years ago I had seen something like this happening with a gigantic concert bass drum using an interactive amplification system called the Feed-Drum. I didn’t think that would be possible with my setup as I had a much smaller drum and a passive amplification system. In the following video you can see some of the interactions I was able to achieve with the crotale(s) (the slides/bends are particularly crazy!).
Something that I talked about in the blog post about the studio recording of iminlovewithanothergirl.com is how this performance technique continues to surprise me. The premise is so simple (friction and feedback using a movable mic), but the range of sounds is extraordinary, and seemingly endless. The introduction of crotales, and their ability to tune the resonant nodes of the drum, is another example of this.
I remember during the recording session for ialreadyforgotyourpussy.com with Richard Craig (for his fantastic album Amp/Al), I commented to Richard how our techniques and approaches to producing feedback were so fundamentally different. That through the opening and closing of keys, changing his and the flute’s physical position, and changing the opening of his mouth, he was able to shove the feedback around into different harmonic series, something that I, at the time, was unable to do (you can see Richard performing in this manner in this live performance). I was limited to standing waves of the harmonic series, and a few low partials/subharmonics of the head of the drum. Interestingly, this ‘standing wave’ approach is incredibly susceptible to interference from other instruments, the exploration of which is my favorite aspect of the recording for ialreadyforgotyourpussy.com, but incorporating crotales in this way allows me to play with a much broader range of frequencies/sounds.
Since the crotales worked so well in the Montreal residency, I decided to play with more metal implements in Huddersfield. Enter the stainless steel ruler. I’ve had a ruler in my stick bag for years, and have developed a lot of techniques around bowing/sliding it around the drum, but I never thought to use it with a microphone until now. I found that using the ruler with the microphone opened up whole other worlds of sonic possibility, primarily since it gave me access to another physical surface/plane to experiment on. I could have the microphone on (or near) the head, and now on (or near) the ruler, using the ruler’s very different resonating properties to filter sound through. There’s also an additional violence to the sound of metal-on-metal, using the rim of the drum, or metal-on-head, which the microphone really exaggerates and brings out.
During the residency in Huddersfield I often found myself switching between my feedback microphone (a Naiant X-X omni) and my general amplification microphone (a DPA 4060). Normally I perform with either one or the other, either feedback or live sampling, so this isn’t an issue, but with this project I’m doing both. On its own that’s not really a problem; it’s more that each approach requires a different signal path, and with the way The Party Van is set up, that’s not really possible. So I tried to imagine a way to make changing between the two mics, or any other quick setting changes required for each section, easier from the drums. This is something I was hoping the Myos would bring to the table, but that would be overkill for something this simple.
I’ve been looking for really compact MIDI controllers for a while, for use in various projects, and hadn’t come across anything especially useful. While thinking about the problem of fast/quick control from the drums I remembered seeing DJ Sniff perform using Novation Dicers some years ago and thought that the curved shape could work well right up against a drum since they are designed to fit right on Technics 1200 series turntables. After I tracked some down (no local shops carried them to simply try out), I found they fit perfectly!
I’ve not yet incorporated the Dicers into my setup as I want to create a 3D printed mount to hold the bottom of the Dicer in place against the drum. I’ve wanted to get into 3D printing for a while, but haven’t really had a need for anything yet, so this project gives me a reason to look into it. And thankfully, Novation has a programmer’s reference document which lets you control the colors and brightnesses of the Dicers’ LEDs, as well as enable all button presses to be sent, since by default some of the buttons change internal modes rather than sending MIDI data. (Click here to download a simple patch I made for parsing button presses and controlling LEDs on the Dicers.)
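The core of that patch is just parsing incoming MIDI and sending note messages back for the LEDs. My patch is in Max; here is a rough Python rendering of the same logic. The actual note numbers, channels, and LED colour/velocity encoding are specified in Novation’s programmer’s reference, and the specific values below are placeholders, not the real spec.

```python
# Sketch of Dicer-style MIDI handling: classify raw 3-byte messages as
# button presses/releases, and build note-on messages to set pad LEDs
# (the Dicer encodes LED colour/brightness in the velocity byte).
# Specific notes/channels here are placeholders, not Novation's spec.

NOTE_ON, NOTE_OFF = 0x90, 0x80

def parse_midi(byte1, byte2, byte3):
    """Turn a raw 3-byte MIDI message into (event, channel, note, velocity)."""
    status, channel = byte1 & 0xF0, byte1 & 0x0F
    if status == NOTE_ON and byte3 > 0:
        return ("press", channel, byte2, byte3)
    # Note-on with velocity 0 is conventionally a note-off.
    if status == NOTE_OFF or (status == NOTE_ON and byte3 == 0):
        return ("release", channel, byte2, byte3)
    return ("other", channel, byte2, byte3)

def led_message(channel, note, color_velocity):
    """Build the note-on that sets a pad's LED (velocity = colour/brightness)."""
    return (NOTE_ON | channel, note, color_velocity)

# Button press on channel 10, note 36, then light that pad's LED.
print(parse_midi(0x9A, 36, 127))   # -> ('press', 10, 36, 127)
print(led_message(10, 36, 64))     # -> (154, 36, 64)
```

From there it’s a small step to mapping a press to a setting change (e.g. toggling which microphone feeds the signal path) and echoing the new state back to the pad’s LED, which is exactly the kind of decoupled LED feedback I’m after.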
I think the Dicers will make a very welcome addition to my drums + electronics playing in general, in that having such close proximity to controls, with decoupled LED feedback (something I’m now addicted to thanks to my trusty monome controllers!) will let me engage with the electronics without having to stop playing or move away from the drum itself.
So that is the Black Box project! We have another residency, along with our first performance, coming in January 2017 in Montreal, which will be followed by an additional residency in Mexico (which I’m really excited about!) later in the year. I look forward to seeing how this develops further as it is an exciting project to be involved with.