Behind the Artwork: Ben Bloomberg Creates Live Performance Systems for Virtuoso Multi-instrumentalist Jacob Collier

CAST Visiting Artist Jacob Collier is well known for his YouTube videos, which the 22-year-old musical phenom performs and produces in his bedroom in North London. Collier’s work has garnered praise from such jazz icons as Herbie Hancock and Quincy Jones (who is now his manager).

When Ben Bloomberg, a PhD student in the Opera of the Future group at the Media Lab, first saw Collier’s videos, he was likewise impressed, and promptly wrote to Collier to offer his assistance. In 2014, Bloomberg began working with Collier to transform his video productions into live one-man-show performances. Bloomberg also designed and created a customized vocal harmonizer, an instrument that allows Collier to simultaneously combine his vocal and keyboard talents.

Fred Harris, Director of Wind and Jazz Ensembles and Lecturer in Music, who spearheaded Collier’s MIT residency, describes Bloomberg as “the unsung hero” of the highly anticipated concert, “Imagination Off the Charts.” “He has a personality like Jacob. He’s the sweetest, most humble guy. And he’s brilliant. He has been instrumental in making all of this possible, because any time we have a crazy idea to bounce back and forth, he’s got a direct line to Jacob. They’re friends, which makes everything smoother. So he’s the glue, or the bridge,” says Harris.

Bloomberg specializes in the design and implementation of advanced surround sound and audio systems for large venues, but he has created many kinds of live performance systems, ranging from musical instruments to architectural lighting. Since entering MIT in 2007, Bloomberg has worked with Professor Tod Machover at the Media Lab, touring internationally to design and support technology for Opera of the Future productions.

Having engineered hundreds of shows in venues all over the world, he has worked with many artists, engineers and designers. From Björk to Ariana Grande, people seek him out when there is a challenge that requires novel and highly customized solutions. Bloomberg works closely with artists to build systems which push the limits of performance and expression. He aims to create technology which brings the performers closer to the audience instead of diminishing them in the presence of large screens or sets.

Bloomberg spoke with CAST about his dual interests in music and technology, creating scalable systems and custom instruments for Collier, and what we can expect from them in Collier’s upcoming concert at MIT on December 10.

 

Ben Bloomberg, “Imagination Off The Charts—Jacob Collier at MIT,” Kresge Auditorium, 2016. Photo: L. Barry Hetherington.

 

Interview with Ben Bloomberg 

 

What was your first impression of Jacob’s videos? How did you begin working together?

I felt connected to his music. He grew up listening to all kinds of different music; coincidentally, I grew up listening to a lot of the same music. The stuff that he was doing musically was just so incredible.

In our lab, we work with a lot of Broadway artists, Hollywood producers, classical musicians and orchestras all over the world, and to see somebody at his level at his age—I just got really excited. It’s totally amazing that he has the curatorial chops to assemble these videos the way he does, and with the tools he has—a lot of the videos he made at the beginning were done with a single microphone, and not a great one.

He has a good ear and a good eye for putting musical and visual elements together in a profound way. I was totally blown away that somebody who was 16 or 17 made those videos. So I sent him a message, which I’ve done with a few different artists. I’ll just email and say, ‘Hey, I don’t know you at all, but I think you’re awesome. Let me know if I can help in any way. I’ve done some music tech stuff.’

And then basically, it was perfect timing, because he had just been asked to do this massive performance in front of 4,000 people at the Montreux Jazz Festival, and he didn’t know what he was going to do. He had all these crazy ideas involving technology. It was just luck that I sent him a message that same week. He responded, essentially like, ‘Oh, yeah, we need that.’

One of the things that I really get excited about is trying to figure out how to design these systems so that they stand up to the test of a live show. And there’s a lot of research going on around MIT where it has to work in a lab, or it has to work for a demo, but to do something for a touring performance is so completely different. I like that puzzle.

 

The first time you collaborated with Jacob was for his first live solo show at the Montreux Jazz Festival, where he was on a bill with Herbie Hancock, Chick Corea and Melody Gardot. Did you have to be there to run the tech? Could you talk about how the system you designed for that show has evolved?

Yes, I had to be there. It’s a constant process of refinement. At the beginning, we designed the system to be as robust and reliable as possible, but it used very expensive components. And that’s OK, because those big festivals have a lot of equipment that you just take for granted. They have a multi-million-dollar sound system that we don’t have to worry about. And it’s got a gazillion knobs and buttons, so we can spread all our control interfaces out over them and have it be easily manageable.

It was all so new and untested that I had to be there to make everything function and solve issues. We were up until 4am the night before the concert finishing the software. Over time now, we’ve significantly condensed and simplified the setup, and we have crew who know how to run the equipment—so I don’t go out anymore for every show.

We recently had a show in Central Park [SummerStage performance, with Jacob Collier opening for Kamasi Washington]; where previously there was a mixer with a gazillion faders, now we have it down to just twelve. So it’s much smaller and cheaper. The last tour we went down to eight faders and some iPads. And it was just the two of us—no crew.

 

Could you describe the custom vocal harmonizer you created for Jacob? Does it sync to the video as well as to his vocal tracks?

Yeah, exactly, everything is connected and talking to everything else. So for example, when he plays a note on the harmonizer, it takes whatever he’s singing and the keys he’s pressed down, and makes his voice come out on the notes of those keys. Then, for every key he pushes down, it sends a message to the video system and creates a cutout copy of his head on the screen. So he can generate a virtual video chorus with up to 12 heads singing the different notes.

That was actually one of the first things we designed together, because a lot of devices on the market only let you play four notes at a time. The first thing we did was build a harmonizer that could play twelve voices at the same time instead of four. Most people don’t know what to do with more than four notes at a time, and he wanted twelve, which is some pretty complex harmony. Additionally, we wanted a very specific sound, and that dictated a lot of the hardware and software in the device.
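
To make that signal flow concrete, here is a minimal, hypothetical sketch in Python. It is not the actual harmonizer code (the real instrument’s hardware and DSP are not described here); the names, event format and numbers are assumptions. It takes a detected sung pitch and the MIDI keys currently held down, computes a pitch-shift ratio for each of up to twelve voices, and produces one event per key of the kind a video system could use to add another cutout “head.”

```python
# A minimal, hypothetical sketch of the harmonizer's mapping logic, not the
# actual instrument: names, types and the event format are assumptions.

from dataclasses import dataclass

MAX_VOICES = 12  # design goal described above: twelve simultaneous voices


@dataclass
class VoiceEvent:
    midi_note: int      # key the performer is holding down
    shift_ratio: float  # pitch-shift factor applied to the sung note


def midi_to_hz(note: int) -> float:
    """Convert a MIDI note number to frequency in Hz (A4 = note 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)


def harmonize(sung_hz: float, held_keys: list[int]) -> list[VoiceEvent]:
    """Map the sung pitch onto the held keys, up to MAX_VOICES voices."""
    events = []
    for note in held_keys[:MAX_VOICES]:
        ratio = midi_to_hz(note) / sung_hz
        events.append(VoiceEvent(midi_note=note, shift_ratio=ratio))
    return events


if __name__ == "__main__":
    # The performer sings roughly A3 (220 Hz) while holding a four-note chord.
    for ev in harmonize(220.0, [57, 60, 64, 67]):
        # In the real system each event would drive one harmonizer voice and
        # tell the video system to add another cutout "head" for that note.
        print(f"key {ev.midi_note}: pitch shift x{ev.shift_ratio:.3f}")
```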

Then, designing the hardware so that he can take it on a plane and it can survive being thrown around in checked bags involved all these extra layers of design and refinement. And we’re still working on it. It’s been really tough. That thing takes a beating. I recently went to Denver [to meet up with Jacob’s tour] to tune it up and get it back in shape.

 

You’ve said you aim to use the technology to connect the performer to the audience—to enhance rather than detract from the live experience. How do you achieve that goal?

It’s a fine line. One thing that’s amazing about Jacob is that you see a lot of pop artists, or tours, using all this tech sort of as a crutch. Someone may say, ‘I can’t sing perfectly in tune, so I’ll autotune everything.’ Whereas with Jacob, all the technology that we’re making actually allows him to do more and be even crazier than he could be without it—he can’t play seven instruments at the same time because he doesn’t have enough arms. But we can try to give him more arms—we design these systems that actually allow his abilities to shine through.

I think keeping him in the center of everything, keeping people in the center of things, is huge. In a lot of big shows, the technology is actually upstaging the people, because there are timers and click tracks, and the humans have to stay synchronized to the tech. The musicians can’t perform naturally, which is sad, because that’s the most moving part—that’s where the emotional connection is. When you, as an audience member, feel an emotional connection, it comes from humans being human rather than following a click track.

 

So, are you making the technology more responsive in real time?

That’s the goal. And traditionally, there’s this huge trade-off between having a lot of flexibility to be yourself on stage and having a lot of really complicated things happen at the right time. And so the more complicated and synchronized you want the performance to be, the less you can be a real performer onstage, the more you’re going through the motions.

What we want to do, and what I really like to do here in the lab, is create systems which allow both of those things. So it’s a complicated set of synchronized, choreographed systems, and events, and experiences, but it’s actually tied back to the human performance. It’s not on a timer. It’s not where somebody pushes play. It’s using sensors. It’s using interfaces that are so nuanced with instruments to extend the performance and the expressivity. And that’s what Tod’s been doing in our lab for 30 years now, and what I am also very passionate about.

 

What can we expect to see in Jacob’s upcoming show at MIT on December 10? Will it be different from what he’s done with you previously?

Yeah, the goal is to try to do something new and different. We’re trying to figure out how crazy we can be, and still make everything fit together.

Tod has been amazing and really supportive. Fred’s been totally amazing too. If it weren’t for him, none of this would be happening. He’s been spearheading the whole project and putting in just a huge amount of work. And everybody in the community is so lucky that he’s willing to work so hard.

We’re thinking about ways that Jacob can play the ensemble sort of like an organ, where they’re improvising in real time, reacting to things that he’s sending them on mobile devices. The idea essentially is to allow him to play a large ensemble the way he would play the harmonizer, or the way he would use the looping system. But we have a million crazy ideas. There are unlimited things that we want to try to figure out how to do. As of now, he can send fragments of music and articulation to the phones of everyone in the ensemble. We have to play with it to see what that will enable musically.
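
As a rough illustration of what sending fragments to phones could look like, here is a hedged sketch that broadcasts a short motif as OSC messages using the python-osc library. The IP addresses, port, OSC address and message layout are assumptions for illustration, not details of the actual system.

```python
# A hedged sketch of broadcasting a musical fragment to ensemble members'
# phones as OSC messages (using the python-osc library). The IP addresses,
# port, OSC address and message layout are assumptions, not the real setup.

from pythonosc.udp_client import SimpleUDPClient

# Assumed: each player's phone runs an app listening for OSC on port 9000.
PLAYER_IPS = ["192.168.1.21", "192.168.1.22", "192.168.1.23"]
OSC_PORT = 9000


def send_fragment(notes: list[int], articulation: str, beats: float) -> None:
    """Send a short fragment (MIDI notes, articulation, length) to every phone."""
    for ip in PLAYER_IPS:
        client = SimpleUDPClient(ip, OSC_PORT)
        # One message per phone: the note list, an articulation marking,
        # and the fragment length in beats.
        client.send_message("/fragment", [*notes, articulation, beats])


if __name__ == "__main__":
    # A three-note motif, played staccato, lasting two beats.
    send_fragment([60, 64, 67], "staccato", 2.0)
```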

 

Is there ever a plan B? What do you do if there’s a technological failure or a power outage?

It’s all we think about, especially with shows. And again, this is even more the case in the big arena stuff like Ariana [Grande]. You have to think about the probability that everything’s going to fail, and if you add this redundancy, then what is the probability it’s going to fail? We’ve had cables go. We’ve had the jack on some instruments go. We’ve had the computer crash. We’ve had all kinds of things.

That’s really where it’s not the same as doing something in the lab—because there are 90 minutes when everything has to behave. All the computers have to do the correct thing for the same 90 minutes. That part of it is exciting to me. It’s all about thinking ahead. That’s a really fun puzzle.

 

You’ve been running sound since you were nine years old. Did your interest in technology and in music develop simultaneously?

Yeah, it was both music and technology from the very beginning. My grandmother was a music teacher in our elementary school. So that’s when I started getting into this stuff. At first, it would be things like dubbing tapes over. While getting into the tech side of it, I was also learning instruments, singing and playing.

My elementary school got a new sound system, and I was really interested. Nobody else bothered to learn how to use it, so I was the one setting up the sound for everything that happened at the school. That was fourth grade, so I couldn’t lift the heavy speakers myself and had to ask for help reaching the plugs on the back.

In high school, I really started experimenting—taking existing systems and reconfiguring them. That’s when computers had just become fast enough to be used for real-time control of audio, lighting and video equipment. Previously, there were only simple proprietary systems, black boxes that couldn’t be connected together or tweaked in any way. As soon as the computer was involved, you could really start to choreograph many complicated pieces all at the same time.

 

When did you start thinking about this line of research as something you’d pursue professionally?

From a pretty young age, I was always interested in setting these systems up, both for performances and for music. When I started looking at colleges, I was really interested in finding programs that had a strong computer science department, but also strong arts and theater programs. That arts component was specifically what I was looking for back then, and I also wanted to be somewhere where I could combine it with math, computers and science.

There weren’t many places in the country that did that. Before I really knew anything about MIT, I discovered the Media Lab. The way people were combining everything—tech, art, social experience, design—in such a disobedient way appealed to me very much. They were really putting things together that were never meant to be mashed up, in the coolest ways. Kind of like theater and computers.

In fall of my freshman year, I found this specific research group, Tod’s group, [The Opera of the Future], and started working here. I have been helping Tod put shows together since 2007.

 

What was it like working with the Opera of the Future Group as an undergraduate?

I majored in computer science and electrical engineering. MIT has a UROP program, which allows undergrads to help with research in graduate labs. I started a UROP in Tod’s lab [Opera of the Future] my freshman fall. I would take a lot of the concepts I was learning in my other classes and apply them to projects here in the lab. So I might build a surround sound decoder for a class and then bring it here to use for a show, or build some software and test it out on a stage somewhere.

The amazing thing about working in this group, especially with Tod, is that the performances are real. We’d go to England and do five sold out shows with 1,700 seats and a full orchestra. So this stuff has to work, and it’s tested in the real world, which is very challenging, but cool.

Are you involved with the jazz ensemble and other musical groups on campus as well?

Yeah, I have always loved playing music and learning music. I spent about 8 years in various music groups on campus. These days I perform less, but I think to build the technology in a useful way, it’s good to have an understanding of how people will feel when performing with the technology. You can’t get that sense without doing it.

So, when you’re not on tour, you’re working toward your PhD. What is your research?

I’m really lucky that this work with Jacob is research for me. In our group, we’re always looking for the next generation of experiences and exploring this idea that all kinds of mediums are crashing together. You don’t often just sit down and listen to something with your eyes closed anymore. You watch YouTube, and there’s video, and you might add other senses to that like taste, and touch, and smell, and environment, and all these things.

We think about what it means to choreograph all those senses, and score them, and arrange them, both from a logistics standpoint, and from a creative standpoint. Jacob presents me with an opportunity to test out those ideas. By going on tour with artists like Jacob, I gain a lot of useful knowledge.

 

This is the second post in an ongoing series, “Behind the Artwork,” in which MIT researchers who worked closely with CAST Visiting Artists share the personal stories, scientific insights and technological developments that went into developing their collaborative projects.

Posted on December 6, 2016 by Sharon Lacey