Episode 298: Adam Stark: Turning Movement into Music with MiMU Gloves

LISTEN TO THE EPISODE:


Scroll down for resources and transcript:

Adam Stark is the Managing Director and Head of Technology at MiMU Gloves, the groundbreaking wearable tech company he co-founded with Grammy-winning artist Imogen Heap. A computer scientist, researcher, and musician, Adam focuses on bridging human movement and digital sound, enabling artists to control music and visuals using expressive hand gestures. MiMU Gloves have been used by performers around the world—from global pop stars to avant-garde creators—revolutionizing the way music is performed and experienced.

In this episode, Adam shares how wearable gesture-controlled tech is reshaping musical performance and expression.

Key Takeaways:

  • Discover how MiMU Gloves turn physical gestures into precise musical and visual control.

  • Learn why transparency and expressiveness in tech-driven performances matter more than ever.

  • Explore the exciting potential of neural interfaces, AR/VR, and the future of immersive music creation.

Michael Walker: Yeah. Alright, I'm excited. Excited to be here today with a new friend, Adam Stark. So Adam, he's the co-founder and Head of Technology at MiMU Gloves, innovating wearable tech that allows musicians to control sound and visuals using hand gestures. He's an expert in music, technology, and human-computer interaction.

His PhD focused on intelligent musical analysis for live performance. He's a collaborator with artists and technologists, and he helps develop tools that enhance expressiveness and creativity in music. So I'm excited to have him on here today to talk a little bit about MiMU Gloves and the tools that he's built for artists to be able to express themselves in innovative ways, and looking forward to hearing his perspective and thoughts on where things are headed right now with technology and what the future of creativity looks like.

So Adam, thank you so much for taking time to be on the podcast today.

Adam Stark: I'm pleased to be here. Yeah. Very nice.

Michael: Awesome. So yeah, maybe we just dive right into it. I'd love to hear a little bit about your background and sort of creating MiMU Gloves and how it was inspired, and what was kind of like the biggest reason that you created the company in the first place.

Adam: Sure. So yeah, my background is—I'm both a musician and a computer scientist. I kind of studied computer science, but I was kind of a musician myself at the same time.

And I then merged those two things. I did my PhD in effectively using computers to learn about live performances and how computers can respond more intelligently and sensitively to artists in live performance.

So can they understand what the beats are and what the chords are and actually interact with musicians in those ways.

And a little bit later on—so the MiMU Gloves themselves—I'll just sort of say what they are to begin with. They are these wireless, wearable, gestural musical instruments and performance tools for artists.

So the idea is really that they allow artists to control electronic sounds with the same degree of expression that they can control acoustic instruments, right? So they've got all of the sort of physicality of acoustic instruments, but all of the amazing power of electronic sound.

So that’s what they are. But to sort of explain then how they came about—they came about from an artist called Imogen Heap. She's reasonably well known. It sort of depends where you're—

Michael: I've seen the glove. I've seen videos. Really cool. Wow. So that's MiMU Gloves?

Adam: Yeah, that’s right.

And so she—this was about 2011. She was—if you've ever seen her live, she's got loads of different things on stage. She’ll have loopers and synthesizers and piano and several other instruments.

And she really wanted a way to be able to access all of this multi-instrument setup that she has, but also in a really kind of expressive way.

And she had this idea about the gloves and she started to pull a team of people around her, starting with a guy called Tom Mitchell at the University of the West of England. And they worked on a sort of prototype, and then that went well and Imogen started to bring a team together—really varied team of textile designers, electronics engineers, computer software people—that’s sort of my specialty.

And this really kind of diverse team came around and we built this first pair of gloves for Imogen, specifically as an artist. So that's where it came from.

Michael: Wow. Super cool. The first thing my mind goes to is—we interviewed Nolan Arbaugh on the podcast a few months ago. Nolan's the first Neuralink patient, so he's quadriplegic and he has a brain interface installed.

He was on the podcast. We created the first song telepathically using his Neuralink and AI.

And what I'm hearing you say right now with the gloves is that with acoustic instruments, we have the ability to express at a very high level of articulation with our hands because we have so many neurons devoted to our hands.

And so you've been able to map an instrument to allow artists to actually express—and probably map these parameters to these gloves so they can bring that level of articulation to electronic instruments.

Adam: Definitely. And I also like to think of it as—the musicians are one kind of group of stakeholders in this relationship, if that makes sense.

But the audience is the other one, right?

I don't want to lay into electronic music tools because loads of them are amazing, but most of them are very tabletop—look down at some buttons and sliders. You're operating a machine; you're not performing.

So for an audience to see somebody physically really interacting—the way we look at a drummer play or the way that we look at a cellist play—for an audience, it's great to see somebody physically interacting with sound rather than operating some kind of machine on a surface.

Which is cool, but it's also been done.

Michael: Yeah, I mean, it kind of makes sense. When you think about expressiveness itself, and the parameters through which humans can perform and express themselves, the voice certainly feels like one of the most primal ones, if not the most primal.

You can really express so much through your voice, but it does seem like a full-body type of expression.

Your hands are a great place to focus, because it really is where—I've seen images that show how our body parts would look if they were sized in proportion to the number of neurons devoted to them.

And our hands are massive. Our faces are big. And then the rest of it's just like—there’s very few neurons devoted to it.

So I’d imagine that by mapping an instrument specifically to the hands, you can really unlock a lot of expressiveness through it.

Adam: Definitely. And artists are doing this with their hands anyway—particularly if they're singing.

Your hands are already there being expressive quite a lot of the time. So often it's about just capturing that sort of expression and allowing people to use it to do interesting things.

Michael: Cool. I'm not gonna lie, as soon as you described the concept of what you've built, I’m like, how can I buy this?

Is this a product that's for sale? Is it custom built? How do these gloves work?

Adam: They are for sale.

We are 12 years into this from the first gloves that we made for Imogen, and now we've just released the second design of these gloves.

They're available now—we launched them literally about a month ago. Wow.

So we're going to be shipping them to the first set of artists who want to use them later this year.

Yeah, we are really excited about that. It’s taken years of learning that we've now put into this new design, and we are really proud of it.

It's awesome. There are going to be so many different artists doing different things. It's going to be great.

Michael: Cool. Well, congratulations.

Yeah, I'm sure a lot of energy and love and resources went into creating, you know, this next design. Out of curiosity, I'm wondering how the gloves map, like on a software basis. I'm assuming, well yeah, I'm sure that you sync these up with an app on the computer.

Is that linked to like, is it like a plugin? Or like, how does it work in practice?

Adam: Yeah, so the gloves measure the bend of your fingers and the orientation of your wrist, and then they send all of that data to the computer by Wi-Fi. The software that receives them is our own piece of software. It's called Glover.

And this basically is a creative tool for gesture. The idea is that each musician can connect their gestures on the one hand to whatever sound or synthesizer or audio effect or whatever it is they want to control on the other.

And they can make these connections. So it's really a kind of mapping tool for making these gesture-to-sound connections.

They can make collections of them as well, save those collections, and then make another one and switch between them. Wow.

So this really gives the artists the power to decide what each of their expressive movements does in terms of sound control.

You can imagine another world where we decided on a simpler version, where a given gesture would always do the same thing, always increase reverb or whatever, and it would be really cool. But then it would become incredibly boring straight away, because once you've seen it, you've sort of seen it.

But now with this tool, it allows the artists to basically define whatever gesture-sound relationship they want.

And very often, if I see artists performing, I have no idea how they're doing what it is they're doing on stage with the gloves, because they've got their own kind of mappings between their gestures and whatever kind of kit they're using—on the other hand, whichever software or music hardware or whatever.

Michael: Mm, interesting. So, am I hearing you right that you're saying that with these gloves you can create custom mappings, so that you can really assign it to how the gestures map to things on the computer? And are you saying that as well—like, does it usually come with one glove, or can it potentially come with two gloves? And then you can use one glove for like context switching, and then the other glove for actually mapping the parameters? Or how does that work?

Adam: Yeah, well, you combine the gestures for both. So one can be kind of like a switch for the other. You might say, “Oh, I want to do something with the roll of my wrist like this when I pinch. But if I'm making a fist, that's going to do one thing, and if I'm doing a one-finger point, well, this is going to do another thing.”

It can basically—you can build these layers of contextual control over your sort of synthesizers, your audio effects, and so on.
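
To make that concrete for readers, here is a minimal, hypothetical sketch in Python of the kind of posture-gated mapping Adam describes: the hand posture (fist, point, open) selects which parameter a continuous wrist roll controls. The posture labels, parameter names, and value ranges are illustrative assumptions for this sketch, not Glover's actual implementation.

```python
# Minimal sketch (assumptions only): posture selects WHICH parameter the
# continuous wrist roll controls; none of these names come from Glover.
from dataclasses import dataclass

@dataclass
class GloveFrame:
    posture: str        # e.g. "fist", "point", "open" (illustrative labels)
    wrist_roll: float   # roll angle in degrees, assumed range -90..+90

# Each posture routes the same wrist roll to a different destination.
POSTURE_TARGETS = {
    "fist": "filter_cutoff",
    "point": "reverb_mix",
    "open": "loop_volume",
}

def roll_to_unit(roll_degrees: float) -> float:
    """Scale an assumed -90..+90 degree roll into a 0.0-1.0 control value."""
    clamped = max(-90.0, min(90.0, roll_degrees))
    return (clamped + 90.0) / 180.0

def map_frame(frame: GloveFrame) -> tuple[str, float] | None:
    """Return (target_parameter, value) for one incoming glove frame."""
    target = POSTURE_TARGETS.get(frame.posture)
    if target is None:
        return None  # unrecognised posture: send nothing
    return target, roll_to_unit(frame.wrist_roll)

print(map_frame(GloveFrame(posture="fist", wrist_roll=45.0)))
# -> ('filter_cutoff', 0.75)
```

Swapping in a different dictionary of targets is, in effect, switching to a different saved collection of mappings.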

Michael: Wow, man. I could imagine that functionality being really useful for expressiveness in music performance, but also just for gestures in our day-to-day life. In terms of smart devices—obviously we have voice activation for some things—but not that level of customization of parameters, where you could actually interface directly with your devices just by creating a specific gesture. Especially when I think of sign language and things like that—it seems like that's a really powerful use case.

Adam: No, definitely. And I think that one of the—obviously, to create something good there is some setup time, because you have to experiment with these gestures and, it doesn't take very long, but you need to learn the muscle memory of it.

But once you've done that, the whole software thing disappears.

And you are able to just really use this muscle memory to access all of this stuff, right at your fingertips.

Michael: That's really cool.

I'd be interested to hear your perspective as an expert at mapping gestures to expressions—and mapping human creativity to computer expressions.

Coming back to that conversation we had with Nolan—we did a very primitive version there. When we say we "created the song telepathically," it was really just him using his Neuralink to go to SUNO to generate a song.

Which is very cool. Very cool.

But it's a little bit different than mapping his Neuralink to all these different parameters and literally creating a sound just with his Neuralink.

I'm curious if you think there's a world where we have that level of binding between something like a neural interface and the tools that you've built.

That an artist could literally just imagine a symphony, and then play a symphony with their thoughts.

Adam: I mean, I do think that's an incredible application. And I know that also, as you've described—for some artists, this is potentially their only way to make music at all.

So I do think it's incredibly impressive. But I also—our focus is kind of largely on trying to make what we call "transparency."

This is sort of what it's known as in the—it's the transparency of action and result, if that makes sense.

One of the problems with a lot of electronic sound is that stuff is happening and it's coming out the speakers, and you can't really see what has caused that to happen. Is it just a recording? Or am I watching a live performance here?

And what we're trying to do with the gloves is to give people a way to make the consequence of their actions incredibly transparent and explicit, so people can see that they’re watching something being created in front of their eyes.

And that's not to say that a brain-interface-type approach is somehow invalid or not worth it. But the focus of our work really is on trying to make that clear. And I think with the brain interface, that becomes quite hidden away. But again, it has its really important place.

Michael: That absolutely makes sense. Yeah, maybe we can just hack into their brains—sorry, dark thought.

It definitely brings up interesting questions around that type of technology and the ethics and security risks and whatnot. But I guess we'll cross that bridge if and when we come to it.

So yeah, I guess maybe in a world like that where you could control things, to visualize it, you would want to have some kind of interface where you could also visualize their brain. Or you could see what they're thinking about.

What I'm hearing you say is that one of the best parts about this tool you've created is that it not only allows you to control those parameters—and often electronic artists are doing amazing things—but when that's not visible, when you're not able to portray it in a way the audience can see, it kind of loses its impact. It loses its effect, because the audience doesn't see anything happening and doesn't know whether it's recorded or not.

Whereas with these gloves, you have the ability to actually perform and actually portray and express with your full body in a way that brings those things to life—like they weren’t able to have before.

Adam: Yeah. I mean, so yeah, on the—I think on doing the sort of brain interface thing, I think the way you described it is great. To make that performative in the way that we're talking about with the gloves, you would add some sort of visualization.

I think that’s how I would approach it. But yeah, it is about showing people that something's being created in front of their eyes.

And this is something we've always slightly had a problem with with the gloves: when someone's performing with them really well, people are like, “Oh, okay, this is interesting. This is cool.”

But when it breaks, they go, “Oh wow, this was actually real. Like this is actually happening.”

And they understand that actually everything they’ve been seeing was true, if that makes sense.

So the sort of paradox was that it was only when it failed publicly that people realized it was actually happening and not just somebody dancing along, if that makes sense.

Michael: Yeah. Yeah. I mean, it's almost a little bit like using the Force—you know, with your hands. If you're using magic or something, then it's almost invisible or intangible. So that moment where you can actually see there's a disconnect there, that makes sense.

That is an interesting idea—even if it's not a brain interface—having a visualizer that shows all the mappings of what parameters are being controlled and how they move in a space, like a 3D space.

I don’t know if that's something you guys have explored or if there are creative ways people could do that with something like Blender—if they also mapped the parameters not only to the music or the audio but to a 3D render or something that moved in space along with those same properties.

Adam: Yeah. No, we’ve got several artists who’ve done exactly that.

Michael: Wow.

Adam: Cool. So they have all of these projections behind them, and they’re able to control all of the visualizations behind them, as well as the music that they're making at the same time.

And other artists have connected them to lighting and to LEDs—all kinds of different applications.

There’s definitely all of this possibility. We've sort of left it really open as a technology so that artists can do anything they want with it, really. And we give them the tools to do that, and they can kind of run with it without us being prescriptive over exactly what it does.

Michael: Wild. That's so cool—being able to map not just the audio, but an actual lighting show or the whole stage. Basically anything that can have a parameter.

So could you remind me—you talked a little bit about the interface between, um, you have a Wi-Fi connection between the gloves that goes to the computer, and then is it—it's like an app or is it a plugin? Or how does that interface with the other applications on the computer?

Adam: Yeah, so our software on the computer receives all this Wi-Fi data from the glove.

You can then connect your gestures to, let's say, a MIDI message, for example, which can then be sent to Ableton Live, or it can be sent to your hardware synthesizer or whatever it is.

So our software just appears like a MIDI interface.
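
As a rough illustration of that MIDI side (not Glover's internals), here is a sketch using the third-party mido library to send a normalised gesture value as a MIDI Control Change message, the kind of message a DAW such as Ableton Live can map to any parameter. The port name and controller number are assumptions.

```python
# Illustrative sketch only: a 0.0-1.0 gesture value sent as a MIDI CC message.
# Requires the third-party "mido" package plus a backend such as python-rtmidi.
import mido

def send_gesture_as_cc(value: float, control: int = 1,
                       port_name: str | None = None) -> None:
    """Scale a 0.0-1.0 gesture value to 0-127 and send it as a Control Change."""
    cc_value = max(0, min(127, round(value * 127)))
    msg = mido.Message('control_change', channel=0, control=control, value=cc_value)
    # With no name given, open_output() opens the default MIDI output port.
    with mido.open_output(port_name) as out:
        out.send(msg)

send_gesture_as_cc(0.75)  # sends CC 1 with value 95 on channel 0
```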

Michael: Cool.

Adam: Or you can connect them to something called an OSC message, which was sort of invented in the ‘90s, I believe.

And it's far superior to MIDI but has failed to replace it.

But it’s like—you can send it wirelessly, you can send all kinds of messages, and it’s really useful to control things like visuals or other technologies that need to be interfaced with.

So you can write some custom software if that’s the kind of thing you want to do.
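
For the OSC route (OSC stands for Open Sound Control), a comparable sketch using the python-osc package, for example to drive visuals or lighting software listening on a local UDP port. The address pattern and port number are assumptions for illustration.

```python
# Illustrative sketch only: the same normalised gesture value sent as an OSC
# message, e.g. to visuals or lighting software listening on localhost:8000.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8000)        # receiver's host and port (assumed)
client.send_message("/visuals/brightness", 0.75)   # OSC address pattern plus one float argument
```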

Michael: Cool. This is kind of top of mind for me right now. We just built out a live production studio for broadcasting, and so I've been exploring tools like Bitfocus Companion and mapping things to control the videos and the recording gear and the setup.

So, I mean, this idea of being able to map the gloves and the gestures to different things you can do—I’m even imagining potentially using that for a live production tool for the cyclorama room that we have downstairs for hosting these virtual events.

Adam: Cool. We've definitely put our emphasis on live performance to begin with. I guess it kind of makes sense because there's a big visual aspect to it.

But there are—I mean, I was talking to a recording engineer a week ago, and he's really keen to use something like this in his studio to do some really expressive automation lanes or to be able to even just interact with his DAW in a more kind of natural way, if that makes sense.

So there's definitely a recording context as well, I think.

Michael: Hmm. Awesome. I'm curious to hear your thoughts—I know you've been really hyper-focused on creating this amazing product, and I'm curious, as we zoom out and look at the future of where things are headed, what your perspective would be on the future of performance or the future of creativity.

In particular as it relates to, I mean, like VR and 3D as well. What are your thoughts in general about the future of expressive performance for artists?

Adam: Yeah, I mean, venturing into 3D space and the idea of virtual environments—I think there's a lot of exciting potential there.

It's obviously—there's this slight disconnect right now between the push to get these headsets out to people and the actual uptake.

I have one myself and I think it's incredible, but the uptake is still not quite there, so if you do an online performance, you normally don't have a critical mass of people using headsets.

Michael: Yeah. It feels like it's not quite there yet in terms of adoption, but you gotta imagine it will be very soon.

If you had to guess, what would you think is the timeframe for when it’s as common as us having a cell phone device?

Adam: Well, without showing my age too much, I remember first trying VR back in the mid-‘90s in the—I think there's a big area called the Troc in London here.

And that’s now, you know, it's 30 years later.

But at the time, you were like, “Wow, this is gonna be—surely within five years we’re all gonna be doing everything every day with this.”

I wonder if, like all good technologies, it isn't going to end up replacing all our interaction with everything—purely because it's so isolating. Once you cut yourself off from the world you're currently in, that kind of makes things very difficult.

So until that problem is solved, it's quite difficult to go in and out of that world in the way that I can with my phone. I can just—I'm here, I'm interacting with someone, and I’m back again. I can even do that in the middle of a conversation with someone, if that makes sense—which is rude, but people do it all the time.

So I think that entering this space might be its own world, and it might be a bit more niche than we think it's going to be.

I think it'll be incredible, but I think we're still waiting to get to the point where that kind of environment is desired as much as the way people interact with their phones, or the way they watch films and other media that are more pervasive in that way.

Michael: Hmm. Makes sense. It's super interesting. I mean, I try to kind of just take stock every once in a while when I'm doing something like this and we're having a conversation remotely.

We have this little portal on our screens that's magically, you know, tunneling through cyberspace across the world. And somehow we're communicating in a way that would have seemed like witchcraft to someone a thousand years ago.

What I'm hearing you say is that with this type of technology, right now there's not this mass adoption. And also, as this technology becomes more pervasive, it also might not be as desirable as just spending time in the "real world" and disconnecting.

I mean, there are a lot of negative consequences or downsides of spending too much time right now in cyberspace or on the internet. We can certainly lose ourselves in it. It can disconnect us from nature and from each other in our communities.

But it is interesting to explore—maybe part of the issue is that we don't have the presence quite there. Most of the time that we are spending on the internet isn't like this. I mean, I feel like this is high-quality connection time on the internet because we're here together, we're sharing space, we're having conversation.

This is some prime connection internet usage. But most internet usage—like 99%—is people posturing on social media and wanting to create an impression on their ego.

And that feels like maybe more of the issue. This feels amazing, like we're able to connect in ways that just wouldn't have been possible before. And it's the same thing with social media too, but—

Adam: I think the whole idea of virtual—the problem, again, is that you're wearing this headset—

But if you could somehow transplant yourself into a virtual space, which I know some of the developers are working on—

Michael: Yeah. Or like Neuralink. You go into those questions again about who do we trust with a chip in our brain too, but also—

Adam: But I think one of the issues is that technology is good when it answers a question in society. And one of the things with VR—VR is undoubtedly incredible technology.

But it's not there yet—it's always been pushed from the technological side. And I think we haven't quite got to the point where it's filling a need in society. Until we get to that point, it's probably going to remain a slightly niche, fringe thing.

So that connection needs to be made. You can see why a phone answers a need in society, as well as being an incredible piece of miniaturized technology.

Michael: Yeah, that's a great point. Like, what's the function? Right now the function is less of a need and it's more of a nice-to-have or it's like, cool, or entertaining.

But how much do we value entertainment or a better experience when the actual communication we need is already handled by the devices we currently have? Yeah, that's a good point.

Adam: Yeah. I think there's also—I mean, I'm really going the whole hog on this—but there's also a tendency from any technology company to explain that the thing they’re building is going to be pervasive and absolutely everyone in the world is going to have their device or need that in their life.

And actually, it's kind of okay to build something that's really cool but quite niche. I mean, it's difficult if that has a multi-billion-dollar upfront investment cost, but—

So, we, for example, when we're talking about the gloves to musicians, people sometimes ask me, “Oh, is this going to replace all music controllers?”

And it's definitely not. Music technology is brilliant because it's a pluralistic, varied space. We’re probably always going to be something that's quite niche for a specific type of musician. And that's, I think, quite an exciting place to be.

I like having a conversation with people on that basis, rather than trying to explain to every musician that they need to sell all their gear and buy our product.

Michael: Right. That makes sense. If you're trying to replace everything—including playing guitars and piano—you’d have to figure out the mechanics of when you feel the resistance from strumming. That's an important variable.

And same thing with playing the keys. Replacing that just—yeah. There's no—

Adam: There's better places to put your effort than trying to replace something as cool as a guitar.

Michael: Yeah, makes sense.

Alright, well, I'm curious, now that you've officially launched version two of your product and it's getting out into the world, what are you most excited about in terms of what's next for you? What's the next big thing that you're thinking about—you're like, “Oh man, that's going to be awesome,” and that you want to work on now?

Adam: A big motivator for me is just seeing artists—what they do with them. Because they always do something with them that I could not have thought of.

That's always a great surprise to me. My favorite thing is just going to a show and seeing them do something with it.

As a little bit of context, when we finished the previous design, that was right before the pandemic. So we shipped out that whole first batch of version-one gloves.

We couldn't travel anywhere. We couldn't take these to shows and try them on people.

So there's been this big gap where we haven't really been able to connect with people. There are lots of them out there, and now people have been doing shows with them since the pandemic finished.

But this time, hopefully there's not an enormous pandemic about to hit us, and we can go and see lots of artists. That's definitely my number one thing.

Michael: Hmm. Awesome. Yeah, I bet that's probably an amazing feeling—kind of like giving birth to this product in the world, and now seeing what people can create with it, the magic they're creating. Being able to see the shows.

Awesome. Well, this has been a lot of fun hearing about the gloves that you made. And I personally am very interested in picking up a pair.

For anyone that's listening to this or watching this right now and is interested in learning more about the gloves or actually picking up a pair for themselves, what's the best place for them to go to learn more?

Adam: Sure. So you should go to mimugloves.com. That’s M-I-M-U-G-L-O-V-E-S dot com.

And we have—you can buy the gloves there, you can watch some videos, you can learn a bit about the history of them. There’s a whole bunch of nerdy documentation if that’s your thing. We have some audio plugins too.

So yeah, please go and check it out. And if you’ve got any questions, just send us a message and we’ll be happy to get back to you.

Michael: Awesome. Well Adam, like always, we’ll put the links in the show notes for easy access.

And thanks for coming on the podcast today. And thank you for the work that you've put into these gloves. It sounds amazing, and I'm looking forward to playing around.

Adam: Brilliant. Well, thanks for having me.