The Role of AI in Jazz

I think AI is having a genuinely good effect on jazz music!

There’s a lot of anxiety around AI in music at the moment, and jazz is no exception. If anything, the reaction from jazz musicians tends to be particularly negative.

I understand why. Jazz is deeply human music. It’s about interaction, identity, listening, and presence in the moment. The idea of something artificial stepping into that space feels, at best, uncomfortable.

But I think it helps to separate two very different ideas: AI as a performer, and AI as a tool.

As a tool, I believe AI is already having a genuinely positive effect on jazz—and on how we learn it.

In a previous article, I wrote about using AI for stem separation, and how useful that can be for practising musicians. Being able to take a recording and isolate exactly what a pianist is doing with their left hand, or hear clearly what the bassist is playing under a busy head, simply wasn’t possible before. If you want to play along with some of the greatest musicians who ever lived, properly isolated, it’s an extraordinary resource.
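For anyone curious to try this, open-source tools make it fairly accessible. As a rough sketch — using Demucs, one widely used open-source separation model, as an example of my own choosing rather than a specific recommendation from that earlier article — a basic workflow from the command line looks like this:

```shell
# Install Demucs, an open-source music source separation model
# (requires Python; model weights download on first run)
pip install demucs

# Split a recording into four stems: drums, bass, vocals, other
demucs my_recording.mp3

# Or separate a single part from everything else — handy for
# "minus one" practice tracks, e.g. the band without the drums:
demucs --two-stems=drums my_recording.mp3
# Output is written under separated/<model_name>/my_recording/,
# here as drums.wav and no_drums.wav
```

The resulting files can be loaded into any DAW or practice app, so you can loop a passage, slow it down, or play along with just the rhythm section.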

Recently I’ve been experimenting further with this approach while working on Little Sunflower. For this piece, I used AI tools to remove the drums and reduce some of the density in the original recording, leaving a much more open space to play in.

What struck me immediately was how clearly the time feel came through. With less going on, you can really hear how “in the pocket” the playing is—and that changes how you respond to it. It’s not something you necessarily notice in a full mix, but once it’s exposed, it becomes impossible to ignore.

That kind of detail matters. It affects how you phrase, how you place notes, and how you listen.

Transcription is another area that has changed. The mechanical side of getting notes down is faster now, which means more time can be spent on what really matters—interpretation, phrasing, feel, and making the music your own.

Then there are AI accompaniment tools. Time feel and interaction are at the heart of jazz, yet not everyone has a drummer and bassist available on demand. Being able to work on these things in a meaningful way, even when practising alone, is a real advantage.

Audio restoration is also worth mentioning. Older recordings, sometimes degraded or unclear, can now be brought back with a level of clarity that reveals details which were previously hidden. That’s not just a technical improvement—it’s a way of reconnecting with the tradition in a deeper and more direct way.

This is where I think the real value lies. These tools don’t replace the work. They don’t replace listening, or developing a personal voice, or the challenge of playing in real time with other musicians. If anything, they give us better access to the material that helps us grow.

The concern seems to come from imagining AI as a composer or performer—something that generates music in place of people. That’s a different and more complicated conversation, and one that raises valid questions.

But that’s not what I’m talking about here.

Used in the right way, AI isn’t making jazz. It’s helping musicians hear more clearly, understand more deeply, and engage more directly with the music.

And that, to me, is a good thing.

If you’re interested, I’ve written more about the technical side of this in my earlier article on AI stem separation.