Tempo fluctuations in drum programming

Discussion in 'Drums & Percussion' started by Drew, Jan 20, 2020.

  1. bostjan

    bostjan MicroMetal Contributor

    Messages:
    15,219
    Likes Received:
    3,396
    Joined:
    Dec 7, 2005
    Location:
    St. Johnsbury, VT USA
    Songs recorded without a click definitely vary in tempo quite a bit. But I truly don't think it varies in any consistent way.

    Some drummers speed up throughout the course of a song. Some slow down.

    Some drummers rush fills. Some drag them.

    Some drummers speed up the chorus. Some slow down.

    It's all part of what makes the song sound like a living thing. You might say it depends on the context, but I say it creates the context around which the song is created.

    If you want to program drums to sound like a real drummer, add in these quirks, do it somewhat consistently but not 100%, play along with your drum tracks, adjust the characteristics more, erase them, improve, repeat, etc.

    Forgive the shameless plug, but in 2018, I released an album with a prototype AI drum process I made... you can hear it here: https://naegleriafowleri.bandcamp.com/album/life-cycle. It's not a straightforward process at all, but I'll try to explain what I was trying to do and the limitations and what went into it.

    So, this took hundreds of hours, and the result isn't great. With another thousand hours and ten times as much data, I think it'd maybe be more usable. The first thing was to make something that could parse guitar into a tempo to pattern over. There were some utilities already out there that could do that over a steady pattern of pulses, but what about riffing? If I told it a starting tempo and it looked at the audio in real time, it almost always got confused and sped up. So I pinned the tempo at set timestamps and gave it the entire audio clip to digest. Then, working forward and backward from those pinpoints, the tempo would fluctuate much more naturally. My overall goal was to develop enough data to teach it what to look for when mapping tempo, but I never made it quite that far.
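
    To make the pinning idea concrete, here's a minimal sketch (not the actual code; the pin timestamps and BPM values are made up):

    Code:
    # Hypothetical pins: (time_in_seconds, bpm) pairs marked by hand.
    pins = [(0.0, 120.0), (45.0, 126.0), (90.0, 118.0)]

    def tempo_at(t, pins):
        """Linearly interpolate BPM at time t between the surrounding pins."""
        if t <= pins[0][0]:
            return pins[0][1]
        if t >= pins[-1][0]:
            return pins[-1][1]
        for (t0, bpm0), (t1, bpm1) in zip(pins, pins[1:]):
            if t0 <= t <= t1:
                frac = (t - t0) / (t1 - t0)
                return bpm0 + frac * (bpm1 - bpm0)

    print(tempo_at(60.0, pins))  # a third of the way from the second pin to the third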

    Next, by doing Fourier analysis on the detected pulses, it'd sort the notes into categories relative to each other and find repeats to map time signatures. From the pattern of ups and downs in the audio, it would fit snare and kick over a backbeat from a library of patterns. The null state for any time signature was essentially snare on 2 and 4, and it perturbed the state from there.
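
    In spirit, the pulse-finding part was like taking an FFT of an onset envelope and reading off the dominant repeat rate. A toy sketch with a synthetic envelope (the 100 Hz envelope rate and the 120 BPM pulse train are assumptions for illustration):

    Code:
    import numpy as np

    fs = 100.0                       # onset-envelope sample rate (assumed)
    t = np.arange(0, 30, 1 / fs)
    # Synthetic envelope: a narrow pulse every 0.5 s (120 BPM quarter notes).
    onsets = (np.sin(2 * np.pi * 2.0 * t) > 0.99).astype(float)

    # Remove DC, then find the strongest periodicity in the spectrum.
    spectrum = np.abs(np.fft.rfft(onsets - onsets.mean()))
    freqs = np.fft.rfftfreq(len(onsets), d=1 / fs)
    pulse_hz = freqs[np.argmax(spectrum)]
    print(f"dominant pulse: {pulse_hz:.2f} Hz = {pulse_hz * 60:.0f} BPM")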

    All of this was heavy on trial and error. I had to feed the thing the tempo pinpoints, the audio, the library of MIDI fills, and a binary file of backbeats, and then it'd output a MIDI file. If it sounded like garbage, I manually cleaned it up and fed that back into another program that tried to learn what was going on. And, up to this point, everything was on a grid.
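
    The shape of that loop, with every function name below being a hypothetical stand-in rather than a real tool:

    Code:
    def generate_drums(pins, audio, fills, backbeats):
        # Stand-in for the pattern-matching stage: emit gridded (beat, drum) pairs.
        return [(1, "kick"), (2, "snare"), (3, "kick"), (4, "snare")]

    def manually_clean(midi):
        # Stand-in for hand-editing the garbage output in the DAW.
        return [(beat, drum) for beat, drum in midi if drum != "junk"]

    def update_model(model, before, after):
        # Stand-in for the second program that learned from my corrections.
        model.append((before, after))

    model = []
    midi = generate_drums(None, None, None, None)
    cleaned = manually_clean(midi)
    update_model(model, midi, cleaned)  # feed the correction back in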

    I was working on another program to humanize the MIDI track off of the grid, based on the same audio clips, which I thought could just snap to my playing, but I wasn't happy with that result. I wanted something that sounded more organic. What I ended up doing on some of the songs was to push each note's timing somewhere between the grid position and my performed note on guitar (I'd input a percentage, for example 50%), then use the DAW's humanize effect to shift those notes randomly. Again, a lot of trial and error.
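
    The blend itself was simple. A minimal sketch (the 50% blend and the jitter range are just example values; the tick numbers are invented):

    Code:
    import random

    def humanize(grid_ticks, performed_ticks, blend=0.5, jitter=5):
        out = []
        for g, p in zip(grid_ticks, performed_ticks):
            t = g + blend * (p - g)               # between grid and performance
            t += random.uniform(-jitter, jitter)  # DAW-style random shift
            out.append(round(t))
        return out

    grid = [0, 480, 960, 1440]    # quarter notes at 480 ticks per beat
    played = [0, 492, 955, 1462]  # where the guitar notes actually landed
    print(humanize(grid, played))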

    As I tried to develop this program, I played with pushing and pulling the tempo over different "zones" between the timestamps of song sections. Of course, these were based on static audio files for guitar and/or bass. I either never got the hang of it or just couldn't get a good result that way, and eventually edited it out of the code.

    By the time I got to the last song, I quickly programmed drums, input the MIDI for that into my program along with the guitar and bass audio tracks, and had the program itself add or remove notes based on the algorithm it had developed from my feedback. I removed tempo perturbation and had the other program humanize the MIDI track I chose from several outputs.

    The basic flow was: program simple scratch drum tracks; record guitar and bass; pin the starts and stops of sections by noting the timestamps; input the pins, audio, and sometimes the original drum MIDI into the program with the MIDI and binary libraries; put the output MIDI into my DAW and listen to it against my other audio; repeat until I had something I liked; run that through my other program to push the notes closer to my audio; humanize it in the DAW; then do the rest of the processing.

    In other words, all of the tempo dragging and pushing I had played with made the end product sound worse unless it followed my own audio, and if it followed my audio 100%, it sounded even more robotic and lifeless.
     
    c7spheres likes this.
  2. Drew

    Drew Forum MVP

    Messages:
    29,348
    Likes Received:
    4,867
    Joined:
    Aug 17, 2004
    Location:
    Somerville, MA
    Very interesting post, man. Thanks!
     
