How I Used AI to Create a John Summit EDM Hit

By Morgan Blake

I never thought I’d be writing an article about making EDM with AI, but here we are. As a longtime electronic music fan with absolutely zero production skills, I’ve always envied those DJs who could whip up bangers that get entire festival crowds jumping. John Summit’s energy and tech house prowess particularly impressed me—those infectious grooves, those perfectly timed drops—but I figured creating anything similar would require years of production experience.

Then I discovered the wild world of AI music tools, and everything changed. Here’s how I went from bedroom dreamer to creating a John Summit-inspired track that actually got played at my friend’s house party (and no, they weren’t just being nice).

Starting From Zero

Let’s be clear: when I began this journey, my music production experience consisted entirely of making playlists and occasionally hitting the air drums too enthusiastically while driving. I had Ableton Live installed on my laptop for approximately three weeks before it became a very expensive icon I never clicked.

My first attempts using AI to create music were… interesting. And by interesting, I mean terrible. I prompted an AI music generator with “make a John Summit style drop” and got something that sounded like a washing machine falling down stairs while a kazoo played in the background.

Finding The Right Tools

After some research, I realized I needed a more sophisticated approach. I ended up using a combination of tools:

  1. Suno AI for generating melodic ideas and vocal snippets – this AI tool can create full-fledged songs within seconds from a text prompt, generating multiple tracks with vocals, background music, and instruments (Kripesh Adwani)
  2. ORB Producer Suite for crafting basslines and grooves – this powerful AI tool creates “an infinite number of patterns, melodies, and bass lines” and integrates with most DAWs (Animotica)
  3. Melody Sauce 2 for additional melody generation – it comes with over “300 style settings, covering genres like Hip Hop, Trap, EDM, House, Techno, Pop and R&B” (Productionmusiclive)
  4. LANDR for mastering – an “AI-powered music production platform” offering “a wide array of tools and capabilities for music composers” (EDM Sauce)

The game-changer was discovering that these tools worked best when I didn’t try to make them do everything. Instead, I used them as collaborative partners, guiding them with specific references while adding my own creative decisions.

Deconstructing The Summit Sound

To create something that captured John Summit’s essence, I needed to understand what makes his tracks work. John Summit is “a master of the turntables and a trailblazer in the realm of house music” whose “sonic voyage has centered around the pulsating rhythms of tech house, crafting an electrifying sound” (EDM Wiki) that’s become his signature style.

I learned that John Summit is “an expert in the Tech House subgenre, which combines elements of techno and house music to produce a distinctive sound that is upbeat and danceable” (Viberate). His tracks feature sophisticated productions with bass-heavy sounds that get crowds moving.

I created a reference sheet with timestamps from his tracks, noting things like:

  • The filter sweep in “Human” (a track that became “a No. 1 US dance track” in 2021, per Wikipedia)
  • The bass pattern from “Show Me”
  • The percussion arrangement from “La Danza”

This became my blueprint. Rather than asking AI to “make something like John Summit,” I could request specific elements: “Create a rolling bassline with similar rhythm and tone to the one in ‘Human’.”

The Production Process

Here’s where the magic happened. I started by generating a few bass patterns and drum loops using ORB Producer, selecting the ones that had the right energy. Rather than accepting whatever the AI created, I’d regenerate parts until they felt right, or edit them manually.

For the percussion elements, I applied what I learned about tech house production techniques: tech house beats are typically characterized by “a four-to-the-floor style drum-beat” with “808 and 909 drum samples, groovy bassline, and heavily compressed percussion” (Edmheaven), which is what gives them that signature club sound.
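
If you want to see what that grid actually looks like, here’s a minimal sketch of a one-bar four-to-the-floor loop written out as a MIDI file with the mido Python library. It’s purely my own illustration (none of the AI tools above are involved), and the note numbers just follow the General MIDI drum map, so you’d swap in your own 808/909 samples inside the DAW:

```python
# One bar of four-to-the-floor: kick on every beat, closed hat on the offbeats,
# clap layered on beats 2 and 4. GM drum map: 36 = kick, 42 = closed hat, 39 = clap.
import mido

TICKS_PER_BEAT = 480
NOTE_LEN = 60  # short hits; the drum samples carry their own decay

mid = mido.MidiFile(ticks_per_beat=TICKS_PER_BEAT)
track = mido.MidiTrack()
mid.tracks.append(track)
track.append(mido.MetaMessage('set_tempo', tempo=mido.bpm2tempo(126)))  # typical tech house tempo

events = []  # (absolute_tick, message_type, note, velocity)
for beat in range(4):
    start = beat * TICKS_PER_BEAT
    hits = [(start, 36, 110),                       # kick on the beat
            (start + TICKS_PER_BEAT // 2, 42, 90)]  # closed hat on the offbeat
    if beat in (1, 3):
        hits.append((start, 39, 100))               # clap on beats 2 and 4
    for tick, note, vel in hits:
        events.append((tick, 'note_on', note, vel))
        events.append((tick + NOTE_LEN, 'note_off', note, 0))

# MIDI wants delta times, so sort by absolute tick and convert.
events.sort(key=lambda e: e[0])
last_tick = 0
for tick, kind, note, vel in events:
    track.append(mido.Message(kind, note=note, velocity=vel,
                              time=tick - last_tick, channel=9))  # channel 10 is the drum channel
    last_tick = tick

mid.save('four_on_the_floor.mid')
```

From there it’s just drag-and-drop onto a drum rack; the heavy compression the quote mentions happens on the percussion bus, not in the MIDI.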

I also discovered that “in more groove-oriented forms of house music like deep house and tech house, FX are normally more subtle sweeps” (EDMProd) than in other EDM genres, which helped me select the right atmospheric elements.

For vocals, I fed Suno AI lyrics I wrote that had that perfect blend of meaningful-yet-vague Summit vibe: “Shadows falling, bodies calling, we’re all in, we’re all in.” Not exactly Leonard Cohen, but with the right processing, it sounded legitimate over the track.

The arrangement was where AI really helped. I structured it like a typical tech house track, which I learned are “often based around a rhythmic ‘DJ-friendly’ intro, which leads into a beatless breakdown, followed by a drop where the beats and bass kick in” (Native Instruments).
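
To keep that shape in view while I arranged the stems, I sketched the structure as a simple grid before touching Ableton. The bar counts here are just my own placeholders, not numbers from any of the sources:

```python
# A rough tech house arrangement skeleton: DJ-friendly intro, beatless breakdown,
# then the drop where the beats and bass kick in. Bar counts are placeholders.
arrangement = [
    # (section,              bars, stems that are playing)
    ("DJ-friendly intro",      32, ["drums", "percussion"]),
    ("build",                  16, ["drums", "bass", "vocal chops", "riser FX"]),
    ("breakdown (beatless)",   16, ["pads", "vocal", "filtered synth stab"]),
    ("drop",                   32, ["drums", "bass", "vocal hook", "synth stab"]),
    ("breakdown 2",            16, ["pads", "vocal"]),
    ("drop 2",                 32, ["drums", "bass", "vocal hook", "synth stab"]),
    ("DJ-friendly outro",      32, ["drums", "percussion"]),
]

total_bars = sum(bars for _, bars, _ in arrangement)
minutes = total_bars * 4 / 126  # 4 beats per bar at 126 BPM gives the length in minutes
print(f"{total_bars} bars ≈ {minutes:.1f} minutes at 126 BPM")
```

Seeing the whole track laid out as bars on one screen made it much easier to decide where each AI-generated part should drop out.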

The Failures Were Spectacular

Not everything worked perfectly. My first drop attempt sounded like someone had released a flock of robot seagulls into a factory. Another time, the AI created a bassline so mathematically perfect that it looped without a single variation for four minutes—technically impressive but mind-numbingly boring.

The vocal generator once produced a hook that sounded amazing until my friend pointed out it was saying something that would make your grandmother disown you.

Each failure taught me something about what to adjust in my prompts and how to better guide the AI. It became clear that the best results came from being specific about the feeling I wanted rather than technical details. As SoundGuys’ guide to AI music points out, when crafting prompts, it’s better to use specific descriptions like “chill lo-fi beat with vinyl crackle, soft piano, and a rainy night vibe” rather than vague terms like “lo-fi beat.”
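
Here’s the kind of contrast I mean, written out as code only so the two prompts sit side by side. The generate_track() function is a made-up stand-in for whichever AI generator you’re driving, not a real API:

```python
# Hypothetical wrapper around an AI music generator; the point is the prompt wording,
# not the function, which is only a placeholder.
def generate_track(prompt: str) -> None:
    print(f"Generating: {prompt}")

# Vague: this is basically the prompt that got me the washing machine falling down stairs.
generate_track("tech house drop")

# Specific: describes the feeling plus concrete sonic details.
generate_track(
    "driving 126 BPM tech house drop with a rolling sub bass, tight 909 percussion, "
    "a filtered synth stab that opens up over 16 bars, and a chopped, pitched-down vocal hook"
)
```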

The Breakthrough Moment

After weeks of experimenting, something clicked. I had generated a driving bassline that actually made me involuntarily bob my head. The drum pattern had that perfect bounce. The vocal hook was catchy without being cheesy.

When I added a filtered synth stab that slowly opened up over 16 bars—classic Summit style—I knew I was onto something. I applied what I’d learned about side-chain compression, “a technique where the signal from one sound is used to manipulate the volume (or any other parameter) of a different sound” (Edmheaven), which is essential for creating that pumping effect in tech house tracks.
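
If you’ve never looked at what that actually does to the signal, here’s a rough numpy sketch of the ducking idea. It’s my own toy illustration, not how a real compressor plugin is built; on the actual track I just used Ableton’s Compressor with the kick routed to its sidechain input:

```python
# Toy side-chain ducking: an envelope follower on the kick drives a gain cut on the bass,
# which is what creates the "pumping" feel.
import numpy as np

SR = 44100                      # sample rate
BPM = 126
beat = int(SR * 60 / BPM)       # samples per beat

# Four-on-the-floor kick: a short decaying 50 Hz thump on every beat of one bar.
t = np.arange(beat) / SR
kick_hit = np.sin(2 * np.pi * 50 * t) * np.exp(-t * 30)
kick = np.tile(kick_hit, 4)

# A sustained bass note (G2, roughly 98 Hz) running under the whole bar.
t_bar = np.arange(4 * beat) / SR
bass = 0.5 * np.sin(2 * np.pi * 98 * t_bar)

# Envelope follower: rectify the kick, then smooth it with a one-pole filter.
env = np.zeros_like(kick)
alpha = 0.0005                  # larger = faster attack/release
for i in range(1, len(kick)):
    env[i] = env[i - 1] + alpha * (abs(kick[i]) - env[i - 1])
env /= env.max()

# Duck the bass: the louder the kick, the more the bass is pulled down (up to ~80%).
ducked_bass = bass * (1.0 - 0.8 * env)
mix = kick + ducked_bass        # render `mix` to hear the pump
```

Comparing that mix against kick plus un-ducked bass makes the point quickly: without the ducking, the two fight for the same low end and the groove turns to mud.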

The drop hit, and for the first time, I didn’t immediately cringe. Instead, I turned it up and felt that rush of excitement you get when a track is working.

Testing In The Wild

I named the track “AI Summit” (subtle, I know) and nervously sent it to a few friends who are actual DJs. Expecting polite dismissal, I was shocked when one responded: “Did you make this? It’s actually pretty sick.”

The real test came when my friend played it at a house party. I didn’t tell anyone it was my track, just watched as people reacted. When the drop hit and people actually moved to it—when two guys looked at each other and gave that “this is good” nod—I felt like I’d witnessed a miracle.

What I Learned

This process taught me several things:

  1. AI works best as a collaborative tool, not a replacement for human creativity – the best results come when you use AI to “easily create new sounds” that can “save independent musicians a lot of time and effort during music production stages” (Dittomusic)
  2. Understanding music structure is still essential, even with AI assistance
  3. Specific, detailed prompts get better results than vague requests
  4. The human ear is still the ultimate judge—if it doesn’t feel right, regenerate
  5. Creating music that moves people remains an art, even with technological help

The Future Is Collaborative

I’m not claiming my track rivals anything John Summit has created. He’s spent years mastering his craft and worked his way up from local venues – he “taught himself how to use the equipment with YouTube tutorials” while developing his love for house music in Chicago’s underground scene (Rolling Stone) before eventually leaving his accounting job to pursue music full-time.

What I found fascinating is how AI tools lowered the technical barriers that would have previously made this impossible for someone like me. As SoundGuys notes, today’s AI music tools “are getting ridiculously good and easy to use” whether you want to “make beats, chill soundscapes, or even have an AI sing your own lyrics.”

I’ve since started learning actual music production using Komplete 15 Select Electronic Edition, which includes instruments ideal for making tech house. The AI gets me 70% of the way there, but that last 30%—the human touch, the emotion, the unpredictable creative decisions—that’s where the magic happens.

So if you’ve ever dreamed of creating music but felt intimidated by the learning curve, AI music generators might be your gateway drug. Just remember: they’re at their best when guided by human creativity, not replacing it.

Now excuse me while I finish my next track. My friend’s club night needs a new opener, and I have some ideas involving a Roland TB-303 emulation that I think the AI and I could collaborate on beautifully.

Here is my track below:
