When the Plan Deletes Itself: Letting AI Interrupt the Creative Process
- William Hopson
- Jan 16
- 2 min read
Last week was about control.
Very intentional control.
Tone shaping per instrument.
Deliberate decisions.
Everything dialed in exactly where I wanted it.
This week, that plan deleted itself.
Not metaphorically — literally. The original idea unraveled, got replaced, unraveled again, and eventually turned into something that didn’t resemble the plan at all. Which, in hindsight, was the most honest direction the week could have taken.
I’d been traveling. Some of the recording happened out of state. The schedule was off. The headspace was off. And then, in the middle of all that, I got an email asking for an honest review of an AI music platform. Not a first impression. An honest review.
That request didn’t derail the plan.
It exposed how fragile the plan already was.
So instead of forcing structure, I leaned into the chaos.
The Experiment Wasn’t About AI — It Was About Letting Go
The original idea was simple: a math-rock–inspired piece using DADGAD tuning and odd meters. I wanted it to bounce between 5/8 and 7/8, living in that uncomfortable space where repetition becomes hypnotic instead of predictable.
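If you're curious why that pairing feels hypnotic rather than lopsided, here's a minimal sketch (my illustration for this post, not part of the actual session) that prints the click grid for alternating bars of 5/8 and 7/8. Each pair of bars adds up to twelve eighth notes, so the pattern only repeats on the longer two-bar cycle:

```python
# Illustration only: a click grid for bars alternating 5/8 and 7/8.
# "X" marks beat 1 of each bar; "." marks the remaining eighth notes.
METERS = [5, 7]  # eighth notes per bar, alternating

def click_pattern(cycles: int) -> str:
    """Render `cycles` passes through the 5/8 + 7/8 pair as a text grid."""
    bars = []
    for _ in range(cycles):
        for beats in METERS:
            bars.append("X" + "." * (beats - 1))
    return " | ".join(bars)

if __name__ == "__main__":
    # Two full cycles: each 5/8 + 7/8 pair spans 12 eighth notes.
    print(click_pattern(2))
    # X.... | X...... | X.... | X......
```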
I recorded the progression first. No effects. Just the loop.
And then I made a decision that directly contradicted last week’s philosophy:
I let AI choose the effects chains.
Not to replace creativity.
Not to “generate music for me.”
But to interrupt the habits I didn’t realize I was protecting.
The pedal I used wasn’t even purchased for this purpose. It was bought as a backup looper, a contingency plan in case my main looper — notoriously picky about power — gets fried again. I didn’t buy it because it had AI. I didn’t know it had AI. I found that button by accident.
Which feels important.
Some of the most interesting tools don’t enter our workflow through intention. They enter through necessity.
When Tone Stops Being Sacred
Last week, the goal was to make multiple instruments feel like one cohesive voice. This week, I did the opposite.
I used two different acoustic guitar patches the AI generated. One stayed in DADGAD and got panned left. The other was recorded differently and panned right. Together, they didn’t sound like one guitar. They sounded like two guitars in conversation.
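For anyone who wants to see what that left/right split looks like in code, here's a minimal sketch of constant-power panning with stand-in sine waves where the real takes would be. The pan law, sample rate, and placeholder signals are all my assumptions; nothing here reflects what the pedal actually does internally:

```python
# Hypothetical sketch: pan two mono guitar takes into one stereo mix.
# Constant-power panning is an assumption; the post doesn't specify a pan law.
import numpy as np

def pan(mono: np.ndarray, position: float) -> np.ndarray:
    """Pan a mono signal; position = -1.0 (hard left) .. +1.0 (hard right)."""
    angle = (position + 1.0) * np.pi / 4.0  # map position to 0..pi/2
    left = np.cos(angle) * mono
    right = np.sin(angle) * mono
    return np.stack([left, right], axis=-1)

# Placeholder arrays; real takes would be loaded from disk, not shown here.
sr = 44100
t = np.linspace(0, 1.0, sr, endpoint=False)
guitar_a = np.sin(2 * np.pi * 220 * t)   # stand-in for the DADGAD take
guitar_b = np.sin(2 * np.pi * 330 * t)   # stand-in for the second take

mix = pan(guitar_a, -0.8) + pan(guitar_b, +0.8)  # two guitars in conversation
mix /= np.max(np.abs(mix))  # normalize to avoid clipping
```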
That difference mattered.
When tone is no longer sacred, arrangement starts to matter more.
When effects are chosen externally, performance becomes the variable.
When you stop protecting the sound, you start reacting to it.
That shift alone was worth the experiment.
What This Experiment Actually Proved
This wasn’t about AI writing music for me.
It was about AI forcing decisions I wouldn’t have made otherwise.
The structure I planned disappeared.
A new structure emerged.
I never lost control, but I did have to give up my assumptions.
And that might be the most important takeaway.

Creativity doesn’t always come from better planning.
Sometimes it comes from letting the plan fall apart and responding honestly to what’s left.
If you want to see more experiments like this — where tools interrupt process instead of replacing it — that conversation is always open.
Until next time:
may your gear be light,
your latency low,
and your dogs quiet when you track vocals.