Most AI-for-filmmakers headlines fall into one of two camps: existential panic or overhyped promise. But every so often, someone shows you a tool that’s actually useful.
That’s what happened last week when filmmaker Shawn McDaniel demoed Adobe Firefly’s new generative sound effects feature for IndieWire. He loaded a video clip into Firefly and gave it a prompt: “Footsteps on concrete.” Anyone who’s dug through a sound library knows how hard it can be to find the right one. Running? Walking? Stomping? Casual? Frantic? And then there’s the matter of editing the SFX to sync with the characters’ movements.
Here’s what made the demo stand out: McDaniel used his own voice to shape the sound. As the scene played, he mimicked the timing, pace, and intensity of the footsteps. These weren’t recorded as dialogue, but as a kind of performance reference for Firefly’s AI. The tool then generated four synced sound options instantly. (Check out the demo below.)
And the options it created are… good. This “voice reference” feature lets creators direct sound the way they might direct a scene: with rhythm, emotion, and intuition instead of endless search filters.
Capturing that level of precision usually requires custom foley work. But in the video below, you can see Firefly generate sound effects like race cars taking a turn, an elephant trumpeting, and a skier slaloming downhill, using just text prompts and the voice reference tool. The way these quickly generated sounds accurately ramp up, sync up, and change direction is impressive.
The ability to quickly generate synced sound effects could be a game-changer for online video creators who produce content at a rapid pace. More traditional filmmakers might use it to generate temp sounds or experiment during the edit; it’s unlikely to replace the precision and artistry of a great foley artist anytime soon. Still, this voice-driven technology has real potential to evolve.
Just as important: These SFX are commercially safe because Adobe trained Firefly on content it owns or licensed. It’s always nice knowing you aren’t ripping off other artists’ copyrighted work (I’m looking at you, Google, OpenAI, and Meta).
Also new from Adobe: enhanced motion control, style presets, and keyframe cropping; Firefly Boards can now generate video using Runway Gen-4 and Google Veo 3; and a new compositional reference tool helps match style and framing. Read the full details of the Firefly upgrades on the Adobe website.