AI in 2025: Still Your Fastest Intern, Not Your Creative Director

This summer, Dust on the Wind by Velvet Sundown hit a million Spotify streams in a week. The buzz was real: moody vocals, nostalgic lyrics, beautiful cover art, a backstory about a reunited band in a beach cabin.
Velvet Sundown was entirely AI-generated on Suno: vocals, visuals, lore. No humans, no instruments, no cabin. And everyone could tell.
This case reveals something the hype merchants and the doomsayers both miss: AI has already eaten huge chunks of the creative workflow, but it still can't own the creative act itself. It's astonishingly good at producing, repurposing, formatting, and optimizing. It is far less convincing at sensing cultural nuance, inventing new genres, or deciding when to take a reputationally risky swing.
In other words: AI is a terrifyingly competent intern, not a creative director. And the smartest creative leaders are evolving into creative orchestrators who design prompts, curate outputs, and stitch human judgment into increasingly autonomous pipelines.
How AI accelerates creative work
The productivity gains are undeniable. Runway's Gen-3 Alpha, Luma's Dream Machine, Google's Veo 3 (with image-to-video features rolling into Gemini and YouTube Shorts, carrying SynthID provenance), and Kuaishou's Kling, now globally available, have pushed text-to-video from demo to daily production option. TIME named Runway's Gen-3 Alpha a Best Inventions of 2024 honoree, another sign that gen-AI is in the pipeline, not on the periphery.
For structured, repeatable, mid-skill tasks (grammar correction, summarization, formatting, first-drafting), AI delivers a serious productivity boost. Experts estimate generative AI could add $2.6–$4.4 trillion in annual economic value, and randomized experiments show ChatGPT improves the speed and quality of professional writing, especially for less experienced workers. That's intern energy: fast, eager, and fantastic at scaffolding. And yes, leading LMMs like OpenAI's GPT-5 Pro, Anthropic's Claude 3.5 Sonnet, and Google's Gemini 2.0/2.5 now natively juggle text, image, and audio inputs, pushing this acceleration deeper into real workflows.
We see the same pattern in audio and music. Tools like Suno and Mubert let anyone generate production-ready tracks in seconds: brilliant for drafts, mood boards, temp tracks, and social derivatives, but rarely the stuff of timeless, culture-shifting sound.
Where it still fails: cultural intuition, originality, risk
Yet every attempt to let AI "direct" reminds us where the line still is. Marvel's Secret Invasion caught heat for outsourcing its title sequence to an AI system, and Sports Illustrated's AI-written product reviews blew up its credibility. Both moments underline the same point: absent a strong human hand, AI outputs are recognizably machine-made, culturally tone-deaf, or simply generic.
A growing body of research finds LLM/LMM suggestions homogenize style and compress cultural nuance, often nudging writers toward Western norms and reducing collective diversity of ideas. Translation: AI makes average work better, but it also makes different work more similar. That's the opposite of what a real creative director is paid to do.
This is the "creative ceiling" many of us feel when we scroll through endless Midjourney-style key art or hear yet another AI-generated pop chorus: the models are trained on yesterday's taste and optimized to predict the median next token. They mirror trends rather than set them.
Why agentic AI wonât replace taste
2025 is the year "copilots" turn into agents. Systems set sub-goals, call tools, iterate, test, and refine without a human holding their hand at every step. McKinsey, PwC, and the 4A's are all pointing in that direction.
Even fashion's Vogue Business conveys the same mood: agentic stacks tweak ad visuals, swap copy, and re-budget media in real time. Manus AI and others now market themselves as "autonomous employees," promising end-to-end execution.
That's powerful, but it doesn't make the agent the director. It just means the orchestration layer (what used to be one creative lead and a spreadsheet) is becoming software. Someone still has to define taste, decide which cultural risks are worth taking, own accountability when the agent crosses a line, and know when to throw the playbook away and invent something the model can't autocomplete.
Legal and economic implications
There's more: copyright and attribution questions multiply as AI-generated content becomes indistinguishable from human work. In the EU, staged AI Act obligations now require GPAI providers to meet transparency and copyright duties as of Aug 2, 2025, with additional milestones through 2026–2027. Creatively, the gap widens between those who master AI orchestration and those who resist it entirely. Economically, mid-level creative roles face displacement while senior creative strategists become more valuable.
Companies that figure out the human-AI division of labor first will capture disproportionate market advantages. Those that automate everything risk the Velvet Sundown trap: technically proficient but culturally hollow output.
What creative leaders (and their teams) should do now
- Become prompt/system designers, not just brief writers. Treat prompts, guardrails, and evaluation metrics like living design systems. Build your brand's private corpus so your models stop defaulting to median internet tone. (Yes, that means investing in data governance and retrieval pipelines; a rough sketch of what that looks like in code follows this list.)
- Install "cultural editors" in the loop. Research shows AI pushes toward Western norms and sameness; counter it with local reviewers, diverse reference boards, and red-teamers fluent in irony, slang, taboo.
- Pilot agentic stacks in low-risk parts of the funnel. Let autonomous agents A/B test thumbnails, email subject lines, or performance-marketing variants before you hand them the Super Bowl spot. Use emerging frameworks (McKinsey, PwC, 4A's) to set autonomy levels, escalation paths, and audit logs.
- Upgrade your production toolbox, but keep a human aesthetic bar. Runway's Gen-3 Alpha camera controls, Luma's Dream Machine, and Google's Veo 3, now tied to Gemini/YouTube with visible and invisible watermarks, are perfect for pre-viz, animatics, and fast iteration. The final cut, however, still needs a human to decide what shouldn't be there.
- Train for interpretation, not keystrokes. At JETA, we see the dividing line exactly where the expert in our Q&A drew it: if the task is structured, automate it; if it demands abstract thinking, risk assessment, or creative interpretation, put a human on the hook. That division of labor is only going to sharpen as agentic systems mature.
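To make the "design system" and "audit log" ideas above concrete, here is a minimal sketch in Python. Every name in it (PromptSpec, BRAND_CORPUS, the generate callable) is a hypothetical stand-in, not any vendor's real API; the point is only that prompt versions, brand references, guardrail checks, audit logging, and human escalation can live in reviewable code rather than in someone's head.

```python
import json
import time
from dataclasses import dataclass


@dataclass
class PromptSpec:
    """A versioned prompt treated like any other design asset."""
    version: str
    system: str
    banned_phrases: list


# Stand-in for a brand's private corpus behind a real retrieval pipeline.
BRAND_CORPUS = [
    "Tone: dry humour, short sentences, no exclamation marks.",
    "Never call the product 'revolutionary' or 'game-changing'.",
]


def build_prompt(spec: PromptSpec, brief: str) -> str:
    # Naive "retrieval": include the whole corpus; a real system would rank and select.
    refs = "\n".join(BRAND_CORPUS)
    return f"{spec.system}\n\nBrand references:\n{refs}\n\nBrief: {brief}"


def guardrail_ok(spec: PromptSpec, draft: str) -> bool:
    # Cheapest possible guardrail: a banned-phrase check before anything ships.
    return not any(p.lower() in draft.lower() for p in spec.banned_phrases)


def run(spec: PromptSpec, brief: str, generate) -> str:
    draft = generate(build_prompt(spec, brief))  # `generate` = your model call
    decision = "auto-approved" if guardrail_ok(spec, draft) else "escalated-to-human"
    print(json.dumps({  # one append-only audit-log entry per generation
        "ts": time.time(),
        "prompt_version": spec.version,
        "brief": brief,
        "decision": decision,
    }))
    return draft if decision == "auto-approved" else "[needs human review] " + draft


if __name__ == "__main__":
    spec = PromptSpec(
        version="v0.3",
        system="You write social copy in the brand voice.",
        banned_phrases=["revolutionary", "game-changing"],
    )
    fake_model = lambda prompt: "A game-changing launch, live this week."
    print(run(spec, "Announce the autumn collection drop.", fake_model))
```

In production, the lambda would be an actual model call and the log line would land in an append-only store, but the shape is what matters: version the prompt, retrieve brand references, check the output, escalate on failure, and log every decision.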
Labels, provenance, and the value of "real"
So, is AI the new creative director? Not yet, at least not before AGI. When (or if) general intelligence arrives, the economics and aesthetics will rebalance. What's already clear: provenance will matter more as labeling rules normalize "made-by-AI" disclosures (YouTube Shorts will watermark Veo-powered clips with SynthID; C2PA keeps spreading in imaging pipelines).
In parallel, markets keep rewarding human craft. Vinyl logged ~$1.4B in U.S. revenue in 2024, its 18th straight year of growth. Audiences can feel tactility on screen: Oppenheimer's Trinity test, realized with practical effects rather than CGI, or Top Gun: Maverick strapping actors into real jets. These are choices that signal authenticity, risk, and taste.
The near future is hybrid: industrialized, AI-generated content at scale, plus a counter-trend toward the raw, the amateur, the handmade, the human. As AGI looms, human-made experiences won't disappear; they may become more prized precisely because they aren't machine-made. Creative directors will evolve into creative orchestrators: deciding when to lean on GPT-5 Pro/Gemini/Claude agents and when to go human-only. For as far ahead as we can see, the highest-value work remains the bold, the weird, the culturally precise, and the strategically risky. And that still requires a person in the chair.