Why it's so hard for creative orgs to adopt AI
The real change across creative industries has to happen under the hood
Generative AI tools are evolving fast and loudly. Everyone wants to talk about them because it’s fascinating, flashy, and future-facing. The creative industry has fixated on what these models can generate: sharper images, smoother video, faster copy. But the real transformation isn’t in what they output. It’s in what they demand.
This shift goes beyond technical know-how. It cuts deep into the infrastructure of creative departments across media, film, entertainment, marketing, advertising, and brand. Under the surface, the most forward-looking creative organizations are rewiring how they operate. Not just plugging in new tools, but rebuilding the pipelines and platforms, and finding the people required to support them. Most agencies, especially independents and smaller firms, haven’t caught up and risk getting left behind. They think they’re in a race to pick the best model or AI tool. They’re not. They’re in a race to build the right system around it.
Stacks Are Forming
The AI layer of creative development is consolidating. What were once isolated experiments or a wide smattering of single-use tools (image gen here, voice dubbing there) are coalescing into full-stack, multi-modal systems at big firms. Each with its own philosophy, governance model, and creative aperture.
Adobe is the most fully formed. Firefly has matured into a production-grade platform, with enterprise APIs, usage rights clearly baked in, and a partner model ecosystem to bring it all together. It’s now the creative engine behind Adobe GenStudio and integrated across Photoshop, Premiere, Illustrator, and Workfront. WPP, one of the largest advertising and communications holding companies, is embedding Firefly across its 35,000-person workforce via WPP Open. Publicis, another major advertising firm, is folding it into CoreAI, its proprietary operating system. IBM credits Firefly with cutting its design cycle from two weeks to two days.
Canva’s acquisition of Leonardo.ai represents another stack taking shape in the market: a streamlined, UX-first build aimed at democratizing design. Canva’s interface serves as the control panel, Leonardo as the generative backbone. This pairing doesn’t just compete with Adobe. It caters to a different layer of the market: fast, frictionless, good-enough-at-scale production for internal teams, educators, and content marketers.
Then there’s Runway, which is building vertically rather than horizontally but will serve as a key pillar for many creative stacks. Its Gen-4 model is no longer a simple video tool. It’s part of a stack optimized for IP-secure, studio-grade AI production, and the improvements are staggering. Lionsgate is training custom video models on its film catalog. Other production studios are sure to follow. We’re going way beyond experimentation and moving toward full-scale integration into a narrative production stack.
These ecosystems integrate APIs, asset governance, user roles, and monetization logic. And they demand to be evaluated like infrastructure, not apps.
Middleware Is the Battleground
The model gets the spotlight. But the middleware determines whether it scales.
WPP’s Open, Omnicom’s ArtBotAI, Publicis’ CoreAI, and IPG’s Engine show how the big firms are blowing past the “internal experiment” trap. These platforms are quickly becoming orchestration layers baked into the process. They connect generative output with asset libraries, client templates, DAM systems, first-party data, media buys, and performance dashboards.
This is the connective tissue. It routes outputs to the right people, checks for compliance, aligns with brand rules, feeds back into performance reporting, and re-triggers iteration. It’s less glamorous than the model. But without it, the model has no stickiness.
What sets leading orgs apart isn’t who’s using GPT-4o or Firefly Ultra. It’s how that model connects, triggers, learns, and delivers within their system.
Ask your team: Do we have version control for AI outputs? Are they taggable, retrievable, reviewable? Is there a system for human QA and approval? Can an AI-generated concept become an asset, then a campaign, without rework? Are my creatives equipped and ready to execute in this new environment?
Most agencies don’t have middleware. They have folder chaos, disorganized subscriptions, and Slack threads. That’s not scalable. That’s a messy bottleneck.
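To make the questions above concrete, here is a minimal sketch in Python of what “version control for AI outputs” can mean in practice: a registry where every generated asset is versioned, taggable, retrievable, and gated behind a human approval step. Every name here (`AssetRegistry`, `ReviewStatus`, the `"firefly"` model label) is illustrative, not any vendor’s actual API; a real system would sit on a DAM, not an in-memory dict.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class ReviewStatus(Enum):
    DRAFT = "draft"          # raw model output, unreviewed
    IN_REVIEW = "in_review"  # queued for human QA
    APPROVED = "approved"    # cleared for use in a campaign


@dataclass
class AssetVersion:
    """One versioned, taggable AI output in a shared registry."""
    asset_id: str
    version: int
    prompt: str              # the prompt that produced it
    model: str               # e.g. "firefly" (illustrative label)
    tags: list = field(default_factory=list)
    status: ReviewStatus = ReviewStatus.DRAFT
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


class AssetRegistry:
    """Minimal in-memory stand-in for the versioning layer."""

    def __init__(self):
        self._versions = {}  # asset_id -> list of AssetVersion

    def register(self, asset_id, prompt, model, tags=None):
        """Record a new generation as the next version, never overwriting."""
        history = self._versions.setdefault(asset_id, [])
        version = AssetVersion(asset_id, len(history) + 1, prompt, model, tags or [])
        history.append(version)
        return version

    def latest(self, asset_id):
        return self._versions[asset_id][-1]

    def approve(self, asset_id):
        """Human QA gate: only a person moves an asset past DRAFT."""
        latest = self.latest(asset_id)
        latest.status = ReviewStatus.APPROVED
        return latest

    def find_by_tag(self, tag):
        """Retrieval: outputs are taggable and searchable, not lost in folders."""
        return [v for history in self._versions.values()
                for v in history if tag in v.tags]
```

The point of the sketch is the shape, not the storage: outputs accumulate as versions, humans gate approval, and tags make everything retrievable, which is exactly what folder chaos and Slack threads can’t do.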
Use Case Convergence
The old mental model was tool-specific: a power user on the team playing with popular text-to-image tools, the production team experimenting with a flashy new AI dubbing tool, and so on. It’s not surprising that this is breaking down. After all, what successful org-wide tool adoption ever worked this way? The real shift is convergence.
Gatorade’s consumer personalization engine, for example, isn’t just for fans designing custom bottles. The same back-end powers internal prototyping, merchandise design, and campaign assets. One system. Multiple surfaces.
Netflix is building an internal AI platform that supports dubbing, metadata tagging, promo asset generation, and interface personalization using shared prompt infrastructure and model orchestration. It’s not just about saving money. It’s about building shared creative logic across products.
What this means: Companies that treat AI tools as independent widgets will get buried under workflow sprawl. Agencies that build interconnected systems will unlock compounding returns. Efficiency, yes. But also creative cohesion.
Workflows, Not Just Output
Another issue is the tendency to treat GenAI like a post-production accelerant. Like it’s something to make the end go faster. But the real gains come when AI is embedded earlier.
Mattel uses Firefly to generate Barbie packaging concepts before creative is even locked. That used to be rounds of sketches and back-and-forth. Now, a prompt yields a visual close enough for everyone across design, marketing, and sales to align around. It’s faster, aligned, and effective.
IBM’s 1,600-person design team uses AI to scale ad variants, localizations, and strategic visual testing. It’s the same team. But they’re producing 10x the output because the workflow changed, not because the model did.
The real unlock isn’t AI-as-tool. It’s AI-as-trigger. Integrated upstream, not just tacked on downstream.
Talent and Teams Are the Enablers
You can’t retrofit your org to support this shift. You have to rewire it.
WPP launched a creative tech apprenticeship to build hybrid talent fluent in AI, code, and design. R/GA formed an AI advisory council and committed $50 million to building IP and retraining teams. Agencies are hiring AI creative ops leads, AI QA reviewers, and prompt specialists, not as experiments, but as structural necessities.
This goes way beyond talent acquisition. It’s about anchoring new workflows. Creative QA loops. Governance models for operating in this environment safely. This is how you maintain brand integrity in a generative pipeline.
As some of these examples show, it’s not limited to creative agencies. Netflix, Disney, Mattel, IBM: they’re writing AI fluency into job descriptions, embedding it into team rituals, and tying it to performance outcomes.
If your org still sees AI as a tool to learn rather than a system to staff, you’re not behind. You’re structurally unprepared.
The Playbook
The agencies pulling ahead aren’t doing more experiments. They’re making fewer, better-integrated bets. Their systems are designed around a few clear principles:
Move AI upstream. Don’t wait until the asset is final. Use AI for moodboarding, concepting, stakeholder alignment.
Invest in middleware. Build orchestration layers that tie AI output to brand assets, performance data, and review flows.
Modularize your creative. Treat every asset as a bundle of image, copy, context, language, and intent that’s ready to version.
Staff to the system. Hire roles that translate between prompt and platform, between creative intent and technical execution.
Codify the loop. Build feedback and refinement into the system.
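The last two principles, modular bundles and a codified loop, can be sketched in a few lines of Python. The field names and the `refinement_loop` helper are hypothetical, and a real pipeline would back this with a DAM and review tooling, but the shape is the same: every asset is a versioned bundle, and every round of feedback produces a new version without losing history.

```python
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class CreativeBundle:
    """One modular creative asset: image, copy, context, language, and intent
    travel together and version together."""
    image_ref: str   # pointer into the asset library (hypothetical scheme)
    copy: str        # headline / body text
    context: str     # where the asset runs, e.g. "paid_social"
    language: str    # locale code
    intent: str      # what the asset is meant to do
    version: int = 1

    def revise(self, **changes):
        """Return a new version with the given fields swapped, rest intact."""
        return replace(self, version=self.version + 1, **changes)


def refinement_loop(bundle, feedback_steps):
    """Codify the loop: apply each round of feedback as a new version,
    keeping the full history for review and rollback."""
    history = [bundle]
    for changes in feedback_steps:
        bundle = bundle.revise(**changes)
        history.append(bundle)
    return history
```

A localization pass then becomes one `revise` call that swaps `language` and `copy` while image, context, and intent carry over unchanged, which is what makes an AI-generated concept reusable as an asset and then a campaign without rework.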
The Real Unlock
The creative future isn’t being shaped by better tools. It’s being built through better infrastructure. Quietly. Deliberately. Internally.
Agencies that treat GenAI like a feature will stay trapped in pilot purgatory. Those that treat it like a system will unlock a new scale of creative possibility, measured not in time saved, but in capacity expanded.
Model selection matters. But only if your system is ready to absorb it.
The next frontier isn’t who prompts best. It’s who builds best beneath the surface, in the workflows no one sees but everyone feels.