SAN JUAN, Puerto Rico — The promise of artificial intelligence in advertising has long been one of simplification and speed, a force to unify disparate signals into clear, actionable insights. In practice, however, the technology is often creating the opposite effect, adding new layers of fragmentation and opacity to an already complex media ecosystem.

Instead of a single source of truth, marketers are now contending with intelligence that lives inside “siloed, closed gardens.” The result is a landscape where different large language models provide different answers.

This is the current reality for media agencies navigating the AI landscape, according to Sierra Tobias, evp, Verizon (V-One Media), Zenith. In this video interview with Beet.TV, she explained that while AI is delivering real value, its potential is being hampered by a fragmented infrastructure that demands more human oversight, not less.

Humans still in the driver’s seat

Despite the rapid acceleration of automation, human judgment remains essential to protecting brand strategy and achieving business goals, Tobias said.

“Humans are definitely in the loop,” Tobias said. “If we have garbage in, you get garbage out. So ensuring that we have the right overall objectives and KPIs set, a bot or an agent can’t do that for you.”

Her thinking is that even with a strong brief, the sheer volume of data AI can process requires a human filter to determine what is truly relevant. “Humans are required to decide what to action on and what will be meaningful for our objectives,” Tobias added. “It’s critical that we have humans involved and decisioning on the data throughout end to end.”

The measurement muddle

While AI-powered bid engines are successfully making media buying faster and more agile, the industry’s ability to measure true effectiveness is lagging. Tobias argued that a major roadblock is the absence of a universal standard for assessing performance across channels.

“There are still a lot of gaps, and within measurement specifically, we don’t have that single unified measurement, and that’s really the place that we need to get to as an industry,” she said. The current tech infrastructure is not yet built to support a “unified decisioning tree.”

A fundamental part of the problem, she noted, is that the industry remains reliant on historical performance data. “How we think about (media) right now, and determining effectiveness, is all based on lagging metrics, and we want to be moving forward and thinking about leading metrics,” Tobias said.

Data transparency is critical

Without a unified view, optimization could happen in isolation, often without clear visibility into the underlying data driving the decisions.

“Right now, I’d say AI is creating more complexity in our overall landscape. And the reason for that is the AI is sitting in very siloed, closed gardens,” Tobias explained. “You can ask one LLM a question and you won’t get the same answer across another.”

This opacity creates a risk for agencies and their clients, particularly when it comes to understanding consumer behavior. “It’s critical that we understand what the audiences are doing, how they’re behaving, and what we’re actioning on because if we start to outsource that audience understanding, that’s going to be a big hindrance to media agencies and clients moving forward,” she said.

The solution, according to Tobias, is to work backward from a client’s business goals to construct a bespoke and agile data infrastructure. “We definitely need to start with business outcomes,” she said. “That’s very foundational to how we think about leveraging this technology,” which requires the right data stack to be built from the ground up.

You’re watching coverage from Beet Retreat San Juan 2026, presented by Alliant and TransUnion. For more videos from this series, please visit this page.