When Signals Are No Longer Reliable: The Changes Taking Place in Programmatic Advertising Transactions

April 24, 2026

Many DSP and SSP teams run into a puzzling situation when reviewing their placement and monetization data.

Some traffic appears fine based on surface-level data—clicks, dwell time, and even some conversion metrics fall within normal ranges. However, when this traffic is analyzed over a longer period or within the context of overall ROI, it becomes clear that these seemingly normal signals haven’t generated corresponding real value.

Initially, this issue is often attributed to models, strategies, or the traffic structure itself. But as more teams investigate further, they gradually realize: the problem doesn’t necessarily lie with one side—it may stem from the "signals" within the transaction chain itself.

From a transactional perspective, programmatic systems rely on "signals," not "traffic"

In the programmatic ecosystem, DSPs, SSPs, and ADXs each play distinct roles: SSPs organize and expose publisher inventory, DSPs bid and optimize against campaign objectives, and ADXs sit between them, matching supply with demand and routing requests.

However, regardless of which side you're on, what truly drives decision-making isn't "traffic volume" but rather the "quality of signals carried by the traffic during transactions."

A bid request moving from SSP to ADX and then to DSP essentially represents the transmission of a set of signals:

- Request context (device, environment, placement)

- User-related features (identifiable or modeled information)

- Available behavioral and historical data

DSPs use these signals to make bidding decisions, and subsequent clicks and conversions feed back into the model for continuous optimization. In essence, the entire ecosystem revolves around "signals" rather than "traffic."
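
To make this concrete, the signal bundle inside a bid request can be sketched as a small data structure. This is purely illustrative: the field names are assumptions loosely inspired by OpenRTB-style requests, not any specific platform's schema.

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative signal bundle for a single bid request. All field names are
# hypothetical, loosely inspired by OpenRTB concepts; no real schema implied.

@dataclass
class RequestContext:
    device_type: str              # e.g. "mobile", "ctv", "desktop"
    os: str                       # operating system reported by the device
    placement_id: str             # where the ad would render
    app_or_site: str              # the property originating the request

@dataclass
class UserFeatures:
    user_id: Optional[str]        # identifiable ID when present, else None
    modeled_segments: list[str] = field(default_factory=list)  # inferred interests

@dataclass
class BehavioralHistory:
    recent_clicks: int = 0        # prior observed clicks for this user/context
    recent_conversions: int = 0   # prior observed conversions

@dataclass
class BidRequestSignals:
    context: RequestContext
    user: UserFeatures
    history: BehavioralHistory
```

Every bidding decision downstream consumes this bundle, so if any layer of it is polluted at the source, every bid that follows inherits the bias.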

IVT issues are beginning to affect the "signal layer"

Invalid traffic (IVT) has always existed in the industry. Third-party organizations like Pixalate have long monitored IVT rates across various channels.

However, at this stage, IVT is shifting from a "traffic issue" to a "signal issue." This is especially true of more sophisticated invalid traffic (SIVT), which can mimic "seemingly reasonable" behavioral characteristics: click-through rates within normal ranges, natural-looking dwell times and scrolling behavior, and in some cases even conversion-like traits.

In a typical programmatic process, signals flow along the following path: SSP → ADX → DSP → user behavior → data feedback → model update. If the signals carried by certain inventory are biased, this bias won’t remain confined to a single node; instead, it will be amplified throughout the entire chain:

From a pathway perspective

DSPs adjust bidding strategies based on these signals, amplifying similar traffic patterns, gradually tilting budgets toward these paths, and reinforcing model judgments with feedback data.

From a results perspective

Certain inventory continues to receive higher bids in transactions, maintaining stable intermediate metrics (CTR, engagement), but overall conversion efficiency or real value declines.

This phenomenon is fundamentally no longer just an IVT issue—it’s the manifestation of signal pollution within the trading system.
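
A toy simulation makes the amplification visible. It is purely illustrative and not any vendor's system: a naive bidder reallocates budget toward whichever inventory source shows the higher observed CTR, while one source emits inflated click signals that convert to almost no real value.

```python
import random

# Toy model: two inventory sources. Source A is genuine; source B emits
# inflated click signals (SIVT-like) but converts poorly. The bidder
# reallocates budget toward whichever source shows the higher observed CTR,
# so the polluted signal is reinforced round after round.

random.seed(0)
SOURCES = {
    "A": {"true_ctr": 0.02, "true_cvr": 0.10},   # honest inventory
    "B": {"true_ctr": 0.06, "true_cvr": 0.001},  # clicks faked, little real value
}

share = {"A": 0.5, "B": 0.5}  # initial budget split
for round_no in range(5):
    observed, conversions = {}, {}
    for name, s in SOURCES.items():
        impressions = int(10_000 * share[name])
        clicks = sum(random.random() < s["true_ctr"] for _ in range(impressions))
        convs = sum(random.random() < s["true_cvr"] for _ in range(clicks))
        observed[name] = clicks / max(impressions, 1)
        conversions[name] = convs
    # Feedback loop: budget follows observed CTR, not real value.
    total = sum(observed.values()) or 1.0
    share = {name: ctr / total for name, ctr in observed.items()}
    print(round_no, {k: round(v, 2) for k, v in share.items()}, "convs:", conversions)
```

Run this and the budget share shifts toward source B while total conversions fall: the intermediate metric looks healthy at every step, which is exactly the pattern described above.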

AI hasn’t changed the problem, but it has accelerated its spread

With AI widely applied on the DSP side, bidding and optimization have become highly automated. Real-time bidding, automatic bidding, and dynamic budget allocation allow systems to make large-scale decisions in extremely short timeframes.

But this efficiency gain also brings a structural change: once signals deviate, they are amplified faster and on a larger scale. The reason isn’t complicated—models rely on historical data for learning, update frequencies are high, adjustment speeds are fast, and decision paths form closed loops. In such systems, the quality of the data directly determines the direction of optimization.

Precisely because of this, as the share of automated traffic keeps rising, the industry's focus has shifted from "traffic volume" to a more critical question: can the signals this traffic carries through the transaction chain be trusted?

The problem needs to be addressed before the transaction

Under this shift, the role of ADX is undergoing subtle changes. In the past, ADX was mostly seen as a platform providing traffic aggregation and distribution, improving fill rates and revenue. But when the issue shifts from "traffic" to "signals," merely facilitating transactions is no longer sufficient.

Once polluted signals enter the feedback loop, the model has already been influenced. This means the most effective control point has to move earlier, to the pre-bid phase, where signals are judged and filtered before bidding decisions are made.

Taking Bidnex as an example: Bidnex's self-developed AI real-time monitoring system evaluates traffic authenticity and outputs a quality score within milliseconds for each ad request. This score feeds directly into the bidding engine: high-quality traffic earns higher bids, low-quality traffic receives lower bids or is dropped outright, and invalid traffic is filtered out during the bidding phase and never enters subsequent processes.
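
As a minimal sketch of what wiring a quality score into bid logic can look like (the scoring features, thresholds, and multipliers below are hypothetical assumptions, not Bidnex's actual system):

```python
# Hypothetical pre-bid quality gate. The feature names, thresholds, and
# penalties below are illustrative assumptions, not a real system's values.

DROP_THRESHOLD = 0.2    # below this, the request is treated as invalid
DEMOTE_THRESHOLD = 0.6  # below this, bids are discounted

def score_request(request: dict) -> float:
    """Toy stand-in for a real-time quality model; returns 0.0..1.0."""
    score = 1.0
    if request.get("datacenter_ip"):   # traffic originating from hosting ranges
        score -= 0.5
    if request.get("ua_anomaly"):      # inconsistent or spoofed user agent
        score -= 0.5
    if request.get("click_burst"):     # abnormal click cadence
        score -= 0.4
    return max(score, 0.0)

def decide_bid(request: dict, base_bid: float):
    """Return an adjusted bid price, or None to skip the auction entirely."""
    quality = score_request(request)
    if quality < DROP_THRESHOLD:
        return None                    # invalid traffic never reaches the auction
    if quality < DEMOTE_THRESHOLD:
        return base_bid * quality      # low quality: discounted bid
    return base_bid                    # high quality: full bid

print(decide_bid({"datacenter_ip": True, "click_burst": True}, 2.0))  # None
print(decide_bid({"ua_anomaly": True}, 2.0))                          # 1.0
print(decide_bid({}, 2.0))                                            # 2.0
```

The key design choice is that filtering happens before money moves: a dropped request never produces the clicks and conversions that would otherwise feed the model.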

This approach of front-loading quality assessment into the decision-making phase is becoming a standard feature of the new generation of ADXs.

From "facilitating transactions" to "managing signals," the focus of ADX capabilities is shifting

Some ADXs are adding new layers of capability on top of their original transaction-facilitation functions.

One example is multi-layer cross-validation. A single in-house detection model has blind spots; a more effective approach is a dual-layer architecture combining self-developed algorithms with third-party verification.

Building on real-time recognition with its proprietary models, Bidnex works closely with internationally recognized anti-fraud platforms such as HUMAN and Pixalate, introducing MRC-accredited IVT detection to form a complete "detection → verification → auditing" chain that combines real-time performance with objective detection standards.
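
Conceptually, a dual-layer check behaves like the sketch below. The scores, the flag, and the combination rule are illustrative assumptions; real integrations with platforms like HUMAN or Pixalate run through their server-side APIs, which are not shown here.

```python
from enum import Enum

# Illustrative dual-layer verdict combiner. The in-house probability and the
# third-party flag are hypothetical inputs standing in for a proprietary
# model and an external MRC-accredited IVT feed.

class Verdict(Enum):
    VALID = "valid"
    SUSPECT = "suspect"   # route to sampling / manual audit
    INVALID = "invalid"   # exclude from auction and from training data

def combine(in_house_ivt_prob: float, third_party_flagged: bool) -> Verdict:
    if third_party_flagged and in_house_ivt_prob > 0.5:
        return Verdict.INVALID   # both layers agree: high confidence
    if third_party_flagged or in_house_ivt_prob > 0.8:
        return Verdict.SUSPECT   # only one layer fires: verify before acting
    return Verdict.VALID

print(combine(0.9, True))   # Verdict.INVALID
print(combine(0.9, False))  # Verdict.SUSPECT
print(combine(0.1, False))  # Verdict.VALID
```

The value of the second layer is independence rather than redundancy: an external, accredited standard keeps the in-house model's judgments auditable.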

There's also the real-time feedback loop. When the system detects that certain conversion signals come from IVT, those signals need to be removed from, or down-weighted in, the model's training data to prevent the model from learning incorrect patterns.

Bidnex cleans up conversion signals marked as IVT while continuously monitoring the quality metrics of model inputs and outputs, ensuring the model keeps optimizing on clean data. This step is still missing in many current solutions.
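
A minimal sketch of that cleaning step, assuming a hypothetical ivt_verdict label attached to each logged conversion (the label values and weights are illustrative, not a real pipeline):

```python
# Illustrative cleanup of conversion logs before model training.
# The "ivt_verdict" field and the weights are hypothetical assumptions.

conversions = [
    {"request_id": "r1", "value": 4.0, "ivt_verdict": "valid"},
    {"request_id": "r2", "value": 4.0, "ivt_verdict": "invalid"},
    {"request_id": "r3", "value": 4.0, "ivt_verdict": "suspect"},
]

WEIGHTS = {"valid": 1.0, "suspect": 0.25, "invalid": 0.0}

def clean_for_training(rows):
    """Drop IVT-flagged conversions and down-weight suspect ones."""
    out = []
    for row in rows:
        w = WEIGHTS[row["ivt_verdict"]]
        if w == 0.0:
            continue                 # never let confirmed IVT teach the model
        out.append({**row, "sample_weight": w})
    return out

for row in clean_for_training(conversions):
    print(row["request_id"], row["sample_weight"])  # r1 1.0, r3 0.25
```

Keeping confirmed IVT out of the training set is what prevents the feedback loop described earlier from reinforcing polluted patterns.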

These capabilities don’t directly alter "traffic volume," but they influence a more crucial variable: the quality of signals entering the DSP decision system.

Conclusion

Based on current industry practices, a consensus is gradually forming: the core issue in programmatic advertising is no longer just about "how to acquire more traffic"—the key lies in ensuring the reliability of signals flowing through transactions.

This reflects a deeper underlying change:

- SSPs don't just provide inventory; they also need to focus on signal quality

- DSPs don't just optimize bids; they rely on cleaner input data

- ADXs don't just facilitate transactions; they are starting to take on signal filtering and reconstruction

The scale of programmatic advertising continues to grow, but its underlying logic is changing. As ad placement and monetization increasingly depend on automated systems, what truly determines outcomes is no longer just algorithmic capability or traffic volume, but whether the data relied upon by the system is trustworthy.

In this context, the ADX is no longer just connecting the supply and demand sides: it is becoming the infrastructure layer that connects "signals" with "decisions," and that layer is growing increasingly critical.

Bidnex Team