Privacy-Safe Retail Measurement: Traffic + CTV

Retail measurement in 2026 runs on a three-signal stack: in-store foot traffic (observed visit behavior from mobility panels), card-spend panels (aggregated transaction signal from banking and payment panels), and CTV attribution (exposure data from Connected TV networks, matched to store visits or card spend). Each signal class is operator-grade on a specific question, weak on others, and overlaps with the others in ways retail measurement teams frequently under-model. Media Rating Council (MRC) measurement standards and the IAB Tech Lab measurement framework have both codified the diligence bar — and post-Apple ATT, privacy-safe-by-construction is the operational floor. This piece is the working stack guide. For the catalog surface see CTV/Smart TV (ACR) Feed, Global Mobility & Location Data, Cross-Channel Measurement, and Retail industry hub; for the companion framing see foot-traffic vs credit card panels.

Key Takeaways

  • Foot-traffic panels (mobility-sourced visits to 26M+ US POIs) measure incidence and absolute visit counts — they answer "who visited this store" at the cohort level, privacy-safe by panel construction.
  • Card-spend panels measure spend magnitude and basket composition at aggregated-category level — they answer "what the visit was worth" and fill in the conversion side that foot-traffic alone doesn't see.
  • CTV attribution (exposure-to-visit or exposure-to-spend lift, from ACR-sourced household-level signal) measures the lift of media on the downstream behavior — it answers "did the campaign change anything" at the cohort level, not at the individual level.
  • The MRC measurement accreditation framework has hardened the diligence bar: invalid-traffic filtration, panel-composition transparency, statistical-significance disclosure, and methodology publication are now the floor, not the upside.
  • Panel overlap matters — foot-traffic and card-spend signal converge on the same underlying retail behavior but from different measurement angles, and the reconciliation (visit rate × basket size = trip revenue) is where coherent retail measurement lives.

Foot Traffic: What It Measures and What It Doesn't

Foot-traffic panels — GSDSI's Global Mobility & Location Data covers 26M+ US POIs with daily visit signal — measure observed visits to physical retail locations. The signal is cohort-level (panel-sourced, privacy-safe by construction), honest about incidence (who went in, at what daypart, from what origin tract), and honest about cross-shopping (which visitors also visited adjacent tenants). What foot traffic does not measure: spend magnitude (a visit to a flagship is worth more than a visit to an outlet, but both count as one visit), return-visit economics (loyalty behavior captured only via longitudinal panels), or basket composition (what the visitor actually bought versus what the category suggests). The operational role: foot traffic is the incidence measurement. Use it for site-level performance comparison, cross-market benchmarking, format-level analysis (strip center vs power center vs grocery-anchored), and category-level trend tracking. It is weak as a standalone revenue measurement because the visit-to-spend conversion varies materially by daypart, format, and season. For the companion framing see foot-traffic vs credit card panels.

Card-Spend Panels: What They Measure and What They Don't

Card-spend panels aggregate transaction signal at the merchant or merchant-category level, sampled from banking and payment networks under privacy-preserving construction (de-identified, aggregated, category-scrubbed). The useful signal is: spend magnitude per trip (basket size), spend velocity (trips per household per month), category mix at the basket level (what was bought), and longitudinal loyalty (return-visit economics by cohort). What card panels don't measure well: raw incidence (a panel misses the 30-40% of transactions done outside the panel's banking and payment coverage), and individual-store resolution for chains where transaction data is rolled up to merchant-category-code rather than store-level. The operational role: card panels fill in the spend-magnitude side that foot-traffic alone misses. Combined with foot-traffic visit counts, card panels let a retailer model visit-to-revenue conversion — and the conversion rate itself becomes a diagnostic of category health. Retailers whose card-panel conversion is below the comparable-format average are leaving trip-level revenue on the table; retailers whose conversion is above are either better at merchandising or pricing the category into up-sell — both are operator-actionable.
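The visit-to-revenue conversion modeling described above can be sketched in a few lines. This is an illustrative sketch, not any vendor's schema: the `SitePeriod` fields, the `format_benchmark` input, and the diagnostic keys are all assumed names, and a real model would segment by daypart, format, and season.

```python
# Sketch: visit-to-revenue conversion diagnostic combining foot-traffic
# visit counts with card-panel spend. All field names and the benchmark
# input are illustrative assumptions, not a specific vendor schema.
from dataclasses import dataclass


@dataclass
class SitePeriod:
    visits: int          # foot-traffic panel visit count for the period
    card_trips: int      # card-panel observed trips, same period
    card_spend: float    # card-panel observed spend, same period


def conversion_diagnostic(site: SitePeriod, format_benchmark: float) -> dict:
    """Compare a site's revenue-per-visit against a format-level benchmark."""
    avg_basket = site.card_spend / site.card_trips   # spend magnitude per trip
    trip_rate = site.card_trips / site.visits        # panel-observed conversion of visits to trips
    revenue_per_visit = avg_basket * trip_rate
    return {
        "avg_basket": round(avg_basket, 2),
        "revenue_per_visit": round(revenue_per_visit, 2),
        "vs_benchmark": round(revenue_per_visit - format_benchmark, 2),
    }
```

A negative `vs_benchmark` is the "leaving trip-level revenue on the table" case; a positive one is the merchandising/pricing up-sell case, both operator-actionable as the section describes.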

CTV Attribution: Exposure-to-Behavior Lift

CTV attribution (exposure-to-store-visit or exposure-to-card-spend lift, sourced from ACR-panel household-level signal) measures the lift of a media campaign on downstream retail behavior. The measurement logic: the ACR panel provides household-level exposure to a specific campaign creative; the foot-traffic or card-spend panel provides downstream behavior signal for those same households (matched via privacy-preserving overlap analysis); the lift is the difference between exposed-household behavior and unexposed-control behavior. MRC-accredited measurement requires panel-composition disclosure, invalid-traffic filtration, statistical-significance reporting, and methodology publication — and retailers should require the same bar from any ACR-sourced attribution vendor. What CTV attribution measures well: campaign-level incremental lift on visits or spend. What it measures poorly: individual-person attribution (the panel is cohort-level, not deterministic), and long-lag conversions beyond the measurement window (typical ACR-to-visit matching windows are 7-30 days; longer lag decays the signal). The operational role: CTV attribution closes the loop from media investment to retail behavior, which is the final measurement piece retail programs need after foot-traffic + card-spend. For the ACR framing see CTV/ACR 101: the signal, the economics, the privacy posture.
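The exposed-versus-control lift logic above reduces to cohort-level arithmetic, and the MRC significance-disclosure requirement suggests pairing it with a standard two-proportion z-test. A minimal sketch, assuming the inputs are aggregated cohort counts (household totals and visiting-household totals) rather than individual-level records; the function and parameter names are illustrative:

```python
# Sketch: cohort-level exposure-to-visit lift with a pooled two-proportion
# z-statistic, usable as a retailer-side sanity check on vendor-reported
# lift. Inputs are aggregated cohort counts (privacy-safe by construction).
import math


def visit_lift(exposed_hh: int, exposed_visitors: int,
               control_hh: int, control_visitors: int) -> dict:
    """Incremental visit-rate lift of the exposed cohort over the control."""
    p_exposed = exposed_visitors / exposed_hh
    p_control = control_visitors / control_hh
    lift = p_exposed - p_control
    # Pooled two-proportion z-statistic for significance disclosure
    p_pool = (exposed_visitors + control_visitors) / (exposed_hh + control_hh)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / exposed_hh + 1 / control_hh))
    z = lift / se if se > 0 else 0.0
    return {"exposed_rate": p_exposed, "control_rate": p_control,
            "abs_lift": lift, "z": z}
```

Note the statistic is computed on cohort rates, never on per-household identity, which is what keeps the measurement cohort-level rather than deterministic.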

The MRC Accreditation Framework as Diligence Floor

The MRC measurement accreditation framework has codified the diligence bar that 2026 retail measurement should run against. The framework's core obligations: invalid-traffic (IVT) filtration (fraud, bots, non-human visits removed at source), panel-composition transparency (demographic and geographic skew disclosed, not hidden behind "representative panel" marketing language), statistical-significance disclosure (confidence intervals and sample-size minimums published per measurement claim), and methodology publication (the specific match-logic, attribution-window, and control-construction methodology disclosed for independent review). A retail measurement vendor that does not publish methodology, or that hides IVT rates behind headline numbers, is under-diligenced. The working principle: a measurement vendor's willingness to publish methodology is a proxy for the rigor of the underlying methodology. Vendors with strong methodology publish it because the publication generates auditor confidence; vendors with weak methodology hide behind "proprietary signal." Require publication as the baseline. IAB Tech Lab's measurement framework provides the complementary industry-standard definitions that retail measurement should align against.
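The four floor obligations above can be encoded as a simple diligence checklist. The obligation keys below paraphrase this section, not official MRC terminology, and the pass/fail convention is an illustrative assumption:

```python
# Sketch: the four MRC-floor obligations as a vendor diligence checklist.
# Keys paraphrase the section above; they are not official MRC language.
MRC_FLOOR = (
    "ivt_filtration_at_source",
    "panel_composition_disclosed",
    "significance_reported",
    "methodology_published",
)


def diligence_gaps(vendor_disclosures: dict) -> list:
    """Return the floor obligations a vendor fails to evidence."""
    return [ob for ob in MRC_FLOOR if not vendor_disclosures.get(ob, False)]
```

An empty return means the vendor clears the floor; any non-empty return is the "under-diligenced" case the section describes, with "proprietary signal" typically surfacing as a missing `methodology_published`.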

Panel Overlap and the Measurement Reconciliation

Foot-traffic and card-spend panels both measure the same underlying retail behavior — they just measure different facets (incidence vs spend). When a retailer stacks both, the reconciliation check is straightforward arithmetic: foot-traffic visit count × card-panel average basket = expected trip revenue. A material gap between the reconciled number and the retailer's internal actuals flags either a panel-composition issue (the panels over- or under-index against the retailer's actual customer mix), an attribution-window mismatch, or a methodology difference that the diligence should surface. The gap is not a failure — it is a diagnostic. Healthy retail measurement programs run the reconciliation quarterly and adjust for composition-index correction factors that bring the panels into agreement with first-party actuals. The operational implication: stacking panels without reconciling them gives a false sense of measurement rigor. Reconciling them — even when the reconciliation shows gaps — is where the measurement bar actually gets set. For the cross-channel framing see Cross-Channel Measurement solution and the companion piece cross-channel measurement for privacy-first advertisers.
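The quarterly reconciliation check is straightforward to operationalize. A minimal sketch under stated assumptions: the tolerance threshold and the ratio-style correction factor are illustrative conventions, and a production version would compute correction factors per format and season rather than one global ratio:

```python
# Sketch of the quarterly reconciliation check: panel-derived trip revenue
# versus first-party actuals. Tolerance and the ratio-based correction
# factor are illustrative conventions, not a prescribed methodology.
def reconcile(panel_visits: int, panel_avg_basket: float,
              first_party_revenue: float, tolerance: float = 0.10) -> dict:
    """Flag the gap between reconciled panel revenue and actuals."""
    expected_revenue = panel_visits * panel_avg_basket   # visit rate x basket size
    gap_pct = (first_party_revenue - expected_revenue) / first_party_revenue
    # Composition-index correction: scale panel signal toward actuals
    correction_factor = first_party_revenue / expected_revenue
    return {
        "expected_revenue": expected_revenue,
        "gap_pct": round(gap_pct, 4),
        "within_tolerance": abs(gap_pct) <= tolerance,
        "correction_factor": round(correction_factor, 4),
    }
```

A gap outside tolerance is the diagnostic trigger: investigate panel composition, attribution windows, and methodology before applying the correction factor, since the factor papers over whichever of those is actually broken.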

The 2026 retail measurement stack — foot traffic for incidence, card panels for spend magnitude, CTV attribution for exposure-to-behavior lift — is a coherent signal chain when the panels are reconciled and the methodology is MRC-grade. Retailers who operate this stack have a clearer picture of category performance, clearer attribution of media investment to retail outcomes, and clearer diagnostic signal on where merchandising or pricing is under-performing. For the catalog surface see CTV/Smart TV (ACR) Feed, Global Mobility & Location Data, Cross-Channel Measurement, and Retail industry hub.

Frequently Asked Questions

Do retail teams need all three panel types to run measurement?
Not all three for every question, but the overlap matters. Foot-traffic answers incidence (who visited); card panels answer spend magnitude (what the visit was worth); CTV attribution answers exposure-to-behavior lift (did the media change anything). Teams running only foot-traffic measure visits without revenue; teams running only card panels measure spend without the incidence denominator; teams running only CTV attribution measure media lift without site-level or category-level diagnostic signal. The stack is complementary.
What does MRC accreditation actually require?
The MRC framework requires invalid-traffic filtration at source (not just headline number correction), panel-composition transparency (demographic + geographic skew disclosed), statistical-significance reporting with confidence intervals and sample-size minimums, and published methodology for independent review. A vendor that hides methodology behind "proprietary signal" or that does not disclose panel composition is under-diligenced. The IAB Tech Lab measurement framework provides complementary industry-standard definitions.
How should retailers reconcile foot-traffic with card-spend panels?
Run the arithmetic: foot-traffic visit count × card-panel average basket = expected trip revenue. Compare to first-party actuals quarterly. A gap flags panel-composition mismatch, attribution-window difference, or methodology gap — not necessarily a failure, but a diagnostic to tune. Healthy retail measurement programs run the reconciliation as a routine check and apply composition-index correction factors when the panels under- or over-index against actual customer mix. For framing see foot-traffic vs credit card panels.
Is CTV attribution privacy-safe at the individual level?
CTV attribution is cohort-level, not individual-level — the ACR panel provides household-level exposure, matched via privacy-preserving overlap analysis to foot-traffic or card-panel behavior cohorts. The lift measurement is statistical (exposed cohort vs unexposed control) rather than deterministic per-person. Post-Apple ATT, cohort-level measurement is the privacy-safe-by-construction baseline. Individual-level attribution claims in CTV should be treated skeptically — they either run on non-privacy-safe signal or overstate the underlying signal's resolution. For the ACR primer see CTV/ACR 101.