RealEyes Measures Attention Without Emotional Persistence

by Nick Clark | Published March 28, 2026

RealEyes uses webcam-based facial coding and attention tracking to measure how audiences emotionally respond to advertisements, video content, and digital experiences. The platform scores attention, engagement, and emotional valence in real time as viewers watch content, providing advertisers with frame-level emotional response data. The measurement technology is validated and commercially deployed at scale. But each viewing session is analyzed independently, and no persistent emotional state connects a viewer's response to one piece of content with their response to the next. Resolving this requires representing affective state as a deterministic control primitive with fields that persist across interactions and evolve according to governed temporal rules.


1. Vendor and Product Reality

RealEyes, founded in London in 2007 and now headquartered in New York with engineering operations across Europe, is one of the most established commercial vendors in the audience-emotion-analytics category. Its platform applies computer vision to standard webcam feeds collected via opt-in consumer panels and study participants, detecting facial action units, micro-expressions, gaze fixation, head pose, and attention indicators while a viewer watches advertising or media content. The system classifies frame-level emotional state into the canonical affect categories — happiness, surprise, sadness, confusion, disgust, fear, neutral — and emits second-by-second attention curves alongside engagement and emotional-valence scores aggregated across panels.

The deployment scope is real. RealEyes has measured response to hundreds of thousands of advertising creatives across global markets, and its data has been used by major consumer-packaged-goods advertisers, automotive brands, streaming platforms, and global media agencies for pre-test optimization, in-flight creative diagnosis, and copy-testing benchmarks. The pipeline handles webcam variability, lighting heterogeneity, and demographic diversity, and the company has invested in calibration models so that aggregate engagement scores normalize across hardware. The classifier outputs are validated against self-report and physiological reference data, and the company publishes meta-analyses correlating its attention metrics to brand lift, sales lift, and content recall.

Within its scope the product is rigorous. RealEyes is an exemplar of what the analyst community calls "emotion AI for media measurement" — a category that also includes Affectiva (now part of Smart Eye), iMotions, Tobii's media analytics, and a long tail of attention-measurement startups. The competitive frontier is signal richness within a viewing event: more cameras, better classifiers, eye-tracking integration, biometric fusion. RealEyes operates at the leading edge of that frontier. The architectural shape is consistent across the category, however: each viewing session is a self-contained measurement event whose output is a curve over time during the content, aggregated across viewers into creative-level scores. The product is the reference implementation for per-session audience-emotion measurement.

2. The Architectural Gap

The structural property RealEyes' architecture does not exhibit is persistent affective state across content interactions. The platform records, for any given viewing, that a viewer's attention rose at the third second, dipped at the eighth, and recovered with positive valence at the close. It does not maintain — and architecturally cannot retrofit within its current model — a per-viewer affective field that persists between viewings, accumulates from prior exposures, decays under governed temporal rules, and conditions the interpretation of the next viewing. The session is the unit of measurement; nothing connects the sessions into a trajectory.

A viewer who watches three advertisements for the same brand over two weeks has an evolving emotional relationship with that brand. The first ad generated curiosity. The second produced familiarity and mild positive affect. The third should build on the accumulated emotional context. Per-session measurement treats each viewing as independent. The third ad's emotional impact is scored without reference to the trajectory established by the first two, and the platform cannot distinguish between a viewer whose neutral-valence response reflects fatigue from accumulated exposure and a viewer whose neutral-valence response is simply a baseline state with no prior context. The two viewers receive identical scores for behaviorally distinct emotional realities.

The gap matters because the commercial questions advertisers actually ask depend on trajectory rather than snapshot. "Where is this viewer in the emotional arc with our brand?", "what content sequence sustains engagement across the campaign?", and "at what exposure does fatigue cross the threshold from build into burn?" are trajectory questions. Per-session scoring averaged across panels approximates them statistically but cannot answer them at the level of an individual viewer's evolving state. Aggregate analytics are retrospective summaries of snapshots; they are not the dynamics of an emotional system. An average engagement score across three viewings reports the mean response, not the rate of change, the coupling between residual brand warmth and renewed attention, or the projection of where the trajectory leads under continued content exposure.
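The mean-versus-trajectory distinction can be made concrete with a toy Python sketch. The scores and the first-difference slope measure are illustrative assumptions, not RealEyes metrics:

```python
# Hypothetical per-session engagement scores (0-100) for two viewers across
# three viewings. Both viewers have the same mean; the trajectories differ.
viewer_a = [70, 60, 50]  # declining: consistent with accumulating fatigue
viewer_b = [50, 60, 70]  # rising: consistent with accumulating warmth

def mean(scores):
    # Snapshot summary: what per-session analytics report after averaging.
    return sum(scores) / len(scores)

def slope(scores):
    # Average per-viewing rate of change: the simplest trajectory statistic.
    return sum(b - a for a, b in zip(scores, scores[1:])) / (len(scores) - 1)

print(mean(viewer_a), mean(viewer_b))    # 60.0 60.0 — identical snapshots
print(slope(viewer_a), slope(viewer_b))  # -10.0 10.0 — opposite trajectories
```

The two viewers are indistinguishable to a mean-based score and trivially distinguishable to even this crude trajectory statistic; the affect-field substrate described below generalizes that distinction.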

RealEyes cannot patch this from within its current architecture because the platform was designed as a measurement instrument over discrete viewing events, not as a substrate that maintains per-viewer affective state evolving under governed dynamics. Adding a customer database that stores prior session scores does not produce persistent affective state in the dynamical sense; it produces a log of past measurements. Adding ML-based recommendation across the log does not produce governed temporal evolution; it produces a similarity model. The persistence required is an architectural shape — a deterministic control field with named components, asymmetric update rules, governed decay constants, and cross-component coupling — and the measurement architecture does not exhibit that shape.

3. What the AQ Affective-State Primitive Provides

The Adaptive Query affective-state primitive specifies that emotional state in a conforming system be represented as a deterministic control field with named, persistent components updated under governed temporal dynamics. The field is not a classifier output and not a database row; it is a structured object that lives between interactions, evolves under defined rules in the absence of input, and updates asymmetrically when new observations arrive. Brand warmth, attention salience, novelty pressure, fatigue load, valence balance, and trust slope are example fields. Each has a governed update function, a governed decay constant, and a defined coupling to other fields in the system.
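A minimal Python sketch of the field structure described above. The field names follow the examples in the text; the baseline values and decay constants are illustrative assumptions, not values from any AQ specification:

```python
from dataclasses import dataclass

@dataclass
class AffectField:
    value: float           # current field level
    baseline: float        # resting level the field decays toward
    half_life_days: float  # governed decay constant for this field

def make_default_state():
    # One persistent affect state per viewer; it lives between interactions.
    # Constants are assumptions chosen to reflect each field's character:
    # warmth and trust move slowly, attention salience is short-lived.
    return {
        "brand_warmth":       AffectField(0.0, 0.0, half_life_days=84.0),
        "attention_salience": AffectField(0.0, 0.0, half_life_days=7.0),
        "novelty_pressure":   AffectField(0.0, 0.0, half_life_days=14.0),
        "fatigue_load":       AffectField(0.0, 0.0, half_life_days=10.0),
        "valence_balance":    AffectField(0.0, 0.0, half_life_days=21.0),
        "trust_slope":        AffectField(0.0, 0.0, half_life_days=120.0),
    }
```

The point of the structure is that each field carries its own governed constants rather than sharing the cadence of whatever measurement happens to arrive.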

The asymmetric-update property is load-bearing. Positive evidence and negative evidence do not move the same field at the same rate. Brand warmth accumulates slowly under positive exposures and decays gradually in their absence, but a single salient negative event can drop it sharply — an asymmetry consistent with how humans actually form and lose emotional investment. Governed decay means the field returns toward a baseline at a rate set by the field's character, not by the cadence of measurements; warmth that took twelve weeks to build does not vanish because no measurement happened this week. Cross-field coupling means that high accumulated warmth can sustain attention through content that would lose a viewer with no emotional investment, and conversely that fatigue load damps the magnitude of new positive contributions. These dynamics are computable only when emotional state persists as named fields with defined update and decay rules.
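The three dynamics named above — asymmetric update, governed decay, cross-field coupling — can be sketched as follows. The gains and half-life are illustrative assumptions; exponential relaxation toward baseline is one plausible form of governed decay, not a mandated one:

```python
import math

def decay(value, baseline, half_life_days, elapsed_days):
    # Governed decay: exponential relaxation toward baseline at a rate set
    # by the field's character, independent of measurement cadence.
    k = math.log(2) / half_life_days
    return baseline + (value - baseline) * math.exp(-k * elapsed_days)

def update_warmth(warmth, evidence, fatigue_load, gain_pos=0.05, gain_neg=0.40):
    # Asymmetric update: positive evidence accumulates slowly; a salient
    # negative event (evidence < 0) moves the field sharply. Fatigue load
    # damps positive contributions — one cross-field coupling from the text.
    if evidence >= 0:
        return warmth + gain_pos * evidence * (1.0 - fatigue_load)
    return warmth + gain_neg * evidence

# Warmth built over twelve weeks does not vanish in one unmeasured week:
w = decay(value=0.8, baseline=0.0, half_life_days=84.0, elapsed_days=7.0)
print(round(w, 3))  # 0.755 — most of the field survives the gap
```

Note the deliberate gain asymmetry: a unit of negative evidence moves warmth eight times as far as a unit of positive evidence under these assumed constants.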

The primitive is technology-neutral — any signal source, any classifier upstream, any storage substrate downstream — and composes hierarchically: per-viewer affect fields aggregate into per-cohort fields under defined aggregation rules, and per-cohort fields aggregate into per-creative or per-campaign fields. The same field algebra runs at each level. The inventive step is the structural specification of affect as a governed, deterministic, persistent control field rather than as a per-event classification, with the asymmetric-update, governed-decay, and cross-coupling properties as conditions for the field to qualify as affective state in the primitive sense. A platform that produces per-event scores and stores them is not running affective state; a platform that maintains the field across events under those conditions is.
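The hierarchical-composition claim can be sketched with a simple aggregation rule. The mean is an assumption here; the primitive only requires that the rule be defined, and the same field names run at every level:

```python
# Per-viewer field values roll up into a cohort-level field under a defined
# aggregation rule (mean, for illustration). Cohort fields would roll up
# into per-creative or per-campaign fields the same way.
def aggregate(viewer_states, field_name):
    values = [state[field_name] for state in viewer_states]
    return sum(values) / len(values)

viewers = [
    {"brand_warmth": 0.6, "fatigue_load": 0.2},
    {"brand_warmth": 0.4, "fatigue_load": 0.5},
]
cohort = {name: aggregate(viewers, name) for name in viewers[0]}
print(cohort["brand_warmth"])  # 0.5 — same field algebra, one level up
```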

4. Composition Pathway

RealEyes integrates with the AQ affective-state primitive as the upstream signal source feeding governed affect fields rather than as the analytics endpoint. What stays at RealEyes: the webcam-capture pipeline, the facial-coding classifiers, the attention-detection models, the panel infrastructure, the calibration work across hardware and demographics, the creative-testing workflow, and the entire client-facing relationship with advertising and media customers. RealEyes' investment in measurement quality — classifier accuracy, demographic coverage, panel logistics — remains its differentiated layer and is exactly the input that the affective-state substrate needs.

What moves to AQ as substrate: the per-viewer, per-cohort, per-creative affect fields that persist between viewings and evolve under governed dynamics. The integration points are well-defined. RealEyes' per-frame classifier outputs are emitted as observations into the affect-field update layer rather than terminating at the per-session report. The update layer applies asymmetric update rules to the affected fields, governs decay between viewings, and exposes the field state as the substrate against which the next viewing is interpreted. Reports to advertisers shift from "this creative scored X engagement on this panel" to "this creative moved brand warmth by Y from a baseline of Z, with attention sustained against a fatigue load of W, projecting trajectory T under continued exposure cadence C."
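The integration point can be sketched in Python. Every name here is hypothetical — nothing below is a real RealEyes API or an AQ reference implementation; it only shows the shape of "classifier outputs terminate in the update layer instead of the per-session report":

```python
def session_to_observation(frames):
    # Reduce one session's frame-level scores to a single evidence
    # observation. frames: list of (attention, valence) pairs, with
    # attention in [0, 1] and valence in [-1, 1].
    n = len(frames)
    return {
        "attention": sum(a for a, _ in frames) / n,
        "valence": sum(v for _, v in frames) / n,
    }

def apply_observation(state, obs):
    # Asymmetric, coupled update of the persistent fields (illustrative
    # gains): fatigue damps positive warmth gains and rises per exposure.
    damp = 1.0 - state["fatigue_load"]
    if obs["valence"] >= 0:
        state["brand_warmth"] += 0.05 * obs["valence"] * damp
    else:
        state["brand_warmth"] += 0.40 * obs["valence"]
    state["fatigue_load"] = min(1.0, state["fatigue_load"] + 0.05)
    state["attention_salience"] = obs["attention"]
    return state

# A viewer arrives at this session with accumulated state, not a blank slate:
state = {"brand_warmth": 0.30, "fatigue_load": 0.20, "attention_salience": 0.0}
frames = [(0.8, 0.5), (0.6, 0.3), (0.7, 0.4)]
state = apply_observation(state, session_to_observation(frames))
```

The key structural change is visible in the last line: the session's output mutates a persistent per-viewer state that the next session will read, rather than terminating as a standalone score.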

The new analytical surface is campaign-arc design rather than copy-test scoring. Sequencing decisions become trajectory decisions: which creative goes to which viewer at which point in their accumulated affective state, what exposure cadence sustains warmth without crossing fatigue threshold, when novelty pressure is high enough that a new execution outperforms a repeat. These are decisions current per-session analytics cannot inform structurally — they can only suggest by correlation. The substrate makes them computable. RealEyes' commercial position improves rather than erodes: the per-session signal becomes more valuable, not less, because it is now the input to a layer that converts it into trajectory analytics no competitor in the per-session category can match without an equivalent substrate.
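A trajectory decision of the kind described above reduces to reading the persistent field state. The thresholds and decision labels below are assumptions for illustration, not product logic:

```python
def next_creative(state, fatigue_threshold=0.6, novelty_threshold=0.5):
    # Sequencing as a function of accumulated affective state.
    if state["fatigue_load"] >= fatigue_threshold:
        return "rest"           # pause exposure and let fatigue decay
    if state["novelty_pressure"] >= novelty_threshold:
        return "new_execution"  # novelty pressure favors a fresh creative
    return "repeat"             # warmth still building under the current one

print(next_creative({"fatigue_load": 0.7, "novelty_pressure": 0.2}))  # rest
print(next_creative({"fatigue_load": 0.3, "novelty_pressure": 0.8}))  # new_execution
print(next_creative({"fatigue_load": 0.3, "novelty_pressure": 0.2}))  # repeat
```

Per-session analytics can correlate outcomes with exposure counts; only a persistent state makes this decision computable per viewer at decision time.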

5. Commercial and Licensing Implication

The fitting arrangement is an embedded substrate license: RealEyes embeds the AQ affective-state primitive into its measurement platform and offers trajectory analytics as a tier above per-session scoring, sub-licensed to its advertising and media customers as part of an enterprise subscription. Pricing aligns with how brands actually consume measurement — per-tracked-viewer-trajectory or per-campaign-arc rather than per-creative-test — and creates an annuity-shaped revenue layer above project-based copy testing. RealEyes gains three things. First, a structural answer to the "you only score sessions" critique that competitors and clients increasingly raise as multi-touch attribution and sequenced-creative campaigns come to dominate digital media planning. Second, defensible differentiation against Affectiva/Smart Eye and the eye-tracking-fused entrants, by raising the architectural floor from richer-per-session to persistent-across-sessions. Third, a forward-compatible posture as measurement standards bodies converge on cross-touchpoint emotional-impact frameworks.

What the customer gains: viewer-level emotional trajectory rather than panel-level snapshot averages, campaign-arc design tools rather than per-creative diagnostics in isolation, and decay-governed attribution that survives the gaps between exposures rather than collapsing to last-touch heuristics. The trajectory chain belongs to the advertiser's data taxonomy, not RealEyes' database, so trajectory history is portable across vendors — which paradoxically makes RealEyes stickier, because its measurement quality is what differentiates its access to a substrate the customer values. The honest framing: the AQ primitive does not replace audience-emotion measurement; it gives audience-emotion measurement the persistent substrate it has always needed and never had, converting a commodity-trending classifier business into a defensible analytics platform anchored on a structural property no competitor can replicate by adding signals.
