Elomia's Empathy Resets Every Session
by Nick Clark | Published March 27, 2026
Elomia addressed a real access problem: millions of people need mental health support and cannot get it. The platform provides empathetic, CBT-informed conversation through an AI agent that is available around the clock. But Elomia's empathetic model of each user is reconstructed from prior conversation data rather than maintained as persistent affective state. The agent remembers what was said. It does not remember how it felt about what was said. Resolving this requires affective state as a deterministic control primitive with governed temporal dynamics.
What Elomia built
Elomia's contribution is primarily one of access. Mental health support that requires scheduling, insurance, and geographic proximity excludes most people who need it. Elomia provides an always-available conversational agent trained on therapeutic frameworks that can deliver supportive interaction, mood tracking, and guided exercises at any hour. The system handles crisis detection, provides appropriate escalation paths, and maintains conversational quality that users report as genuinely helpful.
The architecture uses language model generation conditioned on therapeutic training data, user history, and mood assessments collected through structured check-ins. Each session begins by reviewing recent interactions and the user's current self-reported state. The agent adapts its approach accordingly, offering different interventions based on what the user reports feeling.
The gap between remembered facts and felt continuity
The structural limitation appears in the texture of longitudinal care. A user working through a difficult breakup over six weeks generates an emotional arc with specific dynamics: the initial shock, the cycling between anger and sadness, the gradual acceptance interrupted by setbacks, the slow rebuilding of emotional baseline. A human therapist who has tracked this arc enters each session with a felt sense of where the patient is emotionally, not just a summary of what they reported last time.
Elomia enters each session with retrieved facts. The user was sad last Tuesday. They reported anger on Thursday. They seemed better on Saturday. These facts inform prompt construction but do not constitute persistent emotional state. The difference matters because emotional trajectories have dynamics that factual retrieval cannot capture. Sadness that has been slowly intensifying over three weeks despite positive self-reports suggests a different clinical picture than sadness that spiked once and has been declining.
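To make the trajectory point concrete, here is a toy sketch: two invented sequences of weekly sadness scores with the same average but opposite dynamics. Factual retrieval of individual sessions sees the same facts; a trend over the whole arc does not. The data and the helper are illustrative, not anything from Elomia's system.

```python
def trend(scores: list[float]) -> float:
    """Least-squares slope of scores over equally spaced sessions."""
    n = len(scores)
    mean_x = (n - 1) / 2
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Both arcs have the same mean sadness (0.5) but opposite slopes.
intensifying = [0.3, 0.4, 0.5, 0.6, 0.7]   # slowly worsening
spike_decline = [0.9, 0.6, 0.4, 0.3, 0.3]  # spiked once, now recovering
```

Here `trend(intensifying)` is positive and `trend(spike_decline)` is negative, even though any single retrieved session from either arc could look identical.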
Users experience this gap as a subtle but persistent sense that the agent does not truly know them emotionally. It responds appropriately to what they say in the moment. It does not carry forward the emotional weight of previous sessions in a way that shapes its current posture.
Why mood tracking is not affective state
Elomia includes mood tracking features where users rate their emotional state on scales. This data is valuable for self-reflection and for identifying trends. But it is the user's self-report about their emotional state, not the agent's persistent model of it. These are fundamentally different things.
A persistent affective model maintains multiple emotional fields that evolve according to computational rules: distress decays at one rate, trust accumulates at another, engagement fluctuates with interaction patterns. The model updates not only when the user explicitly reports their mood but continuously, based on conversational signals, session frequency, response patterns, and the interaction among the emotional dimensions themselves. Self-reported mood is one input to this model. It is not the model itself.
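A minimal sketch of such a model might look like the following. The field names, half-lives, and blending weight are illustrative assumptions for exposition, not Elomia's actual architecture.

```python
from dataclasses import dataclass, field

# Hypothetical persistent affective model: each named field carries a
# value in [0, 1] and its own decay half-life (in days). Constants are
# illustrative assumptions, not a real system's parameters.

@dataclass
class AffectiveState:
    fields: dict = field(default_factory=lambda: {
        "distress":   {"value": 0.0, "half_life": 3.0},   # acute, fast decay
        "trust":      {"value": 0.0, "half_life": 60.0},  # slow to change
        "engagement": {"value": 0.0, "half_life": 14.0},
    })

    def decay(self, elapsed_days: float) -> None:
        """Exponential decay toward zero between sessions."""
        for f in self.fields.values():
            f["value"] *= 0.5 ** (elapsed_days / f["half_life"])

    def observe(self, signals: dict) -> None:
        """Blend in observed conversational signals (each in [0, 1])."""
        for name, obs in signals.items():
            f = self.fields[name]
            f["value"] = min(1.0, f["value"] + 0.3 * obs)
```

The key property is that `decay` runs on wall-clock time between sessions, so the state evolves whether or not the user checks in; a self-reported mood is just one more signal passed to `observe`.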
What persistent affective state enables
With affective state as a deterministic control primitive, Elomia's agent would maintain named emotional fields for each user that persist between sessions and evolve according to governed rules. When a user who has been gradually improving suddenly cancels two sessions, the agent's concern field increases not because of a rule that says cancellations are concerning, but because the engagement field dropped while the vulnerability field was still elevated, and that combination triggers a state transition in the agent's therapeutic posture.
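The cancellation example can be sketched as a cross-field trigger. The thresholds and posture labels below are invented for illustration; the point is that the transition fires on a combination of fields, not on a single hand-written rule about cancellations.

```python
# Hypothetical cross-field state transition: posture changes when an
# engagement drop coincides with elevated vulnerability. Thresholds
# and labels are illustrative assumptions.

def therapeutic_posture(engagement: float, vulnerability: float,
                        prev_engagement: float) -> str:
    """Select a posture from the interaction of fields."""
    engagement_drop = prev_engagement - engagement
    if engagement_drop > 0.3 and vulnerability > 0.6:
        return "proactive-outreach"  # withdrawal while still vulnerable
    if vulnerability > 0.6:
        return "supportive"
    return "standard"
```

Neither a drop in engagement alone nor elevated vulnerability alone triggers the outreach posture; only their conjunction does, which is what "cross-field coupling" buys over per-field rules.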
The decay dynamics are particularly important for mental health applications. Acute distress should decay relatively quickly in the absence of reinforcing events, while underlying vulnerability decays slowly. Trust accumulates gradually and is damaged rapidly. These asymmetries, encoded as field update rules, produce an agent whose emotional model of the patient evolves realistically rather than resetting to whatever the last session's summary suggested.
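The trust asymmetry in particular can be encoded as an update rule in which gains are damped and losses are amplified. The rates here are illustrative assumptions, not tuned clinical parameters.

```python
# Hypothetical asymmetric update for a trust field in [0, 1]:
# positive events build trust slowly (with diminishing returns),
# negative events damage it quickly. Rates are illustrative.

def update_trust(trust: float, event: float,
                 gain_rate: float = 0.05, loss_rate: float = 0.4) -> float:
    """event in [-1, 1]; positive builds slowly, negative damages fast."""
    if event >= 0:
        trust += gain_rate * event * (1.0 - trust)  # slow accumulation
    else:
        trust += loss_rate * event * trust          # rapid, proportional loss
    return max(0.0, min(1.0, trust))
```

Under these rates, a single strongly negative event undoes many sessions of accumulated trust, which matches the asymmetry the field rules are meant to encode.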
The structural requirement
Elomia solved the access problem. The remaining gap is not about availability or conversational quality. It is about the depth of therapeutic continuity that the architecture can support. Persistent affective state with governed decay, asymmetric update, and cross-field coupling gives the agent a model of the user's emotional trajectory that evolves continuously, not just during active sessions. This is the structural primitive that separates empathetic conversation from therapeutic relationship.