Scientific study: detecting and representing emotional states through a neuro‑responsive system
Have you ever felt an intense emotion and struggled to share it with those close to you? Today, most emotional exchanges rely on verbal language or non‑verbal cues (tone of voice, facial expression, posture). Yet there is often a gulf between what we truly feel and what we manage to express. This gap can lead to misunderstandings, frustration and confusion, both in our personal lives and at work. How might technology bridge this divide? The scientific study Neo‑Noumena addressed this question by introducing a novel system combining EEG sensors, machine learning and mixed reality (MR). Quotations in this article are drawn from the study by Nathan Semertzidis, Michaela Scary, Josh Andres et al., Neo‑Noumena: Augmenting Emotion Communication (see the sources at the end).
The context of the study: the stakes of emotional communication
Expressing emotions beyond words
As social beings, emotional communication lies at the heart of our human relationships. Within couples, families or professional settings, the quality of our interactions depends largely on our ability to identify, share and understand emotions.
Yet decoding someone’s emotional state runs into several limitations. We each filter our own words, our vocabulary sometimes fails to capture the exact feeling, and non‑verbal signals do not always faithfully convey the intensity or precise nature of an emotion.
Faced with this reality, we remain partially misunderstood, blind to the subtleties of what another person experiences. What if technology could clear away these blind spots and help us understand each other better?
Existing digital solutions to enhance emotional communication
For years, technology has aimed to close the gap between our inner world and its expression to foster genuine connections. Digital solutions to date include:
- emojis, which extend beyond words by offering a visual palette of our feelings;
- biofeedback devices, such as wristbands that monitor heart rate to indicate physiological arousal (stress, relaxation), yet do not specify the emotion itself;
- emotion‑tracking apps and digital journals, which guide users towards greater self‑awareness of their mental state.
However, these approaches remain partial: they don’t combine real‑time capture of brain activity with a representation that others can readily understand. In that regard, the Neo‑Noumena study offers a breakthrough innovation: continuously translating an individual’s brain activity so that their interlocutor can immediately perceive their emotional state.
This research is part of an emerging trend seeking to integrate biosensory data into everyday communication, not to surveil or diagnose, but to enrich the relationship with oneself and others.

Photo credit: Exertion Games Lab, Neo-Noumena – Emotion BCI.
Method of the study: a multifaceted protocol to measure the emotional aura in real time
The Neo‑Noumena device
Neo‑Noumena is a “neuro‑responsive” system designed to deepen emotional communication between two people. It combines three core technologies:
1. Portable electroencephalography (EEG): two headsets with a total of eight electrodes measure the brain’s electrical activity, capturing subtle variations linked to emotion.
2. Real‑time classification: a machine‑learning algorithm pre‑trained on the DEAP dataset (Dataset for Emotion Analysis using Physiological signals) processes the filtered EEG signal to extract spectral features. A support vector machine (SVM) model then classifies each emotional state into one of four categories:
- high arousal/positive valence;
- high arousal/negative valence;
- low arousal/positive valence;
- low arousal/negative valence.
3. Mixed reality and emotional fractals: each participant wears a HoloLens headset that superimposes a cloud of coloured fractals around their head. The fractal colours and shapes, accompanied by sound, correspond to one of the four emotional categories. This visual and auditory rendering opens a window onto oneself and one’s partner’s emotional state, beyond words and facial expressions.
Why choose mixed reality rather than virtual reality? MR overlays virtual elements onto the real environment, preserving the social and spatial context of the exchange. Users retain the spontaneity of face‑to‑face conversation while benefiting from an augmented visualisation of the other person’s emotional state.
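To make the classification step more concrete, here is a minimal sketch of a band‑power‑plus‑SVM pipeline of the kind described in step 2. This is not the authors' actual implementation: the sampling rate, frequency bands, window length and kernel are assumptions, and random numbers stand in for labelled DEAP recordings.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC

RNG = np.random.default_rng(0)
FS = 250  # assumed sampling rate in Hz (not specified in the article)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
QUADRANTS = ["HA/pos", "HA/neg", "LA/pos", "LA/neg"]  # the four arousal/valence classes

def band_powers(window, fs=FS):
    """Average spectral power per band and per electrode (8 channels x 4 bands -> 32 features)."""
    freqs, psd = welch(window, fs=fs, nperseg=fs)  # psd shape: (channels, n_freqs)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.concatenate(feats)

# Synthetic stand-in for labelled EEG windows: 200 windows, 8 channels, 2 s each
X = np.stack([band_powers(RNG.standard_normal((8, 2 * FS))) for _ in range(200)])
y = RNG.integers(0, 4, size=200)

clf = SVC(kernel="rbf").fit(X, y)        # trained once, offline
pred = QUADRANTS[clf.predict(X[:1])[0]]  # then each incoming window is classified live
print(pred)
```

In the real system, the predicted quadrant would then drive the colour and geometry of the fractals rendered in the HoloLens.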

Photo credit: Exertion Games Lab, Neo-Noumena – Emotion BCI.
The protocol of the study
In this study, researchers recruited ten participants forming five pairs. The average age was 34, with no inclusion criteria beyond an existing emotional bond. Each dyad kept the system for three consecutive days at home, with a single requirement: at least one hour of daily use.
The full protocol included:
1. Psychometric evaluation: before the first session, participants completed the Profile of Emotional Competence (PEC), a validated questionnaire measuring ten sub‑scales (identification, expression, comprehension, regulation and use of emotions, each in both intrapersonal and interpersonal dimensions). After the three days, the PEC was readministered to measure any change.
2. Online journal: participants recorded their session context (activity, time of day), emotional state (before, during, after) and impressions of the match between the fractals and their real emotions. They also noted observations on their partner’s interpretation and any significant events (surprise, discomfort, wonder).
3. Semi‑structured interviews: at the end of the experiment, each participant took part in an individual interview covering user experience, comfort, unexpected insights, impact on the dyad’s dynamic (feelings, empathy, potential conflicts), and device limitations (weight, latency, precision). Interviews were recorded, transcribed and anonymised for thematic analysis.
4. Quantitative and qualitative analysis: PEC scores before/after were compared using paired t‑tests (n=10). On the qualitative side, thematic analysis of interviews and journals identified three dominant themes: awareness of emotional change, a neutral mirror of emotion and a continuous stream of unfiltered information.
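The pre/post comparison in step 4 can be reproduced in a few lines. The scores below are invented purely for illustration; the study's actual result for interpersonal emotion regulation was t(9) = 3.24, p = 0.01, not the values this toy data produces.

```python
from scipy import stats

# Hypothetical pre/post scores on one PEC sub-scale for the ten participants
pre  = [3.2, 2.8, 3.5, 3.0, 2.9, 3.6, 3.1, 2.7, 3.3, 3.0]
post = [3.6, 3.1, 3.9, 3.4, 3.0, 3.8, 3.5, 3.2, 3.4, 3.5]

# Paired t-test: each participant is their own control, so df = n - 1 = 9
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t(9) = {t_stat:.2f}, p = {p_value:.4f}")
```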
Study findings: when the intimate becomes visible
Quantitative analysis
Of the ten PEC sub‑scales, only interpersonal emotion regulation showed a significant improvement between pre and post‑use (t(9) = 3.24, p = 0.01). This suggests Neo‑Noumena enhanced participants’ ability to help regulate others’ emotions: detecting signs of tension and adapting their behaviour accordingly.
Other emotional competence dimensions didn’t change significantly in this small sample, though qualitative feedback enriched the understanding of the device’s effects.
Qualitative analysis
1. Spatiotemporal actualization: awareness of emotional flux
The system offers an innovative approach: it materialises an internal activity that was previously invisible and imperceptible to those around us. Rather than struggling to describe our inner experiences, the system shows them:
“It feels like someone’s actively interpreting things that you don’t see to show you a depiction of it” (P8).
Gradually, a constantly shifting painting takes shape, illustrating the ephemeral and variable nature of emotions:
“Still surprised about how frequently the system is suggesting these emotional states change. It is reassuring to know that these states can change and flow so quickly” (P3).
This relationship between time and emotion was deepened by repeated use. Day after day, participants noticed fluctuations influenced by their environment, and even altered their activities deliberately to observe the impact on their emotional state.
2. Objective representation: a “neutral” mirror of emotion
Beyond merely observing emotions over time, the device allowed participants to dive inward. This process emerged through comparing their felt experience with Neo‑Noumena’s display:
“It was more like a mirror” (P4).
The system thus became a means to test hypotheses, confirm intuitions and build confidence in one’s own perceptions. Participants turned into investigators of their own feelings:
“I think I’m feeling pretty good, let’s put on the Neo-Noumena’s for a little and see if I’m actually feeling good or If I’m just tricking myself that I’m doing good” (P7).
Judgment fell away from the emotion itself, allowing it simply to be observed without attachment:
“Even when it was a negative thing being generated it was still really beautiful. It was like you could appreciate like the negative moods just as much as you could appreciate the positive moods” (P4).
And because users also observed their partner’s emotional state, they became more empathetic and aware of what was happening for the other person:
“It was like a constant visual reminder to consider someone’s mood […] and just appreciate that other people have emotions as well” (P4).
3. Preternatural transmission: a stream of unfiltered information
The system provides information continuously and automatically, becoming a stable, reliable anchor:
“it was like you had an ‘aura’ you could always refer to, no matter what you were doing […] it was always there” (P4).
Beyond facilitating communication, Neo‑Noumena supplies extra data compared to speech or non‑verbal cues alone. When asked, “Wouldn’t it be redundant, since you could just ask someone how they feel?”, one participant responded:
“Nah, because you have to rely on what that person is saying, and they could be just making shit up […] And also you have to rely on your own interpretation of how you’re feeling as well, which might be biased […] so it’s cool to just see it automatically” (P3).
However, interpretative discrepancies did arise:
“oh no, don’t think I’m angry at you please, I like hearing you singing, I just have a headache” (P7).
Such variability underscores the need for a shared framework or an initial calibration phase to align meanings.
Limitations and future directions: towards emotion interpretation 3.0
While emotion transparency holds promise for therapy and personal growth, it raises ethical challenges. Who may access your emotional stream? How do you protect this “emotional privacy”? Some participants felt uneasy at large‑scale exposure or misinterpretation of their emotions.
Users also requested a more diverse emotion palette to capture nuanced experiences. In response, researchers propose modulating fractal size with EEG amplitude or colour with frequency density, while maintaining geometry tied to emotion classification.
Technically, authors acknowledge limitations:
- detection latency (30‑second windows) can affect perceived responsiveness;
- the headset’s ergonomics discourage prolonged use.
Future improvements include advanced deep‑learning models, personalised training, device miniaturisation, and adaptation to clinical (therapy, autism) or social (co‑living, emotional education) contexts.

Neuromind device: an innovative neurofeedback solution for emotional measurement
While Neo‑Noumena focuses on sharing your emotional “aura” with those around you, Neuromind concentrates instead on learning emotional regulation to support and assist patients, particularly in cases of chronic conditions such as depression.
Using a headset equipped with EEG electrodes, brain activity is measured to capture the cognitive patterns at work. The Neuromind system thus combines:
- electroencephalography (EEG);
- electrocardiography (ECG);
- eye‑tracking;
- immersion in virtual reality.
Our algorithms detect and classify the collected data according to two proprietary biomarkers of attention and emotion. Beyond simply analysing emotions, Neuromind works in tandem with the VR headset to assess the effects of the virtual environment on the user. It then becomes possible to define a target state, such as relaxation, and have the immersions evolve until the user achieves it.
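The closed‑loop idea (define a target state, then adapt the immersion until the user reaches it) can be sketched as follows. Everything here is simulated: the biomarker, the threshold and the adaptation step are illustrative placeholders, not Neuromind's actual algorithm.

```python
import random

TARGET_RELAXATION = 0.8  # illustrative target on a 0-1 relaxation biomarker

def read_relaxation(calm):
    """Stand-in for the EEG/ECG relaxation biomarker: here, simply a noisy
    function of how calming the virtual environment currently is (simulated)."""
    return min(1.0, max(0.0, 0.5 + 0.4 * calm + random.uniform(-0.05, 0.05)))

def neurofeedback_loop(steps=50):
    calm = 0.0  # how calming the VR scene is (0 = neutral, 1 = maximally calming)
    for _ in range(steps):
        level = read_relaxation(calm)
        if level >= TARGET_RELAXATION:
            return calm, level           # target state reached: hold the environment
        calm = min(1.0, calm + 0.1)      # otherwise, make the immersion more calming
    return calm, level

final_calm, final_level = neurofeedback_loop()
print(final_calm, final_level)
```

The real system would replace the simulated readout with the live biomarkers and adjust richer scene parameters than a single "calm" value, but the control loop has the same shape.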
Neo‑Noumena paves the way for a new form of emotional communication, where the invisible becomes visible and subjective affect is shared in real time. This approach could transform how we perceive, express and regulate our emotions, both for ourselves and for others. At the same time, virtual reality-based neurofeedback holds the potential to revolutionise emotional regulation. If you would like to experience the benefits of our systems for yourself, we would be delighted to arrange a demonstration.
Sources:
- Semertzidis, N., Scary, M., Andres, J., Dwivedi, B., Kulwe, Y., Zambetta, F., Mueller, F. Neo-Noumena: Augmenting Emotion Communication. CHI 2020. Long paper. ACM.
- Semertzidis, N., Scary, M., Andres, J., Kulwe, Y., Dwivedi, B., Zambetta, F., Mueller, F. Neo-Noumena. CHI 2020 Interactivity. ACM.