Affective Computing
2 posts
What Is Affective Computing? How Are Emotions Shaping the Systems We Build?
An introduction to affective computing — what it is, where it came from, and why understanding emotions is becoming central to the way we build interactive systems.
How to Build an Emotion Recognition System?
A walkthrough of what it takes to build an emotion recognition system — from choosing which affective states to recognise, to selecting sensors, designing data collection, and labelling the ground truth.
Physiological Computing
3 posts
What Is Physiological Computing? How Are Body Signals Used in HCI?
An introduction to physiological computing — how researchers are using signals from the body to understand what users feel, how they respond to stimuli, and what that means for the systems we build.
The Sensors Behind Physiological Computing
A breakdown of the main physiological sensors used in HCI research — what EDA, EMG, PPG, and EEG measure, how they work, and how to choose the right one for your study.
Processing Physiological Signals — From Raw Data to Meaningful Features
A practical look at how raw physiological signals are cleaned, processed, and transformed into features ready for analysis or machine learning.
Human-Robot Interaction
2 posts
The Uncanny Valley — Why Do Some Robots Make Us Uncomfortable?
The more human-like a robot looks, the more comfortable we feel with it — until it gets too close to human and something feels off. A look at the uncanny valley effect and why it matters for robot design.
How Much Control Should a Robot Have? Understanding Levels of Autonomy
Robots range from fully human-controlled to fully autonomous, and where a robot sits on that scale changes everything about how we interact with it. This post looks at the different levels of robot autonomy and why finding the right balance matters.