Real Time UX

Real Time UX ("Real Time User Experience") is a phone case that is capable of sensing when the person using the phone is stressed or frustrated, as a way of automatically generating usability feedback. It is a project that I developed as part of my Master's Thesis at MIT.

This project is ongoing and this page will be updated throughout Spring 2021.

  • Human-Computer Interaction
  • Machine Learning
  • Affective Computing
  • Internet of Things
Design Challenge

UI/UX designers and mobile app developers have limited ways of receiving feedback on the usability of their products. They can interview or observe users, methods that are often impractical to run at scale or on an ongoing basis. Or they can infer user experience from metrics like click-through rates and heat maps, which require skilled interpretation. I wanted to see if I could build a device that used biometric sensors built into the physical affordances of a phone case to passively generate a model of user affect, and in particular frustration. I then operationalized this model to create usability feedback for designers (slow loop feedback) or to change the parameters of the mobile app in real time (fast loop feedback); a minimal sketch of these two loops follows the Project Scope note below.
Project Scope

This project comprises the majority of my Master's thesis for the Integrated Design & Management program at MIT. I began research in September 2020 and will complete it in May 2021. I worked on the project alone, with regular advice and support from my friends and my thesis advisor, Tony Hu. The project encompassed a literature review; two rounds of user research; six rounds of hardware prototyping; embedded, web client, and machine learning programming; and extensive user feedback. Because the COVID-19 pandemic made running an in-person human subjects trial with hardware infeasible, a formal validation of the research hypothesis is out of scope. The results presented are subjective and qualitative.
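
To make the two feedback loops concrete, here is a minimal Python sketch of how a single frustration estimate could be routed; the function names, threshold, and event format are hypothetical illustrations, not the project's actual implementation.

```python
# Hypothetical sketch of the slow loop (log for designers) and fast loop
# (adjust the live app) described in the Design Challenge above.
import time
from typing import Callable

FRUSTRATION_THRESHOLD = 0.7  # illustrative cutoff, not a calibrated value

def handle_frustration(score: float,
                       screen: str,
                       log_event: Callable[[dict], None],
                       adjust_app: Callable[[str], None]) -> None:
    """Route one frustration estimate to both feedback loops."""
    # Slow loop: record every estimate so designers can review trends later.
    log_event({"t": time.time(), "screen": screen, "frustration": score})
    # Fast loop: if the user seems frustrated right now, nudge the live UI.
    if score >= FRUSTRATION_THRESHOLD:
        adjust_app(screen)

# Example wiring with stand-in callbacks
events = []
handle_frustration(0.85, "checkout",
                   log_event=events.append,
                   adjust_app=lambda s: print(f"simplify layout on {s}"))
print(events)
```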
Psychophysiology is the study of how changes in mental state drive changes in the body.

I attempt to leverage this knowledge in order to improve the usability of mobile apps in real time and make development cycles shorter and more effective.

Much research has been done on the changes that take place in the hands when a subject's emotional state changes. Hands have the highest density of eccrine sweat glands, whose activity (measured as electrodermal activity) is one of the most reliable windows into sympathetic nervous system (SNS) activity. SNS activity is associated with our "fight or flight" response and is a reliable indicator of stress. Hands are also a convenient place to measure heart rate and heart rate variability, signals mediated by both sympathetic and parasympathetic nervous system (PNS) activity. We also emote with our hands directly, squeezing, gesturing, and shaking them to consciously or subconsciously express ourselves. Together, these signals can be used to build a model of the user's frustration, stress, and engagement with a mobile app experience.
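
As an illustration of how these raw hand signals can become model inputs, here is a minimal Python sketch, assuming illustrative sampling rates, window lengths, and synthetic data rather than the project's actual pipeline: a tonic skin conductance level for SNS arousal, and RMSSD heart-rate variability computed from inter-beat intervals.

```python
# Minimal sketch of turning raw hand signals into the kinds of features
# described above. Sampling rate, cutoff, and window size are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

def tonic_scl(eda_uS: np.ndarray, fs: float = 4.0) -> float:
    """Approximate tonic skin conductance level (microsiemens) by low-pass
    filtering the raw EDA signal and averaging over the window."""
    b, a = butter(2, 0.05 / (fs / 2), btype="low")  # ~0.05 Hz cutoff isolates the tonic component
    return float(np.mean(filtfilt(b, a, eda_uS)))

def rmssd(ibi_ms: np.ndarray) -> float:
    """Heart-rate variability as the root mean square of successive
    differences between inter-beat intervals (milliseconds)."""
    return float(np.sqrt(np.mean(np.diff(ibi_ms) ** 2)))

# Example: one 60-second window of synthetic data
eda = 2.0 + 0.1 * np.random.randn(240)   # 4 Hz EDA samples in microsiemens
ibi = 800 + 25 * np.random.randn(75)     # inter-beat intervals in ms
print({"tonic_scl": tonic_scl(eda), "rmssd": rmssd(ibi)})
```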

  • Real Time Data Charts
  • Github Code Repo
  • Target User: Phone app beta testers
  • Customers: Mobile app UX designers, product managers, developers, and entrepreneurs
  • Type of Work: Literature Review, User Research, Product Design, Hardware Engineering, Firmware, Web Development, Machine Learning
  • Tools Used: Fusion 360, Eagle, C++, Python, Javascript, PHP, Keras, TensorFlow
Design Process

The project grew directly out of my previous work on Allay. I wanted to create an affective agent within the pill box that could sense the user's basic emotional state and use behavioral science nudges to help drive healthy behavior change. In working through what that would mean, I realized that the core idea within that project - sensing and operationalizing user emotional state - was where my interest lay.

This project was a hybrid between engineering research and product design, and therefore followed a slightly unconventional trajectory. I had to balance, and sometimes alternate between, figuring out what was technically possible and what would be desirable for end users. I therefore started with a series of open-ended interviews to understand how usability feedback is gathered today, and then set out to make sure I understood what kind of product experience I would actually be able to deliver.

Not having any background in psychophysiology, I had to perform an extensive literature review of prior work in the field. My review concentrated on two areas that I needed to understand. The first was what other people had accomplished in trying to use physiology to measure user experience and usability. While this is not a mature field, I was able to find a variety of attempts to use electrodermal activity and heart rate as measures of frustration. None of them had been built into a mobile, field-operational device, so this research both validated that the technology was likely to function and confirmed that I would be adding something to the field.

The second area of research was into specific methods of measuring user physiology, as well as best practices for determining the metrics that would form the inputs into my machine learning model. Electrodermal activity and cardiac activity have both been studied deeply for over a century, and I was able to find current and readable reviews of the best methods as well as open source tools to assist in feature extraction and analysis, such as EDA Explorer.

With my domain knowledge and literature review in hand, I was ready to start building things. I first built a breadboard prototype to validate the basic performance of the subcircuits I had spec'd before going through the time and expense of creating my first printed circuit board. The initial prototype ran on an Arduino Micro that I had lying around and reported values through a USB cable.
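
For context, collecting those values on the host side could look something like the following minimal sketch; the serial port name, baud rate, and comma-separated line format are assumptions for illustration, not the prototype's actual protocol.

```python
# Hypothetical host-side reader for the breadboard prototype's USB serial output.
import serial  # pyserial

with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
    while True:
        line = port.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        # e.g. "eda_raw,ppg_raw" as two integers per sample (assumed format)
        try:
            eda_raw, ppg_raw = (int(v) for v in line.split(","))
        except ValueError:
            continue  # skip malformed lines
        print(eda_raw, ppg_raw)
```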

Following this success, I wanted to move quickly to a PCB. Because the device had to fit the form factor of a phone case, size constraints were a priority, and I needed to make sure that everything would fit and that I wouldn't run into any issues with data collection. I built the second and third prototypes on the Particle Argon development platform to save myself as much time as possible in reporting data to the cloud. I also started creating 3D printed enclosures to test the ergonomics and physical user interfaces of the case. The early iterations were mostly frames for the electrodes to mount to.

Unfortunately, the Particle infrastructure was too limiting for the amount of data I needed to send and the frequency with which I needed to send it. I therefore switched to the HUZZAH32, an ESP32-based development board from Adafruit that shares the same footprint as the Particle Argon. The fourth through sixth (and final) revisions focused on signal fidelity, miniaturization, mechanical constraints, and aesthetics. On the mechanical side, I optimized the enclosure for ergonomics, ease of assembly, and finally aesthetics.

Around the third iteration of prototyping, I had gained confidence in the type of product I would be able to build and therefore decided to go back and finish my user research. I interviewed approximately 20 product managers and UI/UX designers to learn the tools they currently used to determine app usability, the flaws they saw in these tools, and how they would think about purchasing a new tool to solve these problems. At the end of each interview I showed them the device that I was working on and got their feedback on how feasible it would be to integrate with their workflow.

While I got a lot of excitement and encouragement from users, I also perceived a relatively low willingness to pay for a new solution. Most product teams operate on shoestring budgets and are seen as cost centers by their organizations. I therefore concluded that this product, while useful, would likely see limited success in the market.

This stage of the project is ongoing.

I am in the process of building a machine learning model to translate the biometric data into a model of user affect and frustration. As a minimum viable model, I will use a rule-based system that translates tonic skin conductance and heart rate variability into a measure of stress. This approach is essentially guaranteed to work based on prior work in the field, but it will have a slow reaction time, won't be able to isolate specific frustrating events, and will need to be hand-calibrated to each user.
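
A minimal sketch of what such a rule-based mapping could look like is below; the per-user baselines, equal weighting, and the 0-to-1 scaling are illustrative assumptions, not calibrated values from the project.

```python
# Hypothetical rule-based "minimum viable model": stress rises when tonic skin
# conductance climbs above baseline and heart-rate variability drops below it.
from dataclasses import dataclass

@dataclass
class Calibration:
    baseline_scl: float    # user's resting tonic skin conductance (microsiemens)
    baseline_rmssd: float  # user's resting HRV (ms)

def stress_score(tonic_scl: float, rmssd: float, cal: Calibration) -> float:
    """Return a 0..1 stress estimate from the two hand-calibrated baselines."""
    scl_component = max(0.0, (tonic_scl - cal.baseline_scl) / cal.baseline_scl)
    hrv_component = max(0.0, (cal.baseline_rmssd - rmssd) / cal.baseline_rmssd)
    return min(1.0, 0.5 * scl_component + 0.5 * hrv_component)

# Example with a hypothetical calibration
cal = Calibration(baseline_scl=2.0, baseline_rmssd=45.0)
print(stress_score(tonic_scl=2.6, rmssd=30.0, cal=cal))  # elevated arousal -> ~0.32
```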

Conventionally in the field, a Hidden Markov Model (HMM) has been used to model user affect, with latent emotional states hypothesized to generate the observable data. This option has the benefit of a wide body of historical work and is relatively straightforward to train. However, HMMs carry very little state: because of the Markov assumption, they lack the deep historical context and the large number of trainable parameters of more modern models like LSTMs (Long Short-Term Memory networks). An LSTM model is the most likely to be robust across time and across different users, and the easiest to personalize to a given user's physiology.
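
To illustrate the LSTM option, here is a minimal Keras sketch; the window length, feature count, layer sizes, and synthetic data are assumptions for illustration rather than the model I will actually train.

```python
# Hypothetical LSTM classifier over fixed-length windows of biometric features
# (e.g. tonic SCL, phasic EDA peaks, heart rate, HRV) -> probability of frustration.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

WINDOW_STEPS = 30   # e.g. 30 five-second feature frames per window (assumed)
N_FEATURES = 4      # illustrative feature count

model = keras.Sequential([
    keras.Input(shape=(WINDOW_STEPS, N_FEATURES)),
    layers.LSTM(32),                        # keeps context across the whole window
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # probability of frustration
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Synthetic stand-in data; real training would use labeled windows of sensor features
X = np.random.randn(64, WINDOW_STEPS, N_FEATURES).astype("float32")
y = np.random.randint(0, 2, size=(64, 1))
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
```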

HMMs and LSTMs share the same core problem, and LSTMs suffer from it more acutely: the difficulty of collecting labeled frustration data. If I had the time and resources to run a human subjects trial, I would be able to get at least some basic training data. As it is, I will try to train the model on my own use alone. Even so, it will be difficult to collect enough data, and the data will be polluted by my own expectations of what is frustrating and what isn't. Regardless of the overall success of these methods, it will be a good learning experience and will form a platform for any future work.

This stage of the project will start in May 2021.

I plan to spend the final weeks of the project testing my device and machine learning model with other users to see how well it can detect frustration, and to understand their subjective experience of the device. I'm curious whether people feel observed or whether the passive activity of the device fades into the background. While the ergonomics and form of the device have been somewhat validated already, I'm sure that watching people handle the final version will still teach me a great deal.

Design Features
"Operationalizing Psychophysiological Correlates of Mobile App User Experience"

Conclusion excerpt:

"Human-centered design is, at its core, the process of making the world a better place by improving the experience of all stakeholders. Improving app usability may seem to be a marginal increase in value, providing some gain in efficiency, effectiveness, or entertainment, or in reducing the amount of stress and frustration experienced by app users. As mentioned earlier in the paper, the value of this goal should not be underestimated in all contexts. However, sometimes usability has a greater ethical imperative. RTUX can help flag accessibility problems, or designs that are culturally insensitive. Such issues virtually never arise out of malice or callousness on the part of the app designer, but rather a lack of empathy. RTUX is one small step to try to create pathways for empathy to flow through our technology more easily."