Alexia Yang
Designer | Los Angeles, CA

3D Face Scan Interaction


About the company

Metamason is a health technology startup developing a mobile platform that allows clinicians and respiratory therapists to take 3D scan data of patients’ faces and create a truly personalized 3D printed Continuous Positive Airway Pressure (CPAP) masks.


Interaction design, user interface design


May - October, 2017

context & challenge

Metamason’s ambition was to create custom, on-demand CPAP masks that address the fundamental fit problems of standard mass-produced masks. Launched in 2016, Metamason developed an innovative scanning application that captures 3D face scan data and generates masks on-site. The application was functional, but it lacked thoughtful user experience and critical interactions.

In May 2017, I joined the team as a Product Designer, working alongside a small team consisting of a Backend Developer, a Frontend Developer, an Industrial Designer, QA, and the CPO. Being completely new to the CPAP industry, I took the initiative and spent my first month meeting with the Marketing Strategist, who had an extensive history with medical devices focused on sleep apnea, to learn about the market, the product, and the workflows of sleep lab clinicians and Durable Medical Equipment (DME) practitioners.

My goal was to design an intuitive scanning interface and user experience that would ensure respiratory therapists and technicians of any skill level could take an accurate scan of their patients’ faces efficiently and reliably.


design process


— discovery —


1. Understand the Requirements



One aspect of Metamason’s business strategy was to provide participating DMEs and Sleep Labs with a specialized 3D-scan-capable tablet, pre-installed with the Metamason App. Using this tablet, respiratory therapists and DME technicians would perform 3D scans of patients’ faces.

Generating usable scan data depended on two key requirements:

1. The sensor had to stay pointed at the subject within a certain distance range.
Tracking succeeded only when the subject was in view and at a relatively consistent distance. Pointing the sensor too far away from the subject or varying the capture distance too much led to reconstruction errors.

2. Movement around the subject had to be slow and steady.
Erratic or fast camera movement led to blurry, inaccurate data.
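The two requirements above can be sketched as a simple per-frame validity check. This is purely illustrative: the threshold values and function names are assumptions, not Metamason’s actual tracking parameters.

```python
# Hypothetical sketch of the two scan-validity checks described above.
# All thresholds are illustrative assumptions, not real product values.

MIN_DISTANCE_M = 0.35   # sensor must stay at least this far from the face
MAX_DISTANCE_M = 0.55   # ...and no farther than this
MAX_SPEED_M_S = 0.15    # faster motion blurs the captured depth data

def frame_is_usable(distance_m: float, speed_m_s: float, face_in_view: bool) -> bool:
    """Return True when a captured frame can contribute to reconstruction."""
    if not face_in_view:
        return False
    if not (MIN_DISTANCE_M <= distance_m <= MAX_DISTANCE_M):
        return False
    return speed_m_s <= MAX_SPEED_M_S
```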


usability test

To better understand the scan function, I conducted a test with my colleagues to study their gestures and movements while performing a 3D scan.


Each colleague was tasked with scanning the face of another using the scanning app.


With this early iteration of the app, only two buttons were available on screen during the scanning process: one to start the scan and another to stop it. Without any on-screen guidance, my colleagues had a difficult time keeping the subject’s face centered and the scan distance consistent, despite their familiarity with the scanning process. Furthermore, they tended to capture data haphazardly and were then forced to revisit previously scanned areas to search for gaps in the scan data.



2. Comparative Usability Testing


In addition to the usability test of Metamason’s app, I tested two other apps with similar scanning functionality with my colleagues, in hopes of gaining deeper insight into how people typically interact with object capture on these devices.


Fyuse - 3D Photos


Each colleague was told to follow the instructions the app provided and to take a 3D photo of a selected object.


The first app I tested, Fyuse, relied heavily on its graphical interface to ensure that users achieved full coverage of their subjects, displaying a visual guideline that users had to trace with a crosshair. I found, however, that people became fixated on keeping the camera trained on the guideline.

“I was trying too hard to trace the line... I wasn’t even looking at the mug (the subject). The process took too long, so my hands got tired.”
“I am trying to follow this line... It’s hard!”

Google Photo Sphere


My colleagues were instructed to follow the onscreen prompts and to take an image of anything around them.


By contrast, Google Photo Sphere had a very minimalist UI: a small viewfinder with a centered circle, surrounded by a frame with four dots indicating the directions the user was supposed to move the camera. It was unclear to people that they needed to move the camera to find the next dot and repeat the same interaction until full coverage was achieved.

“I think I’m done... I don’t see any more dots.” (Unsure; the resulting stitched image ended up with many gaps in coverage.)
“Where am I supposed to go?” (Pausing after the first shot, then randomly moving the camera around.)

— user flow —


The Scan Process

In addition to the regular scan procedure, I collaborated with QA and the Backend Developers to create a list of potential technical errors and include them in my detailed scan interaction flow. Since Metamason was at such an early stage and did not yet have a customer service team in place, I wanted to give users simple instructions so they could understand what they could or couldn’t do, and how to fix problems if they arose, such as a camera connection error or other hardware-related issues.


— design —


Scanning Guide & Interaction


From the user feedback on Fyuse and Google Photo Sphere, I learned that, interestingly, heavy visual guidance was a distraction, yet without any guidance it was easy to get lost in the process.

It became apparent that the ideal solution lay somewhere between the two extremes. Working closely with the developers and taking the limitations of the hardware and software into consideration, we homed in on a solution by repeating a three-step cycle of design, review, and test. The following are the solutions we arrived at.




A frame to stay focused

Keeping the camera pointed at the subject while maintaining a certain distance at all times was critical, but difficult to do while simultaneously moving the camera. My solution was a dark frame with an oval-shaped cut-out that helped users stay focused on filling the frame with the subject’s entire face. This ensured the camera was pointing in the right direction and within an acceptable distance. The frame appeared from the first step of the scanning process, when facial recognition was performed, and remained on screen throughout the entire scan procedure.

An early prototype of the frame, which was a simple overlay on a regular video camera app.





A path to follow

To keep the app smooth and responsive, the team decided to turn off real-time 3D mesh generation, which meant users could not see scan results in real time. To ensure full coverage, especially of the critical nasal area, I drew inspiration from the design of IKEA’s floor plans, in particular the one-way route that steers shoppers through every department of the store. With that principle in mind, I designed a scanning path that captured every angle of the nose and the surrounding area in one continuous motion.




3D Graphical Guidance

The next step was to figure out how to guide the user along the scan path. I designed a simple instructive on-screen element to clearly communicate the motions the clinician was supposed to make to ensure full coverage. Along with on-screen text instructions, a 3D representation of the camera moved around in 3D space to demonstrate exactly how to move the camera around the subject.


After testing multiple locations and sizes for the instructive element, I decided to place the graphical guidance just to the left of the frame, where it did not pull the user’s focus away from the subject yet remained somewhere the user could quickly glance should they need guidance.

Graphical guidance explorations





Real-time feedback when needed

Since camera position and speed were tracked relative to the patient’s face in real time, I designed the frame to highlight in blue and display a short message when the user was moving in the wrong direction, moving too fast, letting the subject’s face drift off center, or holding the camera outside the acceptable distance bounds. The blue highlight was just enough to get the user’s attention without a strong sense of warning, and rather than telling the user what they were doing wrong, the messages suggested how to adjust their behavior to generate better scan data.
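The suggestion-over-warning principle described above can be sketched as a simple mapping from a tracked condition to a corrective message. The condition names and message wording here are illustrative assumptions, not the app’s actual copy.

```python
# Illustrative sketch: each tracked condition maps to a corrective
# suggestion rather than an error message. Names and wording are
# hypothetical, not Metamason's actual implementation.

FEEDBACK = {
    "too_fast": "Slow down a little",
    "too_close": "Move the camera back slightly",
    "too_far": "Bring the camera closer",
    "off_center": "Keep the face inside the frame",
    "wrong_direction": "Follow the path around the nose",
}

def feedback_message(condition):
    """Return the suggestion to display, or None when the scan is on track."""
    if condition is None:
        return None
    return FEEDBACK.get(condition)
```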




Scan Examination Tutorial

After the scan was complete, a built-in scan visualization allowed users to verify scan quality before uploading it for printing. I designed a tutorial that introduced the multi-touch gestures used to control and view the scan results, and demonstrated, with visual examples, how to examine them.


— Result —


— Reflection —


Ultimately, the Metamason application was shelved due to complications related to FDA approval. However, I learned many valuable lessons throughout the process. Prior to Metamason, many of my projects were concepts, or macro designs that attempted to solve problems without fully evaluating product requirements.

Metamason gave me the opportunity to be part of the entire product development process, collaborating closely with our Engineers, QA, and Industrial Designer from beginning to end. Additionally, the Metamason team was small, so we were able to operate in a truly agile environment. As challenges arose in the design process, I was able to brainstorm with every member of the team, integrating their feedback and ideas into my work as we progressively worked toward an ideal solution for a process on the bleeding edge of medical technology.