Intangible Interactions Week 2
Project 1, Part 2
Credits

This project was created by Hal Rodriguez and Sean Zhu, based on a previous project by Hal, for the class Intangible Interactions. We collaborated virtually on the idea and code; Hal built the physical device, and Sean wrote this blog post.

The Idea

In this project, we combine an intangible sensor with a previous project of Hal's, the Color Mixing project.

Prior art. In his Project 1 Part 1 write-up, Hal describes the Color Mixing project, an interactive exhibit that lets young users explore how primary light colors mix together. The project's main drawback was that users did not understand that they could interact with the system by bringing their hands close to the capacitive sensors without touching them.

Modifications. In our Project 1, we swap out the capacitive sensor for a longer-range distance sensor and add cues that help users understand the proper mode of interaction so that they can mix colors effectively. The sensor we're using is the Adafruit VL6180X Time of Flight Micro-LIDAR Distance Sensor Breakout, which uses a light beam to sense objects perpendicular to the sensor, between 5mm and 200mm away.
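
For those curious, the sensor has an Arduino library (Adafruit_VL6180X), and reading it takes only a few lines. Here's a minimal sketch, not our exact project code, that just prints the range over serial:

// Minimal VL6180X read loop using the Adafruit_VL6180X library.
#include <Wire.h>
#include "Adafruit_VL6180X.h"

Adafruit_VL6180X vl = Adafruit_VL6180X();

void setup() {
  Serial.begin(115200);
  if (!vl.begin()) {                      // initialize I2C and find the sensor
    Serial.println("Could not find VL6180X");
    while (1) delay(10);
  }
}

void loop() {
  uint8_t range = vl.readRange();         // distance reading in millimeters
  uint8_t status = vl.readRangeStatus();  // status/error code for that reading
  if (status == VL6180X_ERROR_NONE) {
    Serial.print("Range (mm): ");
    Serial.println(range);
  }
  delay(50);
}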

The Build

The build was fairly straightforward. We had a STEMMA (?) connector that let us plug the sensor board directly into the main board, with no soldering or breadboard needed, and the main board included its own LEDs. Still, a few things needed to be adjusted.

Using a single sensor. The original Color Mixing project had 3 capacitive sensors, one for each primary color, and was designed for a different person to operate each sensor simultaneously. Not only did we have just one sensor to work with, but the pandemic has also made multi-person interactions impractical. So for demonstration purposes, we decided to build this project with only one variable primary color and keep the other colors at fixed intensities, as if their sensors were being held in place by someone else.

Color choice. We originally decided to leave the blue value variable, set the red value to 100%, and set the green value to 0%, so that the overall color would transition from red to violet as the user's hand moved closer to or further from the sensor. In practice, this color change was hard to perceive in person, and even harder to perceive over our Zoom video call, since the camera kept adjusting its color balance in response to the light's color. In the future, we could change the color combination so that the color transitions from green to yellow instead.
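
To make the mapping concrete, here's a rough sketch of the color logic: red held at 100%, green at 0%, and blue scaled from the hand's distance. The RGB LED pins and the 200mm ceiling are placeholders for illustration rather than our exact wiring; the function would be called from loop() with the latest valid range reading.

// Rough sketch of the color mapping; call with a valid readRange() value.
const int RED_PIN   = 9;        // hypothetical PWM pins for an RGB LED
const int GREEN_PIN = 10;
const int BLUE_PIN  = 11;
const int MAX_RANGE_MM = 200;   // roughly the sensor's far limit

void updateColor(uint8_t rangeMm) {
  // Closer hand -> more blue, so the overall color sweeps from red toward violet.
  int blue = map(constrain(rangeMm, 0, MAX_RANGE_MM), MAX_RANGE_MM, 0, 0, 255);
  analogWrite(RED_PIN, 255);    // red fixed at full intensity
  analogWrite(GREEN_PIN, 0);    // green fixed at zero
  analogWrite(BLUE_PIN, blue);  // blue varies with distance
}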

Error handling. Another issue we ran into involved error codes. While the sensor usually emitted proper range readings, it sometimes emitted error codes instead. Sometimes it was ERROR_RANGEIGNORE (sensor reading outside of the allowed range), which was to be expected with users' hands moving every which way, but it also emitted other errors, including ERROR_ECEFAIL and ERROR_SNR (too much noise). We eventually concluded that these were fairly low-level diagnostics from the sensor, and that it was safe to ignore the errors and simply skip those readings.
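
In the Adafruit_VL6180X library, these codes come back from readRangeStatus() as named constants, so the filtering boils down to a switch statement. Here's a hedged sketch of the idea (our actual handling may differ in detail), reusing the hypothetical updateColor() helper from the sketch above:

// Only act on clean readings; quietly skip the noisy/low-level error codes.
// Assumes the Adafruit_VL6180X setup from the first sketch.
void handleReading(Adafruit_VL6180X &vl) {
  uint8_t range  = vl.readRange();
  uint8_t status = vl.readRangeStatus();

  switch (status) {
    case VL6180X_ERROR_NONE:
      updateColor(range);              // a good reading: update the output color
      break;
    case VL6180X_ERROR_RANGEIGNORE:    // reading outside the allowed range
    case VL6180X_ERROR_ECEFAIL:        // early convergence estimate check failed
    case VL6180X_ERROR_SNR:            // too much ambient noise
    default:
      break;                           // safe to ignore; just skip this frame
  }
}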

Adding cues and feedback. The purpose of using the LiDAR sensor instead of the capacitive sensor was to let the system start sensing the user's hand even when it was fairly far away, but that should not be a substitute for clear messaging and affordances that show how to interact with the system before the user even begins. We decided to add a ring of LEDs that clearly indicates where the hand should be placed, oriented the output "color meter" along the same axis as the motion of the user's hand, and added sound feedback to make the system's response more explicit. Some other ideas we considered, which could be helpful in future iterations, are a printed sign in front of or below the system with hand-placement guidelines, and a laser that illuminates the proper axis of motion for the user's hand.
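
As a rough illustration of the sound feedback, one simple approach is to map the distance to a pitch, so the tone rises as the hand approaches. This sketch assumes a piezo buzzer on a spare pin and a board that supports Arduino's tone(); it isn't necessarily how our final version produces sound:

// Audible feedback sketch: pitch rises as the hand moves closer to the sensor.
const int PIEZO_PIN = 5;          // hypothetical piezo buzzer pin
const int MAX_RANGE_MM = 200;

void playFeedback(uint8_t rangeMm) {
  // Map far (200 mm) to 220 Hz and close (0 mm) to 880 Hz.
  int pitch = map(constrain(rangeMm, 0, MAX_RANGE_MM), MAX_RANGE_MM, 0, 220, 880);
  tone(PIEZO_PIN, pitch, 50);     // short 50 ms beep, retriggered each loop pass
}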