| Interface Lab

I/O Project: Smart Koozie

Cover Image for I/O Project: Smart Koozie


Come up with a simple application using digital and/or analog input and output to a microcontroller. Make a device that allows a person to control light, sound, or movement using the components you’ve learned about (e.g. LEDs, speakers, servomotors).


In college, my roommate Alex and I came up with the idea for a smart glass that waitstaff could use to track customers’ drink levels. Way back in our first week of Interface Lab we pitched some fantasy projects, and I picked the smart glass. After some really productive group brainstorming, we refined the idea into more of a koozie situation. I decided to run with this idea for our I/O project.

Fantasy device brainstorming Miro board

My plan was to use a force-sensing resistor (FSR) to detect the weight of the drink (minus the glass itself). This data would be processed and fed into a browser for display in real time. Through this process I learned a lot about analog sensors, data smoothing, and serial communication, as well as 3D modeling and printing.

Modeling and Printing

Over the past couple of weeks I’ve been slightly obsessed with the 3D printers in the lab. I figured this would be a great opportunity to test my skills and learn more about 3D printing and modeling in the process. My idea had a few pieces that needed to fit together, so when choosing 3D modeling software, I optimized for precision and ended up with Autodesk’s Fusion 360.

Final models captured in Fusion 360

The FSR sits in the middle plate, which drops into the hollow sleeve. There’s an opening at the bottom of the sleeve for the sensor’s headers to poke out. The piece on the right concentrates the weight of the glass and drink onto the FSR in the middle.

Printed elements from model

I was really happy with the end result. Although this was relatively simple to print, the modeling got pretty intricate. Spending a lot of time in the modeling software paid off when it came time for assembly.



The software on the Arduino side is relatively simple; however, I learned quite a bit about smoothing analog sensor data on the Arduino. Offloading this data processing from the browser to the Arduino seemed prudent for scaling the dashboard beyond a single sensor reading.


Initially, I attempted a threshold-based strategy, only updating the output when the reading crossed a certain threshold. However, given the weight of the glass and the sensor's size and quality, the reading was particularly jittery. I ended up with a much cleaner, smoother result by simply taking the average of the last 100 readings and printing that over time. After that, I mapped the sensor value (minus the force due to the glass itself) to a range from zero to one hundred.
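The averaging-and-mapping approach can be sketched in plain C++. The window size, tare reading, and full-glass reading below are illustrative values I made up for the sketch, not the project's actual calibration:

```cpp
#include <array>

// Illustrative calibration constants -- not the project's real numbers.
constexpr int WINDOW = 100;    // number of samples to average
constexpr int TARE = 120;      // raw reading with just the empty glass
constexpr int FULL_RAW = 620;  // raw reading with a full glass

struct Smoother {
    std::array<int, WINDOW> buf{};
    int idx = 0;
    long sum = 0;
    int count = 0;

    // Push one raw analog reading; returns the running average so far.
    int update(int raw) {
        sum -= buf[idx];  // drop the oldest sample from the sum
        buf[idx] = raw;
        sum += raw;
        idx = (idx + 1) % WINDOW;
        if (count < WINDOW) count++;
        return static_cast<int>(sum / count);
    }
};

// Map a smoothed reading (minus the glass's own weight) onto 0-100.
int toPercent(int smoothed) {
    long pct = (long)(smoothed - TARE) * 100 / (FULL_RAW - TARE);
    if (pct < 0) pct = 0;
    if (pct > 100) pct = 100;
    return (int)pct;
}
```

Keeping a running sum means each new sample is O(1) to incorporate, rather than re-summing the whole buffer on every loop pass.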


I used the serialport library along with standard WebSockets to read the value from the Arduino and feed it into a little dashboard built with React. Once I had a clean 0-100 value from the Arduino, I built a Glass component that took the percentage as a prop and rendered a glass SVG. The liquid in the glass uses CSS transforms to scale according to the liquid percentage.

If a value comes along that falls far below the 0-100 range, that means the glass itself has been removed from the koozie. This info was also passed along to the Glass component so the SVG could be dimmed.
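That removal check amounts to comparing the smoothed raw reading against the empty-glass tare. A minimal sketch, using a hypothetical tare value and noise margin of my own choosing:

```cpp
// Illustrative values -- not the project's real calibration.
constexpr int TARE_RAW = 120;  // raw FSR reading with just the empty glass
constexpr int MARGIN = 40;     // noise margin below the tare

// The glass is off the koozie when the reading drops well below
// what even an empty glass would produce.
bool glassRemoved(int smoothedRaw) {
    return smoothedRaw < TARE_RAW - MARGIN;
}
```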


I’m really happy with how this project came together. I was able to break down the requirements into discrete problems that felt manageable. There are a lot of new concepts here for me: Arduino, 3D modeling, and serial communication, to name a few. It was really nice being able to incorporate existing strengths in web tech to complement and build on these new skills. This project broadened my horizons quite a bit and certainly pushed the boundaries of what I’m capable of creating. I feel like I’ve gained so many new tools to draw on for creating interactive experiences.


Interface Lab 1

Cover Image for Interface Lab 1


The Digital Input and Output lab was relatively simple, although I ran into a few issues familiarizing myself with the breadboard and an odd Arduino software issue. The goal was to build a circuit where pressing a button would send a signal to the Arduino, which would toggle power between two LEDs.

The final Arduino circuit we were intended to build

The first major issue I ran into was a misunderstanding of my breadboard. I’ve only built with boards half this size, but I knew about the separation that prevents a signal from crossing left to right. However, I wasn’t aware that some double boards are also split vertically. This caused half of my circuit to work as expected while the rest failed silently. I eventually tracked down the problem by measuring voltage at the Arduino and progressively down through the circuit.

Infographic detailing the issue experienced above

Once I figured this out, completing the circuit on the top half of the board was relatively straightforward.

As a challenge, I wanted to experiment with the 8-ohm speaker and create a circuit where pressing the button disabled the LED and caused the speaker to play. I couldn’t get the Arduino tone function to work no matter what I tried. To debug, I learned about Serial.begin and Serial.println for logging values. I found that the tone function was being called on every pass of the loop. According to the Arduino docs, this should just change the frequency, no big deal. However, I found that adding a 2ms delay to the loop (1ms wasn’t sufficient) resolved the issue. As a result, I decided to use a global variable to track the button state and only call the tone function when the state changed.

Screenshot of Arduino code

I made a post on Reddit asking what might be causing this. One commenter suggested, "tone starts an oscillation. If you call it too frequently the oscillation will never occur - like restarting a song before the first note is played." On the other hand, it looks like this is only an issue with the Arduino Nano; another commenter was able to run the code successfully on their Uno.

From that thread, I also learned that C++ functions can have static variables, which seems like a super clean alternative to the global variable approach.
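As a sketch of that pattern, a function-local static can hold the previous button state so the action fires only on the press transition. The function and variable names here are my own, not from the original sketch:

```cpp
// Edge detection with a function-local static instead of a global.
// Returns true only on the 0 -> 1 transition, so something like tone()
// would be called once per press rather than on every loop pass.
bool buttonPressed(int reading) {
    static int prev = 0;  // persists across calls, but scoped to this function
    bool pressed = (reading == 1 && prev == 0);
    prev = reading;
    return pressed;
}
```

The state lives with the logic that uses it, so nothing else in the sketch can accidentally clobber it the way a global could.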


Sensor observation

Cover Image for Sensor observation


Take a walk around your neighborhood, or a different one, or imagine a walk you’ve done routinely. Take a count of every interaction with a sensor you see or experience.


This assignment made me miss being at home with my wife, Raven. Our walks are filled with observant takes on the world around us. She's incredibly good at picking out the most precious, clever things you'd pass every day without seeing.

Sensors identified

Sensing sensors and Sensing sensors sensing

Many of the sensors give some indication of human interactivity:

  • Elevator buttons light up when pressed
  • Subway and badge entry sensors respond positively or negatively depending on access
  • Apple Watch screen brightens and AirPods make a small chime when the accelerometer is activated
  • The emergency stop (presumably) sounds an alarm when pulled

Other, more passive, sensors don’t explicitly make themselves known:

  • The door sensor on my dorm, silently recording my exit and return
  • The cameras constantly recording my morning journey
  • Earl’s Airtag silently interacting with devices around him

Interaction stages

While most of the sensors and devices are simple interfaces, a few involved more complex interactions. Perhaps the most involved dance of sensor interactivity I experienced was with my washing machine:

  1. Human begins interaction through scanning QR code or using RFID sensor
  2. Machine connects user to device, ensures sufficient funds, then enables buttons for interactivity
  3. Human uses buttons to select their wash preferences
  4. Machine detects wash is complete based on load weight and time
  5. Machine alerts Human with a push notification
  6. Human unloads wash, Machine detects wash unload and resets for next Human
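As a sketch, the dance above could be modeled as a small state machine. The state and event names are my own shorthand for the six steps:

```cpp
// A toy state machine for the washer interaction described above.
enum class State { Idle, Connected, Washing, Done };
enum class Event { ScanCode, SelectWash, WashComplete, Unload };

// Advance the machine on an event; out-of-order events are ignored.
State step(State s, Event e) {
    switch (s) {
        case State::Idle:      return e == Event::ScanCode     ? State::Connected : s;
        case State::Connected: return e == Event::SelectWash   ? State::Washing   : s;
        case State::Washing:   return e == Event::WashComplete ? State::Done      : s;
        case State::Done:      return e == Event::Unload       ? State::Idle      : s;
    }
    return s;
}
```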