Overview
A generative system using p5.js, LLMs, and parsed policy + heritage archive data to visualize the tension between lived and legislated definitions of culture through Tanzanian Kitenge prints.

Timeline

Ongoing (Prototype completed)

My Role

Creative Technologist & Critical Systems Designer

Tools

p5.js, HTML/CSS, JavaScript, OpenCV, Python, MongoDB, Firebase, Large Language Models (LLMs), Node.js

Teammates

AI/ML Engineers ②
Data Visualization Specialist ①
Cultural Anthropologists ②
Policy Analysts ②

WHAT IS A KITENGE?
MID-FIDELITY PROTOTYPE
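Below is a minimal p5.js sketch of the core mapping, not the actual prototype: a single hypothetical tension score (which the real system would derive from the LLM-parsed policy and heritage archive text) pushes a rigid, "legislated" grid of motifs toward looser, "lived" variation.

```javascript
// Minimal p5.js sketch: a hypothetical "tension" score (0 = fully legislated,
// 1 = fully lived) drives the regularity of a Kitenge-like motif grid.
// In the real system this score would come from LLM-parsed policy and archive text;
// here it is a hard-coded placeholder.
let tension = 0.65; // placeholder value, assumed to be produced upstream

function setup() {
  createCanvas(600, 600);
  noLoop();
}

function draw() {
  background(245, 235, 210);
  const cols = 10;
  const rows = 10;
  const cell = width / cols;

  for (let i = 0; i < cols; i++) {
    for (let j = 0; j < rows; j++) {
      const cx = i * cell + cell / 2;
      const cy = j * cell + cell / 2;
      // Higher tension -> more jitter and irregular sizing (lived, improvised forms);
      // lower tension -> a rigid, uniform grid (legislated, codified forms).
      const jitter = tension * cell * 0.3;
      const x = cx + random(-jitter, jitter);
      const y = cy + random(-jitter, jitter);
      const d = cell * (0.5 + tension * random(0, 0.4));

      noFill();
      stroke(30, 60, 120);
      strokeWeight(2);
      ellipse(x, y, d, d);
      ellipse(x, y, d * 0.5, d * 0.5);
    }
  }
}
```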

Fitness Furniture

UX Research

Overview
Investigating the opportunity areas for fitness furniture as a gym alternative in Berkeley, California.

Timeline

Ongoing (Prototype completed)

My Role

UX Researcher

Tools

FigJam, Airtable, NVivo, Dovetail, Statista, Mintel, Google Scholar, Notion

Teammates (left to right)

Hongxi Pan
Sharon Zhao
Darlene Chen

THE DREAM TEAM!

Overview
A scooter add-on that disperses seeds while you move—bringing greenery to urban spaces and beyond.

Timeline

Ongoing (Prototype completed)

My Role

Product Designer & Fabrication Engineer

Tools

Autodesk Fusion 360, Arduino IDE, Figma, Ultimaker 3D Printer, Adafruit Feather M0 WiFi, P

Teammates

Elizabeth Sun
Sasha Suggs

Context
In this project, I explored how personal transportation can actively give back to the environment. Merging my fascination with sustainable tech and creative problem-solving, I co-designed a seed-dispersing device for scooters that plants seeds as you ride, transforming routine journeys into green adventures!
DEMO
FINAL DESIGN
STEP 1
Clip your planter onto any ferromagnetic surface on your scooter.
STEP 2
Pour your seeds into the nozzle whose release holes best match the size of your seeds.
STEP 3
Twist the top of the nozzle until you reach the optimal seed flow.
STEP 4
Follow the map on your EcoPlanter app and enjoy planting!
REFLECTIONS AND NEXT STEPS
Working on Ride & Regrow taught me the importance of a methodical, iterative design process. I learned that each stage—from initial sketches and cardboard prototyping to 3D printing and electronics integration—provides essential data that refines the final product. I faced specific challenges, such as calibrating the fan-assisted seed dispersal system for consistent output and integrating reliable real-time data tracking with the Adafruit Feather M0. Balancing functionality with an appealing design required careful attention to both user feedback and technical constraints.
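For the data-tracking side, the sketch below shows one minimal way the app end could receive readings from the Feather M0 over WiFi. It is an illustrative Node.js stub, not the project's actual code; the `/reading` route and the JSON field names are assumptions.

```javascript
// Minimal Node.js sketch (no external dependencies) for the laptop/app side of
// real-time tracking. It assumes, hypothetically, that the Feather M0 WiFi POSTs
// JSON readings like {"lat": ..., "lng": ..., "seedsDispensed": ...} to /reading.
// Each reading is appended to an in-memory log that a heatmap view could read.
const http = require('http');

const readings = []; // in-memory log of {lat, lng, seedsDispensed, receivedAt}

const server = http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/reading') {
    let body = '';
    req.on('data', (chunk) => (body += chunk));
    req.on('end', () => {
      try {
        const reading = JSON.parse(body);
        readings.push({ ...reading, receivedAt: Date.now() });
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ ok: true, total: readings.length }));
      } catch (err) {
        res.writeHead(400);
        res.end('invalid JSON');
      }
    });
  } else if (req.method === 'GET' && req.url === '/readings') {
    // A map or heatmap front end could poll this endpoint.
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify(readings));
  } else {
    res.writeHead(404);
    res.end();
  }
});

server.listen(8080, () => console.log('EcoPlanter tracker listening on :8080'));
```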


Next Steps:


Data Tracking Enhancement:
Integrate GPS heatmaps and additional IoT sensors to improve the measurement and visualization of seed dispersion.

App Refinement:
Upgrade the Figma prototype by incorporating machine learning algorithms for more accurate route suggestions and better gamification features.

Expand Collaborations:
Initiate discussions with local government and sustainability organizations to explore practical applications and broaden the project’s impact.

I'm so happy to have worked with this team in developing Ride & Regrow. Thank you to Elizabeth Sun for your detailed design insights and to Sasha Suggs for ensuring precise data tracking and app integration.
WHAT DID I BUILD?
I built a broom that turns the way you move it into real-time poetry and music. The broom analyzes how fast, hard, and rhythmically you move it, then uses sentiment analysis to create art that reflects the mood of your movement.

Timeline

Ongoing (Prototype completed)

My Role

Product Designer & Fabrication Engineer

Tools

Autodesk Fusion 360, Arduino IDE, Figma, Ultimaker 3D Printer, Adafruit Feather M0 WiFi, P

Teammates

Elizabeth Sun
Sasha Suggs

WHY DID I BUILD IT?
There are two main reasons I undertook this project:

1. Inspired by Don Norman’s The Design of Everyday Things and its critique that modern interfaces engage less with our bodies, I wanted to create more embodied interfaces that involve physical movement.

2. Reflecting on language accessibility, I realized that people who cannot communicate verbally often have to learn alternative languages (like ASL or fingerspelling). These methods can be limiting, as they require the audience to understand the same language.


These reflections led me to my primary question/exploration:

What if language could be whatever the communicator desires, with technology translating it into familiar and accessible outputs?
Breaking this down
To explore this question, I built a broom device that, when used (and ideally danced with), generates music and text corresponding to the sentiments expressed by its movement.
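As a rough illustration of that mapping (not the broom's actual pipeline), the p5.js + p5.sound sketch below fakes the motion stream with mouse speed, drives an oscillator's pitch and volume with it, and picks words from two mood-tagged lists. The word pools and the calm/lively threshold are placeholders standing in for real accelerometer data and sentiment analysis.

```javascript
// Minimal p5.js + p5.sound sketch of the movement-to-music-and-words idea.
// Illustrative only: mouse speed stands in for the broom's accelerometer stream,
// and two hard-coded word pools stand in for real sentiment analysis.
let osc;
let currentWord = '';
const calmWords = ['drift', 'hush', 'slow light'];
const livelyWords = ['spark', 'leap', 'bright noise'];

function setup() {
  createCanvas(600, 300);
  textAlign(CENTER, CENTER);
  textSize(32);
  osc = new p5.Oscillator(440, 'sine'); // requires the p5.sound addon
  osc.amp(0);
  osc.start();
}

function mousePressed() {
  userStartAudio(); // browsers only allow audio after a user gesture
}

function draw() {
  background(20);

  // Stand-in for motion intensity: mouse travel this frame, normalized to 0..1.
  const intensity = constrain(dist(mouseX, mouseY, pmouseX, pmouseY) / 50, 0, 1);

  // Faster, harder movement -> higher pitch and louder tone.
  osc.freq(lerp(220, 880, intensity));
  osc.amp(intensity * 0.3, 0.1);

  // Crude "sentiment" split: calm vs. lively movement picks from different word pools.
  if (frameCount % 60 === 0) {
    const pool = intensity > 0.4 ? livelyWords : calmWords;
    currentWord = random(pool);
  }

  fill(255);
  text(currentWord, width / 2, height / 2);
}
```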

So, why a broom?

My Process

1. Form - Affordances
To build this broom, I reimagined several parts of a standard design to house the electronics necessary for data collection. I identified the top of the broom handle as the ideal location, since it was the spot least likely to be interfered with during use, unlike the bottom.
FINAL RESULTS, REFLECTIONS AND NEXT STEPS
The outcome is a broom that can be used as a musical instrument, but one that can also be programmed to translate your movements into written language. More immediately, the broom now produces music and poetry when it is danced with.

This project was not only fun but also an enlightening opportunity to learn about the limitations in communication faced by disabled individuals through participant interviews. I learned that inclusive technology does not isolate its users; rather, it works to provide them with access and presence in the shared spaces around us.

While the project is a successful prototype, I recognize the need for more efficient sensors to handle the real-time data transfers and processing taking place. I also envision this project evolving into a wearable product featuring a more streamlined motion cluster training system and a comprehensive language-to-cluster mapping, which would allow for even more detailed communication.