Interactive Yoga Mat (Y.M.)

Overview

The project, “Interactive Somatic Yoga Mat,” explores how multi-sensory experiences can enhance accessibility and mindfulness during yoga sessions. Designed to support users of all abilities, the mat integrates visual, auditory, haptic, and kinesthetic feedback to foster somatic engagement.

  • The yoga mat incorporates features such as Bluetooth connectivity, vibration motors, LED lights, and a session timer to guide users through poses.

  • The goal of the design is to support inclusive movement by providing sensory cues that help users track hand and foot placement, session progress, and pose transitions—making yoga more intuitive and accessible for all.

Role

User Experience Designer

Focused on interaction design, somatic experience integration, prototyping, and testing.

Teammates

Reshmi Akanda, Tahaan Da Silva, Adria Porter, & Zehra Noor

Duration

4 Months

Design Process

Our UX process for the Interactive Somatic Yoga Mat moved through seven stages:

  • User and market research: identified accessibility gaps and user needs related to somatic experiences in yoga.

  • Define and map: defined user needs and created journey maps outlining key touchpoints across visual, auditory, haptic, and motion-based feedback.

  • Decision point: assessed which somatic modality (visual, auditory, or tactile) to prioritize to ensure inclusive design.

  • Prototype: built sensory feedback mechanisms, including vibration motors and LED lights for real-time pose guidance.

  • Test and evaluate: ran sessions with users to assess the prototype's usability, clarity, and comfort.

  • Refine: improved the functionality and intuitiveness of the mat based on the insights gathered.

  • Final iteration: integrated all components into a cohesive, user-centered design solution.

Understanding the Problem

The project was initiated to improve accessibility and somatic engagement in wellness routines, specifically yoga. Many yoga products overlook users with sensory impairments or those who benefit from multi-sensory guidance, making it difficult for everyone to fully participate in home or group sessions.

  • This project aims to support users who require more than visual or verbal cues to follow yoga poses, including those who are hearing-impaired or need tactile reinforcement.

  • Our solution integrates simple, intuitive signals—vibrations, LED lights, and Bluetooth audio—to guide users through their practice without requiring constant instructor presence or external reminders.

Target Audience

Our primary audience includes adults aged 18 to 45 who practice yoga or fitness at home, in studios, or with community groups. We focused on making the experience accessible to both beginners and experienced users who seek adaptive or inclusive features.

  • Many individuals do yoga for physical wellness, stress relief, or recovery, and often engage with mats that lack smart or sensory feedback.

  • For users who are deaf, hard of hearing, or neurodivergent, traditional cues can be limiting. This project centers their needs by incorporating multimodal sensory input.

  • Since yoga is commonly practiced in shared spaces or via remote guidance, it was essential to create a mat that can function independently while syncing to personal devices.

Goal

To develop an inclusive yoga mat that enhances somatic awareness through technology—enabling users to track time, pose transitions, and progress using a combination of haptic, auditory, and visual feedback.

IoT Connection

The interactive yoga mat communicates with IoT components such as an Arduino board, LED lights, a Bluetooth module, and a speaker. We are building a single integrated electronic system that connects the physical mat to its digital counterpart. Depending on body position, we plan to have the mat vibrate when the user moves their hands or feet. This sensory feedback improves the user experience and links devices such as the mat and a Bluetooth-paired phone.

Wireframes

I first presented the idea visually through sketches, exploring how the application would guide users through the process. I analyzed several existing applications to understand typical user flows and considered what actions a user would take when first launching the app. One key challenge was designing the flow for how group members repay the individual who paid through the app, and ensuring that flow was clear and intuitive. Sharing my sketches with others helped clarify this process.

After conducting a cognitive walkthrough with users, I gathered questions and feedback to identify improvements and gauge satisfaction. I tested the flow with friends who frequently dine out in groups, which provided valuable insights. One major improvement was the suggestion to add a reminder feature to help users track who owes whom after the meal, a significant finding that influenced the redesign.

Through this process, I learned that both successes and failures play an important role in refining the design and improving the overall user experience.

Final Prototype

I used Figma to create the medium-fidelity prototype, transforming my original sketches into a digital version. During usability testing, users asked questions about the flow and suggested improvements. One important change was redesigning the confirmation page to make it feel more complete and clearly show the split results.

Another key update was changing how tips and taxes are displayed, switching from dollar amounts to percentages based on user feedback. I also simplified the process by combining the payer, item list, and total amounts on one screen, making it easier to review before confirming.

This feedback helped me refine the app into a cleaner, more intuitive experience, as shown in the final prototype. I also learned the importance of preparing well for user testing and how valuable small adjustments can be in improving the design.

Slide Deck