Glovely

Smart glove-enabled app for real-time sign language translation and learning.

Overview

The sign language app Glovely, integrated with a Bluetooth glove, bridges the communication gap by converting sign language into spoken words, making interactions more accessible for both signers and non-signers.

The aim is to create a mobile application that connects to the glove and updates in real time as it is used, assisting with both live communication and learning, and ensuring inclusivity by allowing users to learn kinesthetically or purely visually.

Role

UX/UI Designer, UX Researcher, Project Lead

Duration

22 weeks (January 2024 - May 2024)

Tools Used

Framer, Adobe Photoshop, Adobe Illustrator, UserTesting.com, Trello

Define

Problem Statement

The absence of a versatile sign language communication tool hinders effective communication and learning between the deaf/non-verbal and hearing communities. This glove aims to close that gap in an innovative and inclusive way.

How might we design a digital tool that allows users to communicate and learn through feedback in real time?

Empathize

Primary Persona

Directed Storytelling

Our team conducted four directed storytelling interviews, pairing narrative prompts with semi-structured follow-up questions. Participants were asked to share their experiences in narrative form, from learning a new language to moments of communication challenge or success, pausing to reflect on how they felt along the way.

Research Objectives:

  • Understand how individuals have experienced communication barriers in various contexts
  • Identify common strategies people use to overcome or adapt to those barriers
  • Explore emotional responses tied to those experiences (e.g., frustration, confusion, confidence)

This chart organizes the key themes, their corresponding codes, and direct quotes from participants to highlight recurring experiences and user needs.

Empathy Mapping

Our team additionally conducted secondary research, utilizing several informative video resources to inform our understanding of day-to-day interactions.

Ideate

User Stories

Building on these user needs, our team then crafted user stories that captured a range of personas' goals and challenges, from beginner ASL students to healthcare workers. This helped prioritize features like visual/audio-focused interactions, goal-setting systems, and glove sensitivity controls for accessibility.

As someone new to ASL, I want to explore interactive and visual learning tools in order to better understand signs and their context.

As a hearing doctor, I want to practice signing specific medical topics in order to communicate clearly with my deaf patients.

As a beginner hearing BSL student, I want to receive rewards for practicing in order to stay motivated.

As a non-speaker beginner, I want to rely on visual and audio cues in order to practice ASL without needing verbal interaction.

As a user sharing a smart glove, I want to use my own account in order to track my personal progress.

As a person with arthritis flare-ups, I want to adjust glove sensitivity in order to ensure the glove doesn't pick up unintentional movements.

Mind Mapping

Our team then created a mind map to organize key areas of opportunity and better understand the landscape of sign language learning and communication. This helped uncover major focus areas, like learning needs, motivation, community, inclusivity, and social interaction, and visually connect how they influence one another. It also provided a strong foundation for prioritizing features and ensuring the app would support users holistically.

Low-Fidelity Wireframes

More Sections Coming Soon!

© 2024 Melisa Tasel
