A pair of AI-driven smart glasses that recognize objects and text and convey the detected information to the user through audio interaction.
DURATION

6 months

TEAM

Yagmur Erten

Giada Sun

Weixi Zhang

MY ROLE

Secondary Research

Research Design

User Interviews

Data Analysis and Coding

Prototype Testing

Design Evaluation

Sponsor

Introduction

According to the WHO, around 2.2 billion people in the world live with a vision impairment or blindness, in a world that is mostly designed by and for sighted people. They often face daily obstacles in perceiving their environment, such as reading printed instructions, sorting mail, and telling medicine bottles apart.

This is why we chose to focus on people who are blind or visually impaired and their needs.

What is Iris?

A pair of AI-driven smart glasses that recognize objects and text and convey the detected information to the user through audio interaction.

Objective

Our objective was to understand the online shopping experiences of people who are blind.
How might we provide tractable, effortless, and independent online shopping experiences to people who are blind? 

Research Methods

Expert Interviews

With the information I gathered from secondary research and articles, my team and I gained a general understanding of how people who are blind use the Internet. With that knowledge, I spoke with experts in the field to expand our understanding of blind people's online shopping behaviors. I conducted interviews with four university professors who have previously worked in the areas of blindness and accessible technology design. From those interviews, I learned that:

  • Blindness is a spectrum; narrow down your target user.

  • Do not assume how blind people use technology.

  • Do not exclude those who have low technology proficiency.

  • Examine current accessibility guidelines.

  • Start talking with blind users as soon as possible.

Competitive Assessment

I researched the space of assistive technologies that help people who are blind with everyday tasks. My goal was to identify existing services that already provide great experiences to blind users, so that we could leverage their strengths in our design solution and avoid the accessibility pitfalls they have fallen into.

I found that existing computer vision technologies, such as Seeing AI, are very helpful to people who are blind in recognizing objects or text. However, there are still usability issues: for example, users have to aim the camera directly at an object and take a picture for proper recognition, which has a steep learning curve.

a user who is blind using Seeing AI

Participants & Recruitment

For the primary research, I recruited our participants by reaching out to associates and faculty, as well as to nonprofit organizations that work with and support blind communities, such as the UW Disability Center, the Seattle Deafblind Service Center, and the American Council of the Blind.

I recruited a total of 10 participants, ages ranging from 19 to 60, who identify as "completely blind with no vision" and had shopped online in the past 30 days. I had no technical requirements other than being able to use the Internet, as I did not want to exclude participants based on their level of technical proficiency.

Research Methods

I designed two different research methods to gather data from our participants.

a) Phone Interviews with Directed Storytelling


I interviewed 10 participants about their use of the Internet and shopping websites. I asked about their experiences and stories with online shopping websites, whether for grocery shopping or retail items.

 

The goal was to collect first-hand anecdotal data on blind people's experiences, opinions, and attitudes around online shopping. I also wanted to identify any advancements and frustrations within current technologies, so I asked our participants to reflect on times when they were able to complete online shopping tasks successfully and times when they were not. This helped me analyze their expectations and objectives.

b) Remote Moderated Study


During this study, I re-recruited 5 of our previous participants, asked them to share their screens with us through an accessibility-friendly video conferencing tool, Zoom, and had them complete online shopping exercises as we observed their behavior. I watched our participants' steps as they browsed and selected both familiar and unfamiliar items.

 

The goal was to gain a deep understanding of blind people's online shopping behaviors and the tools they use, and to pinpoint any missing features that could ease their online shopping process.


screenshot from the Remote Moderated Study with participant 4

Sense-Making & Data Analysis

I gathered all of our data in database organization tools such as Airtable and Excel so that we could easily access everything we had collected. I then organized the data using affinity diagramming on an online whiteboard platform, Miro, color-coding the information as I went. I compiled our observations and insights into relevant categories and consequential relationships based on shared intent, purpose, or problems.

Miro board screenshot

Findings and Insights

Most of our findings are specific to our users' online shopping behavior, such as:

  • our participants prefer familiar products and platforms while shopping.

  • our participants value conversational assistance even though they believe that the technology is not there yet.

 

More importantly, however, we found insights that pointed to a broader need: getting relevant information more efficiently and independently, whether while shopping or while doing other daily tasks such as reading instructions, sorting mail, and identifying products.

What do our users value?

  • Efficiency

During the remote moderated studies, we observed that our participants preferred navigating websites with very fast-speaking screen readers.
 

"I gather product information from websites by jumping the headers as quickly as possible to do what I want to get done.” -Participant 5

  • Relevance

Efficiency itself doesn't mean much unless it is used to extract relevant and specific information.

"I am only interested in the product and the price. I do not care about other things that are listed.” -Participant 3

  • Independence

Our users prefer doing things by themselves, if they can, before asking other people for help, even though most of our participants stated that they would welcome technology as support.

"I live by myself and I do not really like to call someone when I need help. I try my best to do anything by myself with whatever is available.” -Participant 8

Ideation & Down Selection

Building on our participants' values, we brainstormed 60 ideas that could support blind people with daily tasks. We down-selected through dot voting, evaluating ideas based on their viability and feasibility.

Once we understood users' existing behaviors and needs, we shifted our focus from online shopping to providing independent experiences that help people who are blind with their text and object recognition needs.

Brainstorming exercise

This is the end of the research phase for this project. As the project moves into the design phase, this page will be updated.

Stay tuned!