The Blind Date identifies the strong disconnect between popular dating apps and accessibility, especially for blind users. It proposes an evidence-based, universal design solution to ensure blind people have equal access to the large online dating pool.
Design - Concept sketches, wireframing, hi-fi voice prototype, usability testing.
Research - Participant outreach, semi-structured interviews, affinity mapping, competitive analysis, expert evaluation.
15 weeks
Aug 2021
Dec 2021
Adobe XD
Figma
Notion
Qualtrics
Boya Ren
Greg Parker
Srijan Jhanwar
Jessie Chiu
Using VoiceOver or a screen reader with these applications leads to confusing interactions for blind users, because the apps are not designed or developed with accessibility in mind. The challenge for this project was to come up with a universal design concept that works for both blind and sighted users.
We conducted secondary research and reached out to blind users of dating apps by 'sliding into their DMs'. After obtaining the necessary permissions and approvals, we collected data using several research methods to inform our universal design recommendations for dating apps.
In recent years, most popular dating apps have rolled out account photo verification. Because the required pose or gesture is never described in text, blind users cannot complete this step. This concept provides an improved photo verification experience and adds an audio verification option for users who need it.
Most dating apps are hyper-visual, relying on photos to convey a person's personality, interests, and lifestyle, which largely excludes the blind community. This concept introduces voice recordings, letting users attach audio stories to profile pictures or record voice responses to fun prompts.
To reduce the dependency on, and ambiguity of, recommendations from sighted individuals, we incorporated a recommender engine that builds on the face recognition technology already present in photo albums. The engine surfaces a user's ten strongest photos, which they can then share with family and friends to narrow down the final choices; a rough sketch of the idea follows.
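As an illustration only, the sketch below shows one way such a ranking could start on iOS, assuming on-device face detection with Apple's Vision framework. The scoring rule (the size of the largest detected face) and the function names are placeholders invented for this write-up, far simpler than what a real recommender would need.

```swift
import UIKit
import Vision

// Hypothetical scoring helper: rate a photo by the size of the largest face
// Vision detects in it. A real recommender would combine many more signals.
func faceScore(for image: UIImage) -> Double {
    guard let cgImage = image.cgImage else { return 0 }
    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
    // boundingBox is normalised (0–1), so width × height is the fraction
    // of the frame that the face occupies.
    return (request.results ?? [])
        .map { Double($0.boundingBox.width * $0.boundingBox.height) }
        .max() ?? 0
}

// Rank the user's photos and keep the ten strongest candidates to share
// with family and friends.
func topTenPhotos(from photos: [UIImage]) -> [UIImage] {
    photos
        .map { (photo: $0, score: faceScore(for: $0)) }
        .sorted { $0.score > $1.score }
        .prefix(10)
        .map { $0.photo }
}
```

Because Vision runs on-device, a scoring step like this would not require the photos to leave the user's phone before they decide which ones to share.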
This project was part of a semester-long research methods course, so the proposed design concepts are grounded in user research. As sighted individuals, we were cognizant of our limited perspective, and we therefore conducted research using several methods to inform our final design solutions. I took the lead on secondary research, user outreach, semi-structured interviews, visual design, and researching and developing the voiceover interactions for the proposed design concepts.
We divided the research work into three phases. First, secondary research, where we analyzed the top dating apps through the lens of accessibility and usability. Second, user research, where we conducted user interviews and cognitive walkthroughs and mapped our findings with affinity mapping to inform the design concepts. Third, evaluation, where we tested the sketched concepts, wireframes, and hi-fi voice prototypes with accessibility experts and blind users.
Apps like Tinder, Bumble, and OkCupid do not work well with screen readers because their buttons are not labeled correctly, and the hyper-visual nature of these apps tells a blind person little of what they would like to know about a potential match. We analyzed these apps to get a better understanding of their accessibility issues.
Current dating apps do not follow conventional VoiceOver or screen-reader labelling practices. The following image presents what each app reads aloud when an element is tapped. These inconsistent interactions are confusing for blind people, who often need physical assistance from sighted friends or family to be able to use these apps.
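To make the labelling problem concrete, below is a minimal UIKit sketch, not taken from any of these apps, of how a 'like' control and a profile card could be labelled so VoiceOver reads something meaningful instead of an unlabelled image button. The element names and strings are purely illustrative.

```swift
import UIKit

// Illustrative only: a 'like' control with a proper VoiceOver label,
// hint, and trait instead of an unlabelled image button.
let likeButton = UIButton(type: .custom)
likeButton.setImage(UIImage(named: "heart"), for: .normal)
likeButton.isAccessibilityElement = true
likeButton.accessibilityLabel = "Like"                             // what the control is
likeButton.accessibilityHint = "Double tap to like this profile."  // what it does
likeButton.accessibilityTraits = .button

// Grouping a profile card so VoiceOver reads one coherent description
// instead of a jumble of unlabelled sub-views.
let profileCard = UIView()
profileCard.isAccessibilityElement = true
profileCard.accessibilityLabel = "Alex, 27. Enjoys hiking and live music. Two photos."
```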
We conducted interviews with blind users of dating apps and accessibility experts to gain insight into the scope of the challenges blind users face on online dating platforms, and to gather considerations, resources, and guidelines to look into when designing for the blind community. Our script, with questions and rationales, helped us stay on track.
We synthesized all the data gathered from our surveys and interviews and organized it based on affinities to find patterns and common themes. We used our top findings to inform design decisions and to derive design implications.
We developed three user personas to condense our findings from the surveys and interviews. This helped us build empathy for the end users, gain a perspective closer to the blind community and their relationship with technology, and further define who our users are, their frustrations, and their expectations. We condensed our broad findings into what these three representative users say, think, feel, and do, which helped us identify the core issues that needed to be addressed within the timeline.
When kicking off the design phase, we identified and established three distinct design concepts that would be unified through interface and voice prototyping. These concepts address the core issues a blind person faces when using dating apps. We got together as a team to brainstorm ideas on whiteboards and came up with low-fidelity sketches for the screens of each design concept.
Based on the sketches, we developed wireframes with voice interactions and conducted think-aloud interviews with blind users to get their initial feedback. We coded their feedback for each screen as 'positive', 'negative', or 'could be improved'. These sessions helped ensure that our solutions work for a broad range of blind users with varying levels of tech-savviness.
Providing detailed pose descriptions, real-time framing feedback, and an alternate audio verification system for profile verification (sketched in code after this feedback section).
Really like the real time feedback of where I am located in the camera frame for composition.
The phrase required for audio verification is too long. I cannot remember the whole thing.
Personalizing profiles with audio stories for uploaded pictures and fun prompts.
Would love to both record my voice and listen to other people talk about themselves.
Too many steps to complete the process, would like it if this was easier to do.
Recommending the top 10 photos to users using machine learning and image recognition.
I like that I can still share these top 10 photos with friends without telling them it’s for a dating app.
I don’t trust the AI to select photos for me. I’d still trust people over machines to select what’s best.
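As a thought experiment for the real-time framing feedback praised above, the sketch below turns a detected face's position, assumed to come from a Vision-style face-detection request, into short, throttled VoiceOver announcements. The class name, thresholds, and wording are placeholders rather than part of the tested prototype.

```swift
import UIKit

// Hypothetical helper: announce where the detected face sits in the camera
// frame so a VoiceOver user can centre themselves before verification.
final class FramingAnnouncer {
    private var lastAnnouncement = Date.distantPast

    /// `faceBox` is a normalised bounding box (0–1) in the captured image,
    /// or nil if no face was detected in the current frame.
    func announce(faceBox: CGRect?) {
        // Throttle so VoiceOver is not flooded with overlapping announcements.
        guard Date().timeIntervalSince(lastAnnouncement) > 2 else { return }
        lastAnnouncement = Date()

        let message: String
        switch faceBox {
        case nil:
            message = "No face detected. Hold the phone at arm's length."
        case let box? where box.midX < 0.4:
            message = "Your face is toward one edge of the frame."
        case let box? where box.midX > 0.6:
            message = "Your face is toward the other edge of the frame."
        default:
            message = "You are centred in the frame."
        }
        UIAccessibility.post(notification: .announcement, argument: message)
    }
}
```

In a real capture session this would be driven by per-frame detection results from the camera feed, with directions worded to account for front-camera mirroring.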
After receiving feedback on our low-fidelity wireframes, we took the design concepts to Adobe XD, as it allowed us to integrate voice interactions into the final prototype. Since our target users are people from the blind community, it was critical for our prototype to offer interactions as close to those of a screen reader as possible.
We conducted user testing of our final prototype with two accessibility experts and four blind users of dating apps. We found that no prototyping tool is both VoiceOver compatible and able to be shared remotely, so we conducted the tests using Zoom's screen- and audio-sharing features.
We conducted cognitive walkthroughs over Zoom, with set tasks for each concept. We then judged the performance of our design solutions through the lens of these five critical questions.
We gathered our participants' thoughts on dating app usability before introducing the design concepts and again after the cognitive walkthrough, to assess their satisfaction with the different design ideas.
This project was quite challenging: the blind community is a close-knit, guarded space, and getting people to talk about something as private as their dating life was not easy. It taught us the value of empathy, the importance of user research, and the skill of connecting with users to understand their frustrations.
I personally think it is very important to design not for, but with, people with disabilities. Representation matters, and collaboration with disability experts results in efficient, empathetic, and truly 'universal' design.