
This project proposed ways to use Siri VUI in the Dexcom G7 app to improve diabetes management for both visually impaired and visually abled users.
2 feature concepts, new Siri commands and a discoverability-focused UI, were presented to Dexcom's UX team and stakeholders. As of 2025, our VUI command has been added to the G7 app!
Problem Statement
"How can Siri VUI improve accessibility in Dexcom's new G7 app?"
We were given a focus group of low-vision and visually abled users!
I was tasked with guiding the team towards a tangible solution.
None of us knew anything about voice interactions or working with a visually impaired focus group … especially me!
Research & Users
I analyzed secondary research to understand VUI interactions, potential challenges, and etiquette for a low-vision focus group.
I identified best practices, strengths, and gaps through literature research and a competitive analysis of 4 competing CGM products on the market.
Then, I used auto-ethnography to personally experience VUI scenarios and prompting structures in daily life.
"Hey Siri, remind me of my 8PM medication!"
Lastly, I conducted 3 (of the 7) user interviews to discover the pains and gains of current VUI app features.
Via Zoom, I delved into how Dexcom G6 users integrated its limited VUI into their daily routines. At the time, its only command was to read blood glucose levels aloud!
Insights
Using Otter.ai transcriptions of our interviews, I coded the data with an affinity diagram into 6 key insights that encapsulated our users' experiences with diabetes.
Similar to a user persona, a map of pains and gains identified when and where we could implement or enhance VUI usage.
Ideation & Prioritization
Insights to ideas! After brainstorming, I mapped potential VUI commands and sketched app features by use case.
"What will help users during X part of their routine? How would it show up for visually abled users?"
The team shared many concepts, but due to time constraints, 2 app features were prioritized using a matrix evaluating technical feasibility and user impact.
When our deadline was cut by 3 weeks, I prioritized communication between the team and our mentors for visibility, ensuring they validated every decision before we moved to the next phase.
Prototypes 1: Core Flows
#1: Siri Prompts for Elasticity
I mapped 1 (of the 2) flows for the VUI command in a flowchart detailing all potential voice options a user could say—error handling and all!
The G6 could only read a single number aloud, so the lack of complex data reading was a major pain point. Our goal was to convey richer graphical information in a way that would benefit all users with in-depth detail.
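The flowchart itself isn't reproduced here, but a minimal sketch of the branching it captured, with a recovery path when Siri doesn't understand, might look like this. The utterances and responses are hypothetical illustrations, not the actual shipped commands:

```python
# Hypothetical sketch of a command-and-response flow: match what the user
# said to an intent, with a fallback (error-handling) branch.
# These utterances and responses are illustrative, not Dexcom's real commands.

INTENTS = {
    "what's my glucose": "Your glucose is 112 and steady.",
    "read my trend": "Your glucose is slowly rising.",
}

def handle_utterance(utterance: str) -> str:
    """Return a spoken response, or a recovery prompt if nothing matches."""
    key = utterance.lower().strip(" ?!.")
    for phrase, response in INTENTS.items():
        if phrase in key:
            return response
    # Error-handling branch: guide the user instead of failing silently.
    return "Sorry, I didn't catch that. You can ask me to read your glucose or your trend."

print(handle_utterance("Hey, read my trend!"))  # → "Your glucose is slowly rising."
```

The key design choice the flowchart encoded is that every unrecognized utterance leads to a helpful re-prompt, never a dead end.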
#2: UI Flow for Discoverability
Under a time crunch, I contributed to a high-fidelity design in Figma, adding onboarding sequences, expanded settings options, and user customization capabilities for future VUI prompts.
Users struggled with the discoverability of existing in-app features. Our goal was to improve wayfinding and, once users found it, offer personalization!
Testing & Findings
I led 2 (of the 4) usability tests over Zoom, using a "Wizard of Oz" method to simulate command-&-response flows.
Wizard of Oz: simulates a functional interface by having a hidden human manually generate responses.
Objectives were to evaluate the learnability of new features, identify friction, and gather targeted feedback on key UI and design decisions.
Prototypes 2: Refining
4 key insights led to iterative design changes, evolving both prototypes with more varied information delivery across different use cases.
What would a short Siri response say? A long one? What about the UI display for hands-on use? Hands-off?
AKA … more detail, team! Make haste!
We had fully fleshed out VUI commands and a high-fidelity prototype of app features!
Dexcom Presentation
Sprint complete, I led our final presentation to Dexcom's UX team and stakeholders, showcasing our research and 2 innovative G7 features.
The enthusiastic feedback and congratulations highlighted our impactful work within such a short time!
"I'm so excited about [Feature 1] because I'm working on [VUI] … this will be so meaningful and important as we start implementation!" — Lead UXD on G7 App
"7 weeks, you've covered a lot of ground. [Features] are very promising … in terms of Android and iOS, but with receivers as well. This is definitely going to be valuable across platforms." — UX Director
Impact
As of February 2025, the Dexcom G7 app has implemented a Siri command to read glucose trends. This directly addresses the pain point we identified: the lack of complex, visual data reading. By providing audio descriptions like "slowly rising" or "rapidly falling," the update benefits all users, particularly those with low vision, and demonstrates the impact of our team's VUI recommendations ❤️.
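To show the shape of the idea, here is a minimal, platform-agnostic sketch of how a glucose rate of change could map to spoken phrases like "slowly rising" or "rapidly falling." The threshold values are assumptions for illustration, not Dexcom's actual cut-offs:

```python
# Hypothetical illustration: map a glucose rate of change (mg/dL per minute)
# to a speakable trend phrase. Thresholds are assumed, not Dexcom's real ones.

def trend_phrase(rate_mg_dl_per_min: float) -> str:
    """Return a speakable description of the glucose trend."""
    if rate_mg_dl_per_min >= 3:
        return "rapidly rising"
    if rate_mg_dl_per_min >= 1:
        return "slowly rising"
    if rate_mg_dl_per_min <= -3:
        return "rapidly falling"
    if rate_mg_dl_per_min <= -1:
        return "slowly falling"
    return "steady"

print(trend_phrase(1.5))   # → "slowly rising"
print(trend_phrase(-4.0))  # → "rapidly falling"
```

Collapsing a graph into a handful of spoken categories like this is what lets the same information serve both sighted and low-vision users.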