A compassionate chatbot experience that helps individuals navigate complexity and choose a birth control that aligns with their personal goals and values.
FemTech Collaborative at the University of Pittsburgh
Our challenge was to design a mobile application that allows individuals of any gender to learn about, compare, and choose a contraception method.
The client, FemTech Collaborative, has been developing a suite of tools focused on helping people clarify their reproductive health goals,
navigate decision-making, and communicate with their providers. We aimed to augment their existing suite of tools with a mobile experience.
A mobile chatbot that helps users navigate complex decisions on reproductive health through compassionate conversation. Users can clarify personal values, learn about birth control methods, and receive personalized recommendations in an accessible, values-driven way.
Ask Shirley is designed with three fundamental values in mind: compassion, trust, and empowerment.
Kind and noninvasive. Makes no assumptions about a user's gender or values.
Private and secure. Transparent about abilities, strengths, and limitations.
Users decide when and how to share data, make decisions, and access care.
We began by researching existing digital healthcare experiences, including Planned Parenthood and Bedsider. We found existing tools to be overly clinical and prescriptive, even though goals around reproductive health aren't always black and white. Furthermore, resources designed for women don't address the diversity of identities and needs of those who may use contraception for reasons beyond pregnancy prevention.
Seeking something less prescriptive, conversational AI stood out as a potential tool for navigating complex, personal decisions in a user-empowered way.
We looked into two existing resources that use chatbot technology for individual healthcare needs. Both chatbots take on clearly non-human personas to make asking for and receiving help feel less judgmental. Both use earnest tones and are extremely transparent about their abilities and limitations as AI.
"Roo" from Planned Parenthood, a sexual health education chatbot for teenagers.
"Woebot," a Cognitive Behavioral Therapy chatbot providing therapeutic tools.
We wanted to understand who would benefit the most from a conversational tool, and what their needs, values, and personal goals around reproduction might be. We created profiles to empathize with a diversity of potential users.
A personable, accessible chatbot can have wide application, but it can be especially supportive to people who lack access to traditional healthcare due to socioeconomic barriers, social stigma, or time constraints.
How might we empower users to access contraceptive care on their own terms?
We conducted user testing with a total of eight users: two participated in moderated quantitative sessions, and six shared preferences through informal surveys. We sought to answer the following questions:
What values do our users hold?
What type of chatbot would users enjoy interacting with?
What is the most intuitive way to navigate through the app?
We originally assumed users would prefer to have sensitive conversations with a clearly non-human persona. In testing, we found that users across genders prefer to interact with an image of a woman doctor. We redesigned the chat persona, including the icon and tone of voice, to reflect a warm, direct woman doctor.
Initially, users could navigate between two major sections of the app: a tab to freely explore and compare methods, and the chatbot. We found that placing both at the same hierarchy confused users and distracted from the chatbot as the central service. We redesigned the app to center the experience on chat.
Ask Shirley's visual style is warm and professional. Shirley, our chatbot doctor, is named after Shirley Chisholm — the first Black woman in Congress and a powerful advocate for women's reproductive rights.
Private, compassionate, and nonjudgmental. Ask Shirley is a conversational health experience for individuals exploring contraception.
The opening sequence, an earth-toned screen featuring a ginkgo leaf, grounds the user and sets the stage for a safe, warm experience. Ginkgo trees are dioecious, with distinctly male and female trees — a nod to the app's commitment to gender inclusivity.
Unlike other medical experiences, users have control over when and how they share their data. Data gathered through the onboarding process provides context for the AI to create more personalized and contextualized guidance, but users can use the app with or without providing personal data.
The user drives the conversation. They can toggle incognito mode on or off depending on their privacy preferences. There are several ways to begin a chat:
1. Select a suggested pathway
Receive personalized recommendations
Prepare for a doctor's appointment
2. Type in a question
3. Choose from commonly asked questions
For users who prefer to get oriented before jumping into chat, the All Methods tab enables independent exploration. Clicking into a method will display a short summary of key information; from there, users can choose to bring the method into chat for a conversation or compare it to another method.
User privacy is a key design principle for the app. Users can access password-protected historical artifacts, such as saved conversations, charts, and reports. Anchor points are automatically created at key conversational turning points and can also be manually created.
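The anchor-point behavior described above could be modeled with a simple data structure: anchors are created automatically when a message matches a conversational turning point, or manually by the user. This is a minimal illustrative sketch; the class names, fields, and turning-point phrases are assumptions, not the app's actual data model.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical model of the anchor-point feature; names are illustrative.

@dataclass
class Anchor:
    label: str
    created_at: datetime
    auto: bool  # True if created automatically at a detected turning point

@dataclass
class Conversation:
    messages: list[str] = field(default_factory=list)
    anchors: list[Anchor] = field(default_factory=list)

    # Assumed phrases that signal a key conversational turning point.
    TURNING_POINT_PHRASES = ("i've decided", "compare", "recommend")

    def add_message(self, text: str) -> None:
        self.messages.append(text)
        # Automatically anchor messages that contain a turning-point phrase.
        if any(p in text.lower() for p in self.TURNING_POINT_PHRASES):
            self.anchors.append(Anchor(text[:40], datetime.now(), auto=True))

    def add_anchor(self, label: str) -> None:
        # Manually created anchor, e.g. from a "save this point" button.
        self.anchors.append(Anchor(label, datetime.now(), auto=False))

convo = Conversation()
convo.add_message("Can you recommend something low-hormone?")  # auto anchor
convo.add_anchor("Starting point")                             # manual anchor
print(len(convo.anchors))  # 2
```

Keeping automatic and manual anchors in one list with an `auto` flag makes it easy to render a single, chronologically ordered history view.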
Domestic Violence Screening
Natural Language Processing
Break down input sentences into component parts
Respond based on the inferred intent
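The two NLP steps above — tokenizing an input sentence and responding based on intent — can be sketched with a toy keyword-matching classifier. This is purely illustrative: the intent names, keywords, and responses are assumptions, and a production chatbot would use a trained intent model rather than keyword overlap.

```python
import re

# Hypothetical intents and responses; all names here are illustrative.
INTENT_KEYWORDS = {
    "compare_methods": {"compare", "versus", "vs", "difference"},
    "side_effects": {"side", "effects", "symptoms", "risks"},
    "appointment_prep": {"appointment", "doctor", "visit", "prepare"},
}

RESPONSES = {
    "compare_methods": "Sure — which two methods would you like to compare?",
    "side_effects": "I can walk you through common side effects. Which method?",
    "appointment_prep": "Let's build a list of questions for your provider.",
    "fallback": "I'm not sure I understood — could you rephrase that?",
}

def tokenize(sentence: str) -> list[str]:
    """Step 1: break the input sentence into lowercase word tokens."""
    return re.findall(r"[a-z']+", sentence.lower())

def classify_intent(tokens: list[str]) -> str:
    """Step 2: pick the intent whose keywords overlap the tokens most."""
    best, best_score = "fallback", 0
    for intent, keywords in INTENT_KEYWORDS.items():
        score = len(keywords.intersection(tokens))
        if score > best_score:
            best, best_score = intent, score
    return best

def respond(sentence: str) -> str:
    return RESPONSES[classify_intent(tokenize(sentence))]

print(respond("What are the side effects of the pill?"))
```

A real implementation would also need confidence thresholds so ambiguous inputs fall back gracefully rather than guessing.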
Impact & Reflections
The client, FemTech Collaborative, was excited by the innovative exploration of applying AI to healthcare in a personal, values-driven way. They discussed applying aspects of the project to existing tools and were exploring the possibility of expanding into service design, with Ask Shirley acting as a "pocket advocate" in and out of the doctor's office.
We were also left with some bigger questions to tackle in the field of healthcare technology:
Do we trust AI to have sensitive conversations?
What does it mean to take human interaction out of healthcare?