Hospital Discharge Support Tool

Validating an automated post-discharge survey system that reduced clinician burden and kept patients from falling through the cracks after leaving the hospital.

Healthcare

3 Weeks

UX Researcher

UX Designer, UX Team Lead


Context

The existing post-discharge process relied on clinicians making follow-up phone calls to patients after hospital stays. These calls were meant to answer questions and reduce readmission risk.

The organization proposed replacing or supplementing phone calls with a short survey sent directly to patients. The UX team created two design concepts and I led research to evaluate them with users before committing to development.


The Problem

The Gap Between Discharge and Recovery

The existing follow-up process had four compounding failure points that the new solution needed to address.

Pain Point 01
Time-Consuming for Clinicians
Clinicians were spending significant time making follow-up calls, pulling them away from other responsibilities.
Pain Point 02
50% of Calls Go Unanswered
Approximately half of outbound calls went unanswered — often because clinicians didn't know when patients were available.
Pain Point 03
No Pre-Call Context
Clinicians entered calls blind, with no information about how the patient was doing. This made it hard to address concerns in real time and reduced the quality of each interaction.
Pain Point 04
Risk to Patient Outcomes
These gaps contributed to lower patient satisfaction scores and elevated readmission risk.

Desired Outcomes

😊
Higher Patient Satisfaction
Patients feel heard and supported after discharge, not left to navigate recovery alone.
📱
More Answered Follow-Ups
Digital surveys meet patients where they are, at a time that works for them.
⏱️
Reduced Clinician Burden
Time is freed up for staff members to balance their other responsibilities.
🏥
Lower Readmission Risk
Earlier identification of patient concerns allows targeted help.

Research Approach

Evaluating Two Survey Formats

Participants
14
Timeline
3 weeks
Method
Unmoderated concept testing

I ran unmoderated concept testing via Userbrain with 14 participants. Each participant interacted with one concept and answered questions about clarity, ease of use, and overall impression. I developed the research plan, set up and launched the tests, analyzed all sessions, and synthesized the findings into actionable recommendations for the design team.

Concept A
[Concept A image placeholder]
Users fill out the form and are prompted to provide any questions for their care team at the end.
Concept B
[Concept B image placeholder]
Users are prompted on each screen to provide questions based on their responses.

Key Insights

What the Research Revealed

1

Concept B Performed Best

Concept B outperformed Concept A on ease of use. Participants liked being prompted with follow-ups after each question response.

2

Short Surveys Prevent Fatigue

Participants strongly preferred surveys of 3–5 questions. The proposed 3-question format hit the right balance — short enough to complete without wanting to quit, long enough to provide sufficient information.

3

Prompt Users for More Information

Several participants wanted the option to provide more detail about their health condition so their care team could prioritize a faster response.

4

Patients Need to Know What Happens Next

Users wanted to know when they'd hear back and from whom. Without this, the survey felt like sending a message into a void. A clear confirmation state explaining next steps was a high-priority need.

As someone who's had a recent surgery, I would have appreciated something like this.

— Research participant

It's very intuitive and easy to use. I wish my doctor had something like this.

— Research participant


Next Steps and Outcomes

From Findings to Rollout

Research delivered a clear path forward. Once the design and copy changes were made and handed off to a developer to build, the project team outlined a phased plan to test the survey.

1

Pilot in Two Inpatient Units

2

Implement in One Hospital

3

Scale System-Wide

30%
Survey completion rate in the first two weeks
Compared to a historical 15–20% response rate.

Learnings

How Research Made the Case for Development

Because this wasn't tied to an active product team, we began without dedicated engineering support and needed to prove the concept's value through design and research alone. Unmoderated testing allowed us to quickly gather real‑world feedback, validate the idea, and give stakeholders the evidence they needed to advocate for next steps.


Next Case Study

AI‑Powered Document Tool

Understanding how employees expect to find and learn how to use an internal AI tool for summarizing documents.

View Case Study →