Research Context

LibreHealth r-ADAI will assess the gains of AI and radiologist co-learning. The results from this research aim to demonstrate the benefits of a symbiotic relationship between AI and human radiologists by focusing on critical key performance indicators (KPIs).

My Roles
UX Researcher, Design Lead,
Functional Mentor &
Research Assistant
Dec 2019 to Mar 2020
(Open-Source Platform)

Team Members

Responsibilities in Brief

Research Paper

Highlights from User Research

I familiarized myself with the radiology department's terminology by shadowing an interventional radiologist through a typical workday and reading related research articles. I felt that understanding the domain was essential for asking the right questions and empathizing with the user.

Taking pictures in the radiology reading room was prohibited, so the findings below are based on interviews with radiologists and in-person observation at the two hospitals where I conducted research.

Insights from Semi-Structured Interview
  • Uses DICOM viewers to diagnose studies from the study list (images)
  • Works collaboratively most of the time but seeks solitude while dictating
  • Prefers to have a reference for terminology; currently looks up terms on Radiopaedia and DORIS
  • Time-sensitive emergency studies need extra attention and are highly stressful
  • Wishes to have patient history loaded in priority order
Insights from Observation
  • Works in a high-contrast environment
  • Multi-tasks constantly (phone calls with physicians, reading images, taking notes, answering trainees)
  • Switches between multiple screens to complete the diagnosing process
  • Frequently repeats a sentence to the dictation device and ends up correcting it manually
  • Focuses intently while zooming in to go through a series of images
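The wish for patient history "loaded in priority order" and the stress around time-sensitive emergency studies suggest a triage-ordered worklist. A minimal sketch of that ordering, using Python's standard `heapq` (all names and priority values here are hypothetical, not from the actual project):

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Study:
    # Lower priority value = read sooner; STAT/emergency studies float to the top.
    priority: int
    accession: str = field(compare=False)

def build_worklist(studies):
    """Return study accessions in reading order, emergencies first (illustrative only)."""
    heap = list(studies)
    heapq.heapify(heap)
    return [heapq.heappop(heap).accession for _ in range(len(heap))]

worklist = build_worklist([
    Study(priority=2, accession="ROUTINE-001"),
    Study(priority=0, accession="STAT-TRAUMA-007"),  # emergency study
    Study(priority=1, accession="URGENT-003"),
])
print(worklist)  # ['STAT-TRAUMA-007', 'URGENT-003', 'ROUTINE-001']
```

A real system would derive the priority from order flags (STAT, urgent, routine) in the RIS rather than hand-assigned integers.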

Visualizing the breakdowns via Sequence Model

Empathizing with the user

I jotted down my learnings from the user research in an empathy map and captured the thoughts of a fictional radiologist, Judy, using a persona.

Identifying themes

To organize the insights and lay a foundation for exploring ideas, I ran an affinity mapping exercise and identified the pain points of radiologists.

Highlights from Secondary Research

I dived deep into research articles to understand the widespread pain points of radiologists. This step uncovered the magnitude of the diagnostic error rate. I learned that confronting our mistakes and finding solutions could give patients better care.
A 4% error rate across the roughly 1 billion radiographic examinations performed each year translates to 40 million diagnostic errors.
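The 40 million figure is simple arithmetic on the two cited numbers:

```python
exams_per_year = 1_000_000_000  # cited annual radiographic examinations
error_rate = 0.04               # cited 4% diagnostic error rate

errors_per_year = int(exams_per_year * error_rate)
print(errors_per_year)  # 40000000, i.e. 40 million
```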

Understanding the IHE (Integrating the Healthcare Enterprise) Workflow

Getting Familiar with DICOM Viewers

To understand the current tools used by radiologists, I took a close look at several open-source DICOM viewers and their existing capabilities for assisting radiologists. Common features I observed:
  • Export and import options for study lists (Useful to transfer the studies across hospitals)
  • Flexibility to annotate and edit radiographs in the UI / presentation layer (Size, shape, color, transform)
  • Protocols to customize the layout / appearance of image series in the view port.
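The third feature is often called a "hanging protocol": a rule that picks a viewport layout and series arrangement from study attributes. Viewers such as OHIF define these declaratively; the dictionary schema and names below are my own illustrative sketch, not any viewer's actual API:

```python
# Hypothetical hanging-protocol table: first matching rule wins.
HANGING_PROTOCOLS = [
    {"match": {"modality": "MR", "body_part": "BRAIN"},
     "layout": (2, 2),  # rows x columns in the viewport grid
     "series_order": ["T1", "T2", "FLAIR", "DWI"]},
    {"match": {"modality": "CT"},
     "layout": (1, 2),
     "series_order": ["AXIAL", "CORONAL"]},
]

def pick_protocol(study):
    """Return the first protocol whose match keys all agree with the study."""
    for proto in HANGING_PROTOCOLS:
        if all(study.get(k) == v for k, v in proto["match"].items()):
            return proto
    return {"layout": (1, 1), "series_order": []}  # fallback: single viewport

proto = pick_protocol({"modality": "MR", "body_part": "BRAIN"})
print(proto["layout"])  # (2, 2)
```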


For this early research, the solution should:

Idea Exploration

From the research insights, I identified five areas where AI could serve as augmented intelligence for radiologists.

Gaining Different Perspectives

I wanted feedback from experts in each field, so we convened a meeting with an interventional radiologist, an informatics expert, and the ML engineers from our team and presented the ideas to them. They welcomed the ideas supported by the research, and the engineers assessed each idea's feasibility. Based on this internal discussion, we chose to proceed with Idea 1 and deferred the other ideas to future work. With this scope, we posted the idea on the LibreHealth Google Summer of Code forum and recruited open-source student developers internationally to work on the project.


Pre-visualizing the idea in scenarios

Exploring interactions through Co-design Session

Through a co-design session with fellow designers, I sketched different interactions for AI intervention that enable seamless collaboration with radiologists. In our discussion, we considered the AI as another person who could learn patterns from radiologists and help identify abnormalities in radiographs, so we decided to create a persona for the AI.
Iteration 2
Based on visibility and current DICOM standards, our radiologists approved Interaction 1. After this session, I defined the interaction flow and sketched how this idea would work in the OHIF Viewer.

Prototype Highlights

Model Selection

Several AI models specialize in identifying different abnormalities. An option to choose an AI model based on modality (CT, MRI) is available in the top action bar. A radiology technician can set this option as the default based on the facility and type of diagnosis.
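The selection logic described above, explicit choice first, then the technician-set facility default, then any available model, could be sketched like this (the registry contents and function name are hypothetical placeholders):

```python
# Illustrative model registry keyed by modality; all model names are made up.
MODEL_REGISTRY = {
    "CT": ["lung-nodule-v2", "liver-lesion-v1"],
    "MRI": ["brain-tumor-v3"],
}
FACILITY_DEFAULTS = {"CT": "lung-nodule-v2"}  # set by the radiology technician

def select_model(modality, choice=None):
    """Return the radiologist's explicit choice, else the facility default,
    else the first model available for the modality."""
    available = MODEL_REGISTRY.get(modality, [])
    if choice in available:
        return choice
    default = FACILITY_DEFAULTS.get(modality)
    if default in available:
        return default
    return available[0] if available else None

print(select_model("CT"))                     # facility default: 'lung-nodule-v2'
print(select_model("CT", "liver-lesion-v1"))  # explicit choice wins
```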

Auto Bounding Box

The AI model draws a bounding box to highlight an abnormality in the radiograph and measures it. Editing options let the radiologist make the final decision and further refine the AI's bounding box.
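The editable-box behavior could be sketched as a small data structure where the AI proposes a region and the radiologist's edits produce a new, authoritative box; converting pixels to millimeters via the image's pixel spacing gives the measurement. This is a minimal illustration, assuming 2D pixel coordinates and isotropic spacing:

```python
from dataclasses import dataclass

@dataclass
class BoundingBox:
    """AI-proposed region on a radiograph; coordinates in pixels (illustrative)."""
    x: int
    y: int
    width: int
    height: int

    def measure_mm(self, pixel_spacing_mm):
        # Convert pixel extent to physical size using the image's pixel spacing.
        return (self.width * pixel_spacing_mm, self.height * pixel_spacing_mm)

    def resize(self, dx, dy):
        """Radiologist adjusts the AI's proposal; returns the edited box."""
        return BoundingBox(self.x, self.y, self.width + dx, self.height + dy)

ai_box = BoundingBox(x=120, y=80, width=40, height=30)
edited = ai_box.resize(dx=6, dy=-4)  # radiologist tightens the box
print(edited.measure_mm(0.5))        # (23.0, 13.0)
```

Keeping the AI's original box and the edit as separate steps also preserves the data needed for the co-learning goal: the delta between proposal and final box is a training signal.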

Developer Handoff & Mentoring

I collaborated with a technical mentor and developers at Google Summer of Code '20 as a functional mentor. I presented the prototype and the next steps for building the chosen feature. Below is the standalone workflow built by a developer last summer. More info can be found in the GitHub link here.

Next Steps & Ongoing Research


Leading a design project from scratch in a small cross-functional team taught me both technical and interpersonal skills. Radiology was new to me, but I learned to communicate ideas and functional requirements to both technical experts and non-technical users by preparing well in advance. My developer hat helped me discuss technical terms with confidence. Though working with constraints and making sense of the existing ecosystem seemed challenging, every field expert was supportive in helping me understand the technical intricacies. This helped me translate ideas into a working product.

Let's Connect
Designed & </> by GP © 2021