
Improving Access to Disability Benefits in Canada


As a UX Researcher at Employment and Social Development Canada, I led research to understand how people with disabilities experience the CPP-D eligibility process — and where the system was failing them.

29 semi-structured interviews conducted

8 internal departments covered

5 key insights surfaced

16 weeks across 3 research phases

Context

The Canada Pension Plan Disability (CPP-D) program provides monthly financial support to eligible Canadians aged 18–65 with a severe and prolonged disability that prevents them from working consistently. Eligibility is assessed through a combination of medical conditions, functional limitations, and personal characteristics — but the statutory definition is broad and non-diagnosis-based.


Feedback from persons with disabilities and stakeholders indicated that accessing the program was cumbersome, time-consuming, and difficult to navigate — generating frustration, feelings of disempowerment, and barriers to accessing the support people were entitled to.


The Disability Adjudication Policy Division (DAPD) of ESDC engaged our team to apply Human-Centred Design methodology to understand how users experience the eligibility process — and to surface recommendations grounded in that evidence.

Research Question

How might we improve the CPP-D eligibility process to reduce friction for clients and internal teams — while balancing legal requirements with real user needs?

What I Did

The research was structured in three phases over 16 weeks, using a mixed-method approach combining policy review, semi-structured interviews, affinity mapping, and system mapping.


Phase 1 — Discovery


I began with a policy and procedure review to establish a foundational understanding of how the program was designed to work. This grounded my interview questions and helped me identify where the gap between policy intent and lived experience was most likely to appear.


I then conducted three rounds of semi-structured interviews with a total of 29 participants:


ROUND 1 — 22 INTERNAL SMES

  • Policy and Partnerships, Medical Expertise, Medical Adjudicator Quality Assurance (MAQA), Processing Agents, Medical Adjudicators, Business Operations, Continuous Improvement, and Medical Advisor

  • Explored: policy implementation, workload distribution, communication between departments, functional capacity assessment, regional inconsistencies, and adjudication challenges


ROUND 2 — 5 CLIENTS

  • Participants with different types of disabilities across different regions of Canada (excluding Quebec)

  • Explored: the application experience, reaching out to physicians, and challenges encountered throughout the process


ROUND 3 — 2 EXTERNAL SMES

  • An academic and a physician, providing an outside-in perspective on the system's challenges


Phase 2 — Analysis


Interview data was transcribed and synthesized using affinity mapping in Miro — one map per department for internal SMEs, plus a separate map for clients. All participants were anonymized with identification codes. The affinity maps surfaced recurring patterns and tensions across groups.


From the affinity maps, I developed a System Map visualizing the full CPP-D process — from application through to benefit granted or appealed — and two Journey Maps capturing the emotional experience of clients: one whose application was approved on first submission, and one who was denied and later approved at reconsideration.


A key output of this phase was a pain points comparison table, which revealed where SME and client experiences overlapped — and where they diverged.


Journey Map 1 — client whose application was approved on first submission. Emotions move from confusion and overwhelm through anxiety and stress, resolving in relief — followed by renewed confusion about the benefit amount.


Journey Map 2 — client whose application was first denied, then approved at reconsideration. The emotional arc is significantly longer and harder, with frustration persisting even after approval.

Phase 3 — Findings & Reporting


Insights were synthesized into five key findings and five corresponding How Might We (HMW) questions, which were presented to DAPD stakeholders alongside actionable recommendations.

Key Insights

I. CLIENTS NEED HELP WITH THE APPLICATION PROCESS

  • The application form is lengthy and complex — most clients rely on family, friends, organizations, or lawyers to apply

  • Low awareness of the program means clients often don't know if they qualify before investing significant effort


II. INCOMPLETE MEDICAL REPORTS

  • Physicians frequently don't provide enough clinical detail for adjudicators to make a decision

  • This leads to more denials and more complex cases — often without clients understanding why


III. THE FUNCTIONAL CAPACITY SELF-ASSESSMENT IS BEING OVERLOOKED

  • Medical adjudicators base decisions approximately 95% on the medical report, giving little weight to the client's own account of their functional limitations

  • Clients feel unheard and punished by the process as a result


IV. INCONSISTENCIES IN ADJUDICATION

  • Medical adjudicators may interpret cases differently depending on their background and region — there is no standardized evaluation process


V. DISCONNECTED COMMUNICATION

  • Communication between departments is fragmented — changes to the process are not consistently communicated

  • Clients also feel this disconnect: "The left does not know what the right is doing." (Client 03)


One finding that surprised the team: we had assumed the application process itself was complex and confusing. The data showed the opposite — most clients understood the process well. The friction was in execution, not comprehension.

How Might We

Each insight was reframed as a design opportunity:

01.

How might we help clients feel confident that the application they submit is complete and accurate?

02.

How might we make clients feel more empowered throughout the process?

03.

How might we prompt physicians to provide more complete information about a patient's medical condition?

04.

How might we make the evaluation process more consistent for medical adjudicators across regions?

05.

How might we improve communication between departments to deliver updates more efficiently?

Recommendations

01. Integrate the CPP-D benefit toolkit into the application form — and redesign it to be interactive and accessible, following web accessibility standards such as WCAG to support people with disabilities.


02. Give greater weight to the functional capacity self-assessment in the adjudication process, so clients feel that their own account of their limitations is taken seriously.


03. Provide clear and specific guidelines to physicians detailing what information and documentation are required in the medical report to support a decision.


04. Standardize training for medical adjudicators across regions to reduce the subjectivity and inconsistency currently present in the evaluation process.


05. Encourage regular cross-department meetups to share experiences, discuss process changes, and reduce the communication fragmentation that affects both staff and clients.

Deliverables

Deliverables included the system map, the two journey maps, the pain points comparison table, and the final findings report (all NDA-compliant).

My Contribution

I co-led this research with a fellow UX Researcher, working in direct collaboration with ESDC's Disability Adjudication Policy Division over 16 weeks. My responsibilities covered the full research process: planning interview guides, facilitating sessions, transcribing and synthesizing data through affinity mapping, developing the system and journey maps, and co-authoring the final findings report delivered to government stakeholders.


The most challenging aspect was conducting interviews with people with disabilities about a process that had already caused them real harm. That required care in how questions were framed, and honesty about what the research could and couldn't change in the short term.


The findings report was delivered to ESDC at the end of the project. As is common in government research, implementation decisions rest with the department — we didn't stay long enough to see what was acted on. But the work was grounded in evidence from 29 people who had rarely been asked how the system felt from the inside.
