
Redesigning a National Catch Data System

8 user interviews across 4 regions
4 usability testing sessions
100% task success in usability testing
1 national system redesigned (ECCI → CDM)
Context
DFO needed to replace ECCI, a legacy intranet system used to review, correct, and audit electronic logbooks (ELOGs). The system was outdated, inconsistent across regions, and no longer capable of supporting the growing national demand for digital logbooks.
The challenge: design a new platform — Catch Data Management (CDM) — that supports efficient workflows for hundreds of users across different regions, each with distinct operational needs and mental models.
Research Goal
Identify user frustrations, pain points, and unmet needs in ECCI, and validate whether the new CDM prototype improved efficiency, clarity, and accuracy in users’ daily tasks.
What I Did
1. User Interviews — 8 participants, 4 regions
I led interviews with Statistics Officers, CODE staff, and Resource Managers to understand real workflows, mental models, and regional differences. The 8 participants came from Newfoundland, Gulf, NCR, and Quebec — chosen deliberately to surface consistent patterns while capturing meaningful regional variation. (Note: the Maritime region and Conservation and Protection group were not represented due to a lack of response during recruitment.)
Five key insights emerged:
KEY FINDINGS
- Manual, inefficient logbook search with no filters by region or date — users had to navigate between two separate screens to complete a single task
- Confusion between regions and administrative areas — a core mental model mismatch that caused incorrect repatriations
- No error prevention on submitted ELOGs, and inability to edit blank fields — forcing workarounds that varied by region
- No way to track previous modifications — critical for audit workflows and a compliance risk
- High unmet need for reporting features, and inconsistent units of measurement (degrees vs. decimal) creating reconciliation overhead
2. Usability Testing — 4 participants, 3 regions
I planned and co-facilitated usability tests with 4 participants from Newfoundland (2), Quebec (1), and Gulf (1). Sessions covered three specific tasks: (1) search for a logbook, (2) edit the quantity of bait used, and (3) find the number of logbooks submitted by a fish harvester within a specific date range.
The most critical finding: the absence of modification history wasn't just a usability issue — it was a compliance risk. This became a non-negotiable requirement for CDM.
TASK SUCCESS RATE
100%
ERRORS OBSERVED
0 errors on task 1; 1 error each on tasks 2 and 3
HIGHEST PRAISE
Automated "modified by" + timestamp — directly addressed the audit gap from interviews
KEY OPPORTUNITIES
Search/Save at top & bottom; clarify auto-filled fields; cross-region search; performance at scale
Before & After
My research findings directly informed the redesign of ECCI into CDM. Below is a side-by-side comparison of the two systems — the same core workflow, rebuilt around what users told us wasn't working.
BEFORE — ECCI (LEGACY SYSTEM)

Error queue: dense, text-heavy, no clear search filters. Users had to navigate between two separate screens (XML Correction and DB Consultation) to complete a single task.
AFTER — CDM PROTOTYPE

Unified view with clearer search parameters, region selector, and structured table. All actions accessible from a single screen.
Design by Adele Strub, informed by research findings. Prototype built in Figma using the GC Design System.
Impact
The research directly informed the redesign of CDM — a system that will serve hundreds of users across Canada's fisheries management regions.
01. Faster, more intuitive workflows — redesigned search and navigation eliminated the manual workarounds that had become normalized across regions.
02. Reduced regional inconsistency — surfacing how teams had adapted to ECCI's gaps allowed the design team to build a system that genuinely fits national use.
03. Improved data accuracy — error prevention and a standardized unit system addressed the root causes of logbook correction overhead.
04. Audit compliance by design — the "modified by" + timestamp feature, born from the traceability gap, became one of the most appreciated features in testing.
05. Stronger national rollout foundation — research findings grounded key design decisions in evidence, giving teams confidence going into scaled deployment.
Deliverables
Here's what I created along the way (all NDA-compliant):
My Contribution
One of the biggest challenges in this project was recruiting participants across different regions. Despite outreach efforts, some regions never responded — which meant the Maritime region and the Conservation and Protection group were not represented in the interviews. That was a real limitation of the research, and I documented it transparently in the final report.
This experience taught me that in a national, multi-regional project, recruitment is research. Whom you can reach — and whom you can't — shapes what you learn and what gaps remain. Acknowledging those gaps honestly is part of doing rigorous research.
Through this project, I led end-to-end research in a complex government environment — from interview planning and recruitment through usability and accessibility testing, analysis, and a findings presentation to the Catch and Effort team.