The human side of data

Transforming fragmented employee survey workflows into unified data intelligence that enables both strategic overview and detailed analysis in a single, intuitive interface.

March 15, 2025

This case study showcases: Data visualization design • User testing & iteration • B2B UX • Collaborative design • Complex interaction design

Role: UX Designer

Timeline: 7 months

Team: 3 developers, 2 data scientists

Why were data analysts constantly switching between tools?

Data analysts and human resources teams faced a fractured workflow: constant context-switching between dashboards, spreadsheets, and presentations. Each transition risked losing valuable insights, while fragmented tools created isolated data silos that prevented discovery of deeper employee experience patterns.

Working with a highly technical team, we embraced collaborative design, making data analysts active participants in shaping a solution that would make their jobs easier.

I joined from the very beginning and helped shape the product direction — from defining user flows and functionality to prioritizing what mattered most.


What do employees really need from their data?

The analysis crystallized three core jobs-to-be-done, each representing a critical analysis phase in transforming raw employee data into meaningful narratives about our people's journey.


What if visualization could be navigation?

Breaking from conventional dashboard thinking, we envisioned a system where visualization became the primary means of navigation. Key conceptual breakthroughs emerged:

  • Graph visualization:

    Translating analysts' natural whiteboard mapping behavior into digital experience

  • Drawer system:

    Addressing the need to maintain context across multiple data views

  • Contextual navigation:

    Seamless drilling through organizational layers


How do we know it actually works for users?

We implemented a two-phase testing strategy to validate both foundational assumptions and real-world task completion.

What did stakeholders really understand?

Tested terminology, data interpretation, and role-specific needs with dashboard creators.

Before: "View type" - caused confusion about functionality
After: "Calculation method" - matched user mental models

Before: "Compare results against" - too long
After: "Analyse by" - clearer for organizational filters

Could users actually complete their real tasks?

Scenario-driven testing with data consumers revealed critical interaction issues.

  • Task completion rate: 78% (consistent across all users)

  • Graph interpretation failure: 100% (network graphs didn't match mental models)

  • Team comparison struggles: 75% (interaction pattern needed redesign)

When users fail, what does that teach us?

Time visualization failure: Users struggled to interpret discrete time-series charts. Since real data showed minimal variance, we redesigned the view as a bar chart.

Graph vs. hierarchy: A 100% failure rate on network graph tasks led us to add a hierarchical tree view - users needed a familiar organizational chart structure alongside the innovative network visualization.

Comparison workflow: Redesigned multi-selection mechanism with explicit comparison mode and side-by-side panels.

What does success look like in the real world?

Our final solution evolved into an interactive platform featuring:

  • Ring organization view: Immediate insight into team health

  • Contextual drilling: Seamless navigation through organizational layers

  • Dual visualization modes: Network graphs for pattern discovery, hierarchical trees for familiar navigation

  • Integrated comparisons: Side-by-side analysis eliminating tool-switching

  • Historical context: Trends integrated directly into team views


Did it actually solve the problem?

The tool launched with immediate adoption success and positive workshop feedback.

  • Active users in month 1: 200+

  • Pre-launch demand: users actively asking for release dates

"Pretty easy to find the results asked" - User testing participant

What would I do differently next time?

The most effective decisions came from letting go of cleverness and focusing on clarity.

It wasn’t always the elegant solution that worked — it was the one people understood.

Because if no one can use it, it doesn’t matter how clever it is.


© Gitmel Gutierrez / Made with ❤️ in Next / 2023