Analytics hub

When I first joined, finding the right dataset felt like chasing shadows.

Finance teams had their own SharePoint folders, Operations relied on hidden Excel files, and Fleet analysts rebuilt the same KPIs from scratch because they didn’t know they already existed elsewhere. The result? Analysts could spend half their time searching, validating, or recreating data instead of driving insights. Trust was low, duplication was high, and collaboration almost non-existent.

The goal was clear: unify access, reduce duplication, and build trust in data through a shared ecosystem.

STEP 1

Discovery - Listening, mapping and comparing

Workshops uncovered a common frustration: fragmented access. Each domain voiced unique needs — speed, accuracy, standardization, automation.

Each domain had its own style, KPIs were named differently, and makeshift templates were rarely reused. For users, moving from one report to another felt like switching languages — confusion, low adoption, and wasted time.

Overloaded with colors and icons, the dashboard lacks consistency and structure, making data hard to read and compare.

% of misalignment issues detected by principle across audited dashboards.

"I just want to know which dataset is the official one."

Operations insight manager

With that, I designed the Data Marketplace — a knowledge hub where every asset had a home, an owner, and a clear purpose. In Figma, I shaped asset cards that carried not just a title but metadata that mattered: refresh cadence, glossary definitions, quality indicators.

In React, we built a portal where searching for “On-Time Performance” didn’t return a mess of versions but a single, curated dataset. Governance wasn’t an afterthought: to upload, teams had to define accountability, ensuring the marketplace scaled without chaos.
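The governance gate described above can be sketched in a few lines. This is a hypothetical illustration, not the production code: the names `DataAsset` and `validateForUpload`, the field names, and the cadence values are assumptions based on the metadata the asset cards carried (refresh cadence, glossary definitions, quality indicators) and the rule that an asset needs an accountable owner before it can be published.

```typescript
// Hypothetical model of a marketplace asset card (illustrative names).
interface DataAsset {
  title: string;
  owner: string;            // accountable team or person, required by governance
  refreshCadence: "daily" | "weekly" | "monthly";
  glossaryTerms: string[];  // linked business-glossary definitions
  qualityScore: number;     // quality indicator surfaced on the card (0–100)
}

// Governance gate: returns the list of problems blocking publication.
// An empty array means the asset may be uploaded to the marketplace.
function validateForUpload(asset: Partial<DataAsset>): string[] {
  const errors: string[] = [];
  if (!asset.title?.trim()) errors.push("title is required");
  if (!asset.owner?.trim()) errors.push("an accountable owner is required");
  if (asset.qualityScore === undefined) errors.push("quality indicator is missing");
  return errors;
}
```

Encoding the rule as a hard validation step, rather than a written guideline, is what let the marketplace scale without chaos: no dataset could appear in search results without an owner attached.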

STEP 2


Key decisions & design

My role was to translate those pain points into clear design decisions, iterate with the team, and validate with users until we reached a trustworthy and scalable experience.

MILESTONE 1

Key Decisions

The first step was to define the principles that would guide the design. It wasn’t about growing fast, but growing with trust: governance before volume, one marketplace instead of four portals, metadata as part of the UX, and an architecture future-proofed for AI. Each decision directly responded to a pain point identified during discovery sessions.

Beyond structure and governance, we also envisioned a News section to highlight key events and trends impacting the commercial aviation industry, and a Tools section — a curated collection of reports and dashboards focused on optimizing routes, managing fleet performance, and uncovering market insights. These additions aimed to turn the marketplace into a living ecosystem of knowledge, not just a repository of data.

MILESTONE 2

Architecture and design iterations

With principles in place, I defined the site architecture, user flows, and initial wireframes. The focus was on validating the essentials: search, asset cards with metadata, and governance rules. Each iteration was a step toward refining usability and consistency.

MILESTONE 3

Usability testing & collaborative QA

We didn’t design in a vacuum. We tested prototypes with analysts from different domains, collecting real-time feedback. What stood out was not color or layout, but the value of metadata clarity. We simplified filters, adjusted interactions, and validated changes with data, development, and product teams through collaborative QA sessions.

MILESTONE 4

Final version

The result was an Analytics hub that was usable, governed, and built to scale. Analysts could now find trusted datasets in minutes, managers operated with a shared language, and governance teams had visibility and control. What had been four fragmented portals became one ecosystem of reliable data.

STEP 3


The impact

From the fragmented dashboards I uncovered — overloaded visuals, inconsistent naming, and the absence of visual standards — I moved to a system with clear, easy-to-apply rules.

The impact was immediate. Analysts who once spent hours sifting through folders now found answers in minutes. In less than a quarter, search time dropped by nearly 45% and duplication of parallel dashboards fell significantly. Adoption spread quickly: more than two-thirds of analysts in Finance, Ops, and Fleet were using the hub regularly, reporting higher trust in the numbers and more collaboration across domains.

But the real shift was cultural. For the first time, data wasn’t something guarded by silos — it was a shared product, accessible and reusable across the enterprise. The marketplace became the backbone for what came next: integrating an AI assistant that could guide users directly to the right dataset, and expanding beyond dashboards into APIs, machine learning models, and knowledge articles. What started as a fix for “where do I find the data?” evolved into a scalable ecosystem of data products powering decision-making across the group.

Dashboard creation: 4 days vs 14 days. Reducing design and setup time meant dashboards could be produced 3x faster.

Adoption: 60% vs low adoption. Most teams now rely on templates and guidelines as their default starting point.

Onboarding: 7 days vs 30 days. New analysts adopted the design system in just one week, cutting the learning curve by 75%.


What I learned

The Analytics hub was more than a portal — it was a cultural shift. Cross-domain workshops built trust, metadata clarity built confidence, and governance built consistency. The biggest learning: scaling data strategy is not about dashboards, but about creating a shared ecosystem where every domain trusts and reuses the same assets.

© 2025 Javier Mora
