01 / 09
Research
This was a project where stakeholders represented users.
I didn't have direct access to end users. Internal restrictions meant I worked through three stakeholder groups who acted as proxies — each with different priorities and a different idea of what this platform should be.
🛡
Data Governance
The gatekeepers. Cared about control — who can access what, under what rules, and how it's documented. Biggest worry: a new platform means more process for them to own.
⚡
Data Analytics
Closest to the problem. Cared about speed: "Stop making me wait a week to find a file." Wanted self-service, fewer bottlenecks, less dependency.
⚙
Data Engineering
The builders. Cared about feasibility — whatever gets designed needs to work with existing infrastructure and be maintainable long-term.
02 / 09
What I learned
Three things kept coming up across every conversation.
Mental model mismatch
Users didn't search by file name — nobody types view__gold__str. They search by topic: "sourcing data," "open orders." The system was built around how data was stored, not how people think about it.
Convoluted request flow
Getting access to a dataset meant chasing people. Users had no visibility into who owned a file, what the approval process was, or how long it would take.
No preview before committing
Analysts need to know what metrics are in a dataset, at what level, before investing time. The current process didn't allow any evaluation before requesting access.
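The mental-model finding above implies indexing datasets by topic rather than by technical file name. A minimal sketch of that idea, using an inverted index over hypothetical catalog entries and topic tags (none of these slugs or tags are from the actual platform):

```python
from collections import defaultdict

# Hypothetical catalog: technical slug -> topic tags a user would recognize.
CATALOG = {
    "view__gold__str": ["sourcing data", "stock transfers"],
    "view__gold__open_po": ["open orders", "sourcing data"],
}

# Invert it so lookups go from topic to datasets, matching how people search.
index = defaultdict(set)
for slug, topics in CATALOG.items():
    for topic in topics:
        index[topic.lower()].add(slug)

def search_by_topic(query: str) -> set[str]:
    """Return every dataset tagged with the given topic, case-insensitively."""
    return index.get(query.lower(), set())

print(search_by_topic("Sourcing data"))
# → {'view__gold__str', 'view__gold__open_po'}
```

The point of the sketch: nobody has to know `view__gold__str` exists; typing "sourcing data" surfaces it.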
03 / 09
Scoping the MVP
Too many voices. I needed to figure out who had the final say.
All three stakeholder groups gave input, reviewed work, and iterated, all at once, every cycle. Twice-a-week sessions full of competing priorities and personal preferences. Hard to tell what was a requirement and what was an opinion.
So I identified who had the final call and created a PRD that defined exactly what I would design and build. The stakeholders aligned on three pillars: discoverability, searchability, reliability.
1 persona
Data explorer — someone who needs data but doesn't know where to find it
2 use cases
Evaluate a dataset before requesting it · Request access through the app
1 champion team
Small group to test with. Everything else was phase 2.
04 / 09
Tradeoffs
Two decisions that had real costs.
Human-readable naming
I proposed display names ("Stock Transfer Requisition") over technical slugs (view__gold__str). Governance would need to create and maintain a readable name for every dataset, 230+ and growing. That's real overhead on a stretched team. I used Claude to draft initial descriptions from column metadata, making the burden manageable rather than prohibitive.
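The drafting step can be partially automated before any human review. A minimal sketch, assuming a slug convention like `view__gold__<name>` and a small abbreviation map; both the convention and the map entries are hypothetical, not the platform's actual scheme:

```python
# Hypothetical expansions a governance team might maintain and review.
ABBREVIATIONS = {
    "str": "Stock Transfer Requisition",
    "po": "Purchase Orders",
}

# Assumed layer prefixes in the slug convention (an illustration, not fact).
LAYER_PREFIXES = {"view", "gold", "silver", "bronze"}

def draft_display_name(slug: str) -> str:
    """Draft a human-readable name from a technical slug for review."""
    parts = [p for p in slug.split("__") if p not in LAYER_PREFIXES]
    stem = "__".join(parts)
    if stem in ABBREVIATIONS:
        return ABBREVIATIONS[stem]
    # Fall back to title-casing the raw words; a reviewer refines from there.
    return stem.replace("_", " ").title()

print(draft_display_name("view__gold__str"))          # → Stock Transfer Requisition
print(draft_display_name("view__gold__open_orders"))  # → Open Orders
```

Drafts like these turn governance's job from authoring 230+ names into reviewing them, which is the difference between overhead that blocks the proposal and overhead a stretched team can absorb.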
Small sample for testing
Narrowing to one persona and one champion team meant a small test group. The tradeoff was confidence — I couldn't claim broad validation. But it gave me real signal fast, and the alternative was designing in the dark for months.
05 / 09
The conflict
Stakeholders wanted the request flow outside the app. I didn't.
The governance team wanted access requests handled through existing tools — email, Teams, ServiceNow. They didn't want to own a new workflow inside a new product.
I mapped both journeys side by side. External flow: send an email, get redirected, file a ServiceNow ticket, wait, follow up. 7+ steps, days of waiting, zero visibility. In-app flow: click "Request Access," approval chain triggers, status updates in real time. 3 steps.
I didn't argue the principle. I showed the step count. That moved the conversation.
06 / 09
Lo-fi wireframes
Tested the flow before committing to any visual direction.
Lo-fi wireframe images
Mapped the full journey — search → dataset detail → request access. Focused on navigation patterns and information hierarchy. Tested with the champion team early to validate structure before spending time on pixels.
07 / 09
Hi-fi wireframes
Locked in the layout and information density.
Hi-fi wireframe images
Card-based browse with progressive disclosure. Tab structure for dataset details — metadata, glossary, quality scores, sample data. Every element on the card had to earn its place.
08 / 09
UI design
Designed to feel like a product, not an internal tool.
UI design screens
Consistent iconography across dataset types. Plain language throughout. Light and dark mode. WCAG-compliant. Built for 10,000+ employees — most of whom would never read a technical spec.
09 / 09
Final output
Data Ichiba — one place for all the data.
Final product showcase
230+ datasets across 6 domains. Searchable by topic, not file name. Quality indicators and governance built in. An access pipeline that replaced a week of emails with three clicks.