
The First 90 Days in a Data Leadership Role: What Actually Matters

Ryan Desmond
data architecture, data, founder, RDMIS

The following is adapted from Part One of The Data Leader’s Handbook, available at store.rdm.is.


The instinct in a new data leadership role is to build something. A dashboard that proves the function’s value. A governance framework that signals seriousness. A roadmap that shows the organization what is possible.

Resist this instinct for the first 90 days.

The first 90 days are not about building. They are about understanding. Specifically: understanding what exists, who actually owns it, what it would cost to fix what is broken, and which of those broken things the organization is actually willing to pay to fix.

These are different questions. Most new data leaders answer the first and skip the rest.

What exists

Auditing the current state is not glamorous work. It involves reading documentation that is out of date, running queries against systems whose logic was written by people who no longer work at the company, and sitting in meetings where the same dataset is described differently by every person in the room.

Do it anyway. Do not rely on what you are told. Trust what you can verify.

The categories to audit:

Data sources. What systems generate data that the organization relies on? This includes the obvious ones — ERP, CRM, financial systems — and the less obvious ones: the spreadsheet that the VP of Operations maintains manually because the ERP does not track what she needs, the Access database that feeds the weekly board report, the CSV export that someone runs every Monday and emails to three people.

Data consumers. Who uses data to make decisions, and what data are they using? This is different from who uses the BI platform. Plenty of decision-makers never open a dashboard. They get a number from someone who opened a dashboard, or from a report that was built two years ago and has been running on autopilot ever since.

Data ownership. Who is accountable when a number is wrong? In most organizations, this question produces silence or multiple conflicting answers. Both responses are diagnostic. Silence means nobody has thought about it. Multiple answers mean the organization has a governance problem it has not named yet.

Data quality. Not in the abstract. Specifically: which reports do people trust, and which do they verify before acting on? The ones they verify are the ones with known quality problems. The verification behavior is the organization’s informal quality control. It is also a significant source of wasted time.

Who actually owns it

Formal ownership and operational ownership are frequently different. The data governance policy may name a data steward for the customer record. The person who actually decides what goes into the customer record, what counts as a duplicate, and what happens when a field is blank — that person may be a data analyst three levels below the named steward, operating from institutional knowledge that was never documented.

Identifying the actual owners — not the formal ones — is one of the most valuable things a new data leader can do in the first 90 days. These are the people the data function cannot afford to lose. They are also, frequently, the people most resistant to governance initiatives, because governance formalizes what they currently control informally.

This is not a problem to solve in 90 days. It is a problem to understand in 90 days.

What it would cost to fix

Every data function inherits technical debt. The question is not whether the debt exists — it does — but whether the organization is willing to pay to retire it.

Cost has two components. The first is direct: engineering time, platform costs, migration risk. The second is organizational: the disruption to existing workflows, the retraining required, the political capital spent convincing stakeholders that the change is worth the pain.

New data leaders consistently underestimate the second component. A migration that is technically straightforward can be organizationally brutal if the systems being replaced have been customized to match the mental models of the people using them.

Map both components before making any recommendations. A 90-day assessment that produces a technically correct roadmap the organization will not execute is worth less than a more modest roadmap that will actually get funded.

What the organization is willing to pay to fix

This is the question most data leaders avoid because the answer is often uncomfortable.

The organization will tell you everything is a priority. Push past this. Ask which data problems are costing money right now — not theoretically, not in a future state, but in the current quarter. Ask which decisions are being made with data the decision-makers know is bad. Ask what the last failed data initiative was and why it failed.

The answers to these questions tell you where the organization’s actual risk tolerance is. An organization that failed a data initiative because it ran over budget will not fund another large data initiative, regardless of how well you present the business case. An organization that lost a client because of a data quality failure will fund data quality work immediately.

Match your roadmap to the organization’s actual appetite, not its stated priorities. The roadmap that gets executed is the one that builds the function. The roadmap that is technically correct and organizationally tone-deaf becomes a document in a shared drive that nobody reads.


The Data Leader’s Handbook covers the full 90-day framework across three chapters, including templates for the current-state audit, the stakeholder mapping exercise, and the roadmap presentation. Available at store.rdm.is for $69.
