Why Analysts Recreate Metrics: The Duplication Cycle Explained

Understand why analysts repeatedly recreate the same metrics rather than reusing existing definitions. Learn the root causes of metric duplication and how semantic layers break the cycle.


Analysts recreate metrics because the alternative - finding and reusing existing definitions - is harder than building from scratch. This counterintuitive reality drives massive duplication across organizations. The same revenue calculation exists in dozens of dashboards, each built independently by analysts who could not or would not reuse prior work.

Understanding why recreation happens reveals that it is not laziness or ignorance but a rational response to systemic barriers. Removing these barriers - not exhorting analysts to try harder - is what actually stops duplication.

The Recreation Reality

The Scale of Duplication

Consider a typical enterprise metric like "monthly active users":

  • Marketing has a version for campaign analysis
  • Product has a version for feature usage
  • Finance has a version for investor reporting
  • Customer success has a version for health scoring
  • Each regional team may have local variants

Five or more versions of a single metric are common. For an organization with 50 core metrics, that means 250 or more metric definitions - most of them duplicative.

The Recreation Pattern

The typical recreation cycle:

  1. Analyst needs a metric for their analysis
  2. Analyst searches briefly for existing definitions
  3. Search is unsuccessful or inconclusive
  4. Analyst concludes it is faster to build than to find
  5. Analyst creates new definition based on their understanding
  6. New definition joins the population of variants

Each recreation adds to the problem while feeling like an efficient choice to the individual analyst.

Why Recreation Happens

Discovery Failures

Analysts cannot use what they cannot find:

No central registry: Metric definitions scatter across BI tools, SQL files, documentation, and tribal knowledge. No single place shows what metrics exist.

Poor search: Searching for "revenue" might return thousands of results - dashboards, fields, tables - without indicating which are authoritative metric definitions.

Inconsistent naming: One analyst's "mrr" is another's "monthly_recurring_revenue" is another's "MRR_calc". Search by name misses synonyms.

Hidden in context: A revenue definition might exist as a calculated field named "metric_1" in a dashboard named "Q3 Analysis". Finding it requires knowing where to look.

Trust Deficits

Even when found, existing metrics may not be trusted:

Unknown provenance: Who created this metric? Are they authoritative? Is it current?

Uncertain correctness: Does this calculation actually match what the business intends? How would I verify?

Stale documentation: The description says one thing, but does the code still match? When was it last updated?

Past bad experiences: Analysts burned by incorrect existing metrics learn to distrust and recreate.

Modification Barriers

Existing metrics often almost - but not quite - meet needs:

Rigid definitions: The existing metric includes a filter the analyst does not want, or excludes something they need.

No extension mechanism: There is no way to use the existing metric as a base and modify it. Recreation is the only option for variations.

Permission issues: The analyst cannot modify the existing metric and cannot use it as-is.

Organizational Friction

Reuse requires cross-team coordination that recreation avoids:

Ownership ambiguity: No one is clearly responsible for maintaining reusable metrics.

Process overhead: Getting permission to use another team's metric may require meetings and approvals.

Different priorities: The owning team may not support the using team's needs or timeline.

Tool Limitations

BI tools make recreation easier than reuse:

Siloed metrics: A calculated field in Tableau is not accessible from Power BI.

Copy and modify: Tools make it trivial to copy dashboards (including their metrics) and modify them.

No sharing mechanism: There is no easy way to reference a metric definition across tools or even across reports within a tool.

The Costs of Recreation

Direct Costs

  • Analyst time spent rebuilding what exists
  • Review time for redundant definitions
  • Testing effort duplicated across versions
  • Maintenance burden multiplied

Consistency Costs

  • Divergent definitions produce conflicting numbers
  • Reconciliation time when numbers do not match
  • Trust erosion from inconsistent results
  • Decision delays from uncertainty

Opportunity Costs

  • Analysis not performed while recreating basics
  • Insights not generated from time spent on repetition
  • Innovation foregone for maintenance
  • Strategic work sacrificed for tactical duplication

Breaking the Recreation Cycle

Make Discovery Trivial

If finding existing metrics is easy, recreation becomes the harder path:

Central metric catalog: One searchable location for all governed metrics

Rich metadata: Descriptions, owners, use cases, and relationships

Multiple search paths: Find by name, business concept, data source, or owner

Recommendations: "You searched for revenue - here are the 3 governed revenue metrics"
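As a rough illustration of how a catalog can search across names, synonyms, and tags at once, here is a minimal in-memory sketch. The `MetricCatalog` class, its field names, and the example metric are all hypothetical, not the API of any particular product:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str                                        # canonical metric name
    description: str
    owner: str
    aliases: set[str] = field(default_factory=set)   # synonyms like "mrr", "MRR_calc"
    tags: set[str] = field(default_factory=set)      # business concepts, data sources

class MetricCatalog:
    def __init__(self):
        self._entries: list[CatalogEntry] = []

    def register(self, entry: CatalogEntry) -> None:
        self._entries.append(entry)

    def search(self, term: str) -> list[CatalogEntry]:
        """Match against canonical names, aliases, tags, and descriptions."""
        t = term.lower()
        return [
            e for e in self._entries
            if t in e.name.lower()
            or any(t in a.lower() for a in e.aliases)
            or any(t in tag.lower() for tag in e.tags)
            or t in e.description.lower()
        ]

catalog = MetricCatalog()
catalog.register(CatalogEntry(
    name="monthly_recurring_revenue",
    description="Sum of active subscription amounts, normalized to monthly.",
    owner="finance-analytics",
    aliases={"mrr", "MRR_calc"},
    tags={"revenue", "subscriptions"},
))

hits = catalog.search("mrr")   # alias search still finds the canonical metric
```

The point of the sketch is the multiple search paths: an analyst who only knows the abbreviation "mrr" lands on the same governed definition as one who searches the business concept "revenue".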

Build Trust

If existing metrics are trustworthy, analysts will use them:

Clear ownership: Every metric has an accountable owner

Certification status: Explicit indication of governance level

Validation evidence: Tests that prove correctness

Currency indication: When was this last reviewed/updated?

Enable Modification

If existing metrics can be adapted, recreation becomes unnecessary:

Composable metrics: Use governed base metrics as building blocks

Filter flexibility: Apply custom filters to standard metrics

Extension points: Create variants that explicitly derive from governed bases

Forking with connection: Custom versions that update when the base updates
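The "extension point" idea can be sketched as a derived metric that inherits its measure from a governed base and only adds filters. This is a simplified illustration under assumed names (`Metric`, `with_filters`), not a real semantic-layer API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Metric:
    name: str
    measure: str                      # aggregation expression, e.g. "SUM(amount)"
    filters: tuple[str, ...] = ()     # WHERE-clause fragments
    base: "Metric | None" = None      # link back to the governed definition

    def with_filters(self, *extra: str) -> "Metric":
        """Create a variant that keeps its connection to the base metric."""
        return Metric(
            name=f"{self.name}__variant",
            measure=self.measure,          # the measure is inherited, never redefined
            filters=self.filters + extra,  # base filters are kept, not replaced
            base=self,
        )

revenue = Metric("revenue", "SUM(amount)", filters=("status = 'complete'",))
emea_revenue = revenue.with_filters("region = 'EMEA'")
```

Because the variant holds a reference to its base rather than a copy of its logic, a fix to the governed `revenue` definition flows through to every derivative - the opposite of copy-and-modify.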

Reduce Friction

If reuse is easier than recreation, behavior changes:

No permission needed: Self-service access to governed metrics

Cross-tool access: Same metrics available regardless of tool choice

Pre-built connections: Tools already configured to access the metric layer

Clear documentation: Context available at the moment of need

The Semantic Layer Solution

Semantic layers directly address the factors driving recreation:

Single Source of Truth

All governed metrics live in one place:

  • Central registry of definitions
  • Searchable by multiple attributes
  • Accessible from any tool
  • Clearly authoritative

Built-In Trust

Semantic layers embed trust signals:

  • Explicit governance status
  • Visible ownership
  • Version history
  • Test results

Flexible Consumption

Metrics can be used without recreation:

  • Query with custom filters
  • Combine metrics freely
  • Aggregate at any level
  • No modification of base definition required
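To make "query with custom filters, no modification of the base" concrete, here is a toy compiler that renders one governed measure into SQL with caller-supplied filters and grain. The function name, parameters, and table are illustrative assumptions, and real semantic layers do considerably more (joins, time grains, validation):

```python
def compile_query(measure: str, table: str,
                  filters: list[str], group_by: list[str]) -> str:
    """Render one governed metric into SQL with caller-supplied filters and grain."""
    select = ", ".join(group_by) + ", " if group_by else ""
    sql = f"SELECT {select}{measure} AS value FROM {table}"
    if filters:
        sql += " WHERE " + " AND ".join(filters)
    if group_by:
        sql += " GROUP BY " + ", ".join(group_by)
    return sql

# Same governed measure, two different consumers, zero redefinition:
q1 = compile_query("SUM(amount)", "orders", ["region = 'EMEA'"], ["order_month"])
q2 = compile_query("SUM(amount)", "orders", [], [])
```

Both queries share the one definition of the measure; the per-analysis variation lives entirely in the request, not in a copied calculation.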

Cross-Tool Availability

The same metric serves all consumers:

  • BI tools query semantic layer
  • SQL clients access semantic layer
  • AI systems receive semantic layer context
  • No tool-specific recreation needed

Implementation Approach

Identify Recreation Hotspots

Analyze where recreation is most common:

  • Which metrics have the most variants?
  • Which teams duplicate most frequently?
  • What types of analysis trigger recreation?
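One practical way to find hotspots is to fingerprint existing metric SQL and group definitions that collapse to the same fingerprint. The sketch below is deliberately crude (lowercasing, whitespace collapsing, alias stripping); real duplicate detection would need proper SQL parsing:

```python
import re
from collections import defaultdict

def normalize(sql: str) -> str:
    """Crude fingerprint: lowercase, collapse whitespace, drop column aliases."""
    sql = re.sub(r"\s+", " ", sql.strip().lower())
    sql = re.sub(r" as \w+", "", sql)
    return sql

def find_hotspots(definitions: dict[str, str]) -> dict[str, list[str]]:
    """Group metric definitions (name -> SQL) that share a fingerprint."""
    groups: dict[str, list[str]] = defaultdict(list)
    for name, sql in definitions.items():
        groups[normalize(sql)].append(name)
    return {fp: names for fp, names in groups.items() if len(names) > 1}

defs = {
    "mrr": "SELECT SUM(amount) AS mrr FROM subs",
    "MRR_calc": "select sum(amount) as total\nfrom subs",
    "churn": "SELECT COUNT(*) FROM cancellations",
}
duplicates = find_hotspots(defs)   # "mrr" and "MRR_calc" collapse to one group
```

The clusters with the most members are the metrics worth migrating into the semantic layer first.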

Prioritize High-Value Metrics

Focus semantic layer adoption on metrics where recreation costs are highest:

  • Widely used across organization
  • Frequently recreated
  • Consistency-critical for decisions
  • Compliance-relevant

Establish Governance

Create clear ownership and maintenance processes:

  • Assigned owners for each metric
  • Update and review schedules
  • Change request workflows
  • Deprecation procedures

Enable Easy Access

Ensure using governed metrics is frictionless:

  • Self-service catalog access
  • Pre-configured BI tool connections
  • Clear usage documentation
  • Support channels for questions

Monitor and Iterate

Track whether recreation is decreasing:

  • New metric definition frequency
  • Semantic layer usage rates
  • Cross-tool consistency metrics
  • Analyst time allocation

Cultural Shift

Breaking the recreation cycle requires both infrastructure and mindset change:

From: "I'll just build what I need" To: "Let me check what already exists"

From: "My metric for my analysis" To: "Governed metric with my filters"

From: "It's faster to rebuild" To: "It's faster to reuse"

This shift happens when infrastructure makes the new mindset genuinely true - when reuse actually is faster than recreation.

Measuring Success

Track indicators that recreation is declining:

  • New metric creation rate: Should stabilize as reuse increases
  • Semantic layer query volume: Should grow as adoption increases
  • Metric variant count: Should decrease through consolidation
  • Analyst time on recreation: Should decrease (survey or time tracking)

Improvement in these metrics indicates the recreation cycle is breaking.
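The first indicator - new metric creation rate - is easy to track from creation timestamps. A minimal sketch, assuming you can export creation dates from your BI tools or metric registry:

```python
from collections import Counter
from datetime import date

def monthly_creation_rate(created_on: list[date]) -> dict[str, int]:
    """Count new metric definitions per month; a flattening curve suggests reuse."""
    return dict(sorted(Counter(d.strftime("%Y-%m") for d in created_on).items()))

dates = [date(2024, 1, 5), date(2024, 1, 20), date(2024, 2, 3)]
rate = monthly_creation_rate(dates)
```

If the semantic layer is working, this curve should stabilize even as query volume against governed metrics keeps growing.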

Metric recreation is not an analyst discipline problem - it is an infrastructure problem. Analysts recreate because the systems they work with make recreation rational. Semantic layers change the equation by making discovery easy, building trust, enabling flexibility, and reducing friction. When reuse is genuinely easier than recreation, recreation stops - not because of policies, but because of reality.

Questions

How much analyst time does metric recreation actually waste?

Industry estimates suggest that 30-40% of analyst time is spent on work that has already been done elsewhere in the organization. Metric recreation is a significant component - analysts building revenue calculations, customer counts, and conversion rates that already exist in other reports.
