Opinion · 7 min read

The CMO who killed the dashboards — and saw revenue go up

This is a composite of six real conversations with heads of growth who disabled vanity dashboards and forced their teams to report only placement, reply rate, and pipeline. Every one of them tells the same story.

I spoke with six CMOs and heads of growth in late 2026 about an experiment they had all run, independently, in the preceding twelve months. The details differed; the shape was identical. They turned off the legacy email dashboards — opens, CTOR, industry benchmarks — and mandated that the team report only three numbers: inbox placement, reply rate, and sourced pipeline. Within a quarter, each of them saw revenue from the email channel rise.

This piece is a composite of those conversations: no individual is identified, and the quotes are paraphrased. The pattern, though, is real enough to be worth taking seriously.

TL;DR

Turning off vanity dashboards is a forcing function. Teams measured only on placement + replies + pipeline stop optimising for subject-line opens and start optimising for inbox share and reply generation. The numbers that matter go up.

Before: the typical dashboard stack

Every team started from roughly the same setup. A large Looker or Tableau board, fed by ESP exports. Opens on the left, CTOR in the middle, “revenue attributed” on the right. Weekly reviews ran through each tile. Subject-line A/B tests dominated optimisation time. Industry benchmark comparisons crept into every presentation.

The symptoms were also consistent:

  • Revenue per send was flat or declining for 6–18 months
  • Campaigns with high opens did not correlate with campaigns that drove revenue
  • The team was spending 60%+ of its time on creative / subject-line iteration
  • Deliverability and list hygiene were “owned” by nobody in particular

The decision

The interventions were all top-down. One CMO disabled the Looker board with a literal access revoke. Another rewrote the marketing OKRs to reference only placement and pipeline. A head of growth told their team: “if you send me a report that leads with open rate, I will return it unread.”

The replacement was narrow: three numbers per campaign.

  1. Weighted inbox placement across the audience's provider mix
  2. Reply rate (or click-to-reply for newsletters)
  3. Sourced pipeline (B2B) or incremental revenue (B2C)
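“Weighted inbox placement” can be made concrete: weight each provider's inbox rate by that provider's share of your list, then sum. A minimal sketch; the provider shares, inbox rates, and function name are illustrative, not from any of the six teams:

```python
# Hypothetical example: weighted inbox placement across a provider mix.
# All numbers below are made up for illustration.

def weighted_placement(provider_mix, inbox_rates):
    """provider_mix: {provider: share of list}, shares summing to 1.0.
    inbox_rates: {provider: fraction of seed sends reaching the primary inbox}."""
    return sum(share * inbox_rates[p] for p, share in provider_mix.items())

mix = {"gmail": 0.55, "outlook": 0.25, "yahoo": 0.12, "other": 0.08}
rates = {"gmail": 0.62, "outlook": 0.81, "yahoo": 0.74, "other": 0.90}

print(round(weighted_placement(mix, rates), 3))
```

The point of the weighting is that a great Outlook number cannot hide a bad Gmail number when Gmail is most of the list.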

What happened in the first 90 days

The pattern was remarkably consistent.

Week 1–2: panic

Teams found that their campaigns looked worse under the new metrics than they had under the old. “We thought we were a 28% open-rate operation; turned out 31% of our sends were landing in Gmail Promotions and another 12% in spam.” Morale dipped.

Week 3–6: triage

The teams stopped launching new campaigns and started fixing authentication, list hygiene, and send-time patterns. Sunsetting inactive segments became urgent. DMARC went from p=none to p=quarantine. SPF records were audited.
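The DMARC change described above is, mechanically, a one-line DNS edit. For a hypothetical domain example.com, the TXT record at _dmarc.example.com goes from monitoring-only to enforcement roughly like this (the report address and pct value are illustrative; many teams ramp pct up gradually rather than jumping straight to 100):

```dns
; before: monitor only, no enforcement
_dmarc.example.com.  IN TXT "v=DMARC1; p=none; rua=mailto:dmarc-reports@example.com"

; after: mail failing authentication is quarantined (typically spam-foldered)
_dmarc.example.com.  IN TXT "v=DMARC1; p=quarantine; pct=100; rua=mailto:dmarc-reports@example.com"
```

None of this is visible on an opens dashboard, which is part of why nobody owned it before.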

Week 7–12: reallocation

Time freed up by not running subject-line A/B tests went into copy depth, segmentation, and genuinely different offers for different segments. Reply rates moved from 1–2% to 3–5% on outbound. Placement improved 10–25 points on Gmail.

End of quarter: revenue

In all six cases, email-sourced revenue was up quarter-over-quarter. Not by a huge factor — 12–34% — but reliably. Every CMO pointed to the same explanation: they had been optimising for a visible-but-broken metric (opens), and switching to a harder-to-game metric (placement + replies) forced the team to work on the things that actually mattered.

Why this worked (and why it's unpleasant)

Dashboards are organisational artefacts, not diagnostic tools. They define what is discussed in meetings. What is discussed in meetings is what gets worked on. What gets worked on is what moves. A dashboard full of vanity metrics produces a team that vanity-metric-optimises.

The unpleasant part is that placement and reply rate are harder to move than opens. You cannot goose them by testing emoji in subject lines. You have to fix lists, authentication, content quality, and targeting. The work is longer, less visible, and more boring.

Start with a placement benchmark

If you are thinking about this experiment, the first step is a baseline placement number. Run Inbox Check against your next campaign; record the result; use it as week-one input for the new dashboard. Free, no signup.

The standard objections

“But opens are still useful as a diagnostic.”

They are. Nobody is saying delete the raw data. The argument is about what counts as a KPI — the thing you report up, the thing that defines success. Keep opens in the warehouse; stop promoting them to the deck.

“Reply rate doesn't apply to broadcast newsletters.”

Correct. For newsletters, use click-through to a known-valuable page plus read-through depth (engagement-seconds). The principle holds: measure the behaviour you want, not a side effect of whether the message happened to clear a spam filter.
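One way to operationalise that pair of numbers, assuming your ESP or analytics export gives you per-recipient click and engaged-time data (the field names and figures below are hypothetical):

```python
# Hypothetical newsletter scoring: click-through to a known-valuable page
# plus read-through depth. Field names and values are illustrative.

from statistics import median

recipients = [
    {"delivered": True,  "clicked_pricing": True,  "engaged_seconds": 48},
    {"delivered": True,  "clicked_pricing": False, "engaged_seconds": 12},
    {"delivered": True,  "clicked_pricing": True,  "engaged_seconds": 95},
    {"delivered": False, "clicked_pricing": False, "engaged_seconds": 0},
]

delivered = [r for r in recipients if r["delivered"]]
ctr_valuable = sum(r["clicked_pricing"] for r in delivered) / len(delivered)
read_depth = median(r["engaged_seconds"] for r in delivered)

print(f"CTR to valuable page: {ctr_valuable:.1%}, median engaged seconds: {read_depth}")
```

Both numbers require real reader behaviour, which is exactly why they are harder to game than opens.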

“The CEO still wants to see opens.”

Then that is a different project. But CEOs notice when revenue goes up; they rarely ask why after the fact.

FAQ

Is this a real story or a composite?

Composite of six real conversations, anonymised. The outcome pattern is consistent enough across them that I felt comfortable presenting it as a narrative.

How long did it take to see revenue impact?

One quarter was the common answer. Shorter for teams that already had decent deliverability; longer for ones that had to rebuild reputation from a bad starting point.

Did anyone on the team quit?

In one case, yes — a senior specialist whose identity was bound to subject-line optimisation. In other cases, the team rallied around the new framing. Your mileage may vary.

Does this work for purely transactional email?

Yes, with a different metric set: placement + read rate + conversion-by-downstream-event (e.g., password reset actually completed). The principle is the same: stop reporting the corrupted metric.
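Conversion-by-downstream-event just means joining the send log to the event log. A minimal sketch for the password-reset example, assuming each reset email carries a token you can match against completed resets (all names and rows are hypothetical):

```python
# Hypothetical: conversion-by-downstream-event for transactional email.
# Join password-reset sends to completed resets by token.

sends = [  # one row per reset email
    {"token": "a1", "delivered": True},
    {"token": "b2", "delivered": True},
    {"token": "c3", "delivered": False},  # bounced or spam-foldered
]
completed = {"a1"}  # tokens whose reset flow actually finished

delivered = [s for s in sends if s["delivered"]]
conversion = sum(s["token"] in completed for s in delivered) / len(delivered)
print(f"{conversion:.0%} of delivered reset emails led to a completed reset")
```

If that conversion number drops while send volume holds steady, the problem is usually placement, not the product flow.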

Check your deliverability across 20+ providers

Gmail, Outlook, Yahoo, Mail.ru, Yandex, GMX, ProtonMail and more. Real inbox screenshots, SPF/DKIM/DMARC, spam engine verdicts. Free, no signup.

Run Free Test →

Unlimited tests · 20+ seed mailboxes · Live results · No account required