I spoke with six CMOs and heads of growth in late 2026 about an experiment they had all run, independently, in the preceding twelve months. The details differed; the shape was identical. They turned off the legacy email dashboards — opens, CTOR, industry benchmarks — and mandated that the team report only three numbers: inbox placement, reply rate, and sourced pipeline. Within a quarter, each of them saw revenue from the email channel rise.
This piece composites those conversations. No individual is identified; the quotes are paraphrased. The pattern, though, is real enough to be worth taking seriously.
Turning off vanity dashboards is a forcing function. Teams measured only on placement + replies + pipeline stop optimising for subject-line opens and start optimising for inbox share and reply generation. The numbers that matter go up.
Before: the typical dashboard stack
Every team started from roughly the same setup. A large Looker or Tableau board, fed by ESP exports. Opens on the left, CTOR in the middle, “revenue attributed” on the right. Weekly reviews walked through each tile. Subject-line A/B tests dominated optimisation time. Industry benchmark comparisons crept into every presentation.
The symptoms were also consistent:
- Revenue per send was flat or declining for 6–18 months
- Open rate on a campaign did not predict the revenue that campaign drove
- The team was spending 60%+ of its time on creative and subject-line iteration
- Deliverability and list hygiene were “owned” by nobody in particular
The decision
The interventions were all top-down. One CMO disabled the Looker board with a literal access revoke. Another rewrote the marketing OKRs to reference only placement and pipeline. A head of growth told their team: “if you send me a report that leads with open rate, I will return it unread.”
The replacement was narrow: three numbers per campaign.
- Weighted inbox placement across the audience's provider mix
- Reply rate (or click-to-reply for newsletters)
- Sourced pipeline (B2B) or incremental revenue (B2C)
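The first of the three numbers is worth pinning down, because “inbox placement” only means something when it is weighted by where your audience actually is. A minimal sketch of the calculation; the provider shares and placement figures below are hypothetical, not from any of the six teams:

```python
# Weighted inbox placement: average the per-provider placement rate,
# weighted by the share of the list on each mailbox provider.
#   share   = fraction of the audience on that provider
#   inboxed = fraction of tested sends reaching the primary inbox
#             there (not Promotions, not spam)

def weighted_inbox_placement(provider_stats):
    """provider_stats: iterable of (share, inboxed) pairs; shares sum to 1."""
    return sum(share * inboxed for share, inboxed in provider_stats)

# Hypothetical mix: 60% Gmail at 0.55, 25% Outlook at 0.80, 15% other at 0.90
stats = [(0.60, 0.55), (0.25, 0.80), (0.15, 0.90)]
print(round(weighted_inbox_placement(stats), 3))  # 0.665
```

The point of weighting is that a strong Outlook number cannot hide a Gmail Promotions problem when Gmail is most of your list.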
What happened in the first 90 days
The pattern was remarkably consistent across all six teams.
Week 1–2: panic
Teams found that their campaigns looked worse under the new metrics than they had under the old. “We thought we were a 28% open-rate operation; turned out 31% of our sends were landing in Gmail Promotions and another 12% in spam.” Morale dipped.
Week 3–6: triage
The teams stopped launching new campaigns and started fixing authentication, list hygiene, and send-time patterns. Suppressing inactive segments became urgent. DMARC policies moved from p=none to p=quarantine. SPF records were audited.
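For concreteness: a DMARC policy is a DNS TXT record on the `_dmarc` subdomain, and the tightening described above is a one-token change. The domain and reporting address below are placeholders:

```
; before: monitoring only — failing mail is still delivered
_dmarc.example.com.  IN  TXT  "v=DMARC1; p=none; rua=mailto:dmarc-reports@example.com"

; after: mail failing SPF/DKIM alignment is quarantined (typically spam-foldered)
_dmarc.example.com.  IN  TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```

The `rua` reports are what make the move safe: teams typically sit at p=none until the reports show legitimate sending sources all passing, then tighten.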
Week 7–12: reallocation
Time freed up by not running subject-line A/B tests went into copy depth, segmentation, and genuinely different offers for different segments. Reply rates moved from 1–2% to 3–5% on outbound. Placement improved 10–25 points on Gmail.
End of quarter: revenue
In all six cases, email-sourced revenue was up quarter-over-quarter. Not by a huge factor — 12–34% — but reliably. Every CMO pointed to the same explanation: they had been optimising for a visible-but-broken metric (opens), and switching to a harder-to-game metric (placement + replies) forced the team to work on the things that actually mattered.
Why this worked (and why it's unpleasant)
Dashboards are organisational artefacts, not diagnostic tools. They define what is discussed in meetings. What is discussed in meetings is what gets worked on. What gets worked on is what moves. A dashboard full of vanity metrics produces a team that vanity-metric-optimises.
The unpleasant part is that placement and reply rate are harder to move than opens. You cannot goose them by testing emoji in subject lines. You have to fix lists, authentication, content quality, and targeting. The work is longer, less visible, and more boring.
If you are considering this experiment, the first step is a baseline placement number. Run Inbox Check against your next campaign, record the result, and use it as the week-one input for the new dashboard. It's free, with no signup.
The standard objections
“But opens are still useful as a diagnostic.”
They are. Nobody is saying delete the raw data. The argument is about what counts as a KPI — the thing you report up, the thing that defines success. Keep opens in the warehouse; stop promoting them to the deck.
“Reply rate doesn't apply to broadcast newsletters.”
Correct. For newsletters, use click-through to a known-valuable page plus read-through depth (engagement-seconds). The principle holds: measure the behaviour you want, not a by-product of whether the message happened to clear the spam filter.
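The newsletter substitute can be as simple as two numbers: clicks-per-send to pages you care about, and average engagement-seconds on those clicks. A hypothetical sketch; the event shape and field names are assumptions, not any particular ESP's schema:

```python
# Hypothetical newsletter metric: click rate to known-valuable pages,
# plus average read-through depth on those clicks.

def newsletter_score(events, valuable_urls, delivered):
    """events: dicts with 'url' and 'engaged_seconds'; delivered: send count.

    Returns (clicks-per-send to valuable pages, avg engagement-seconds)."""
    valuable = [e for e in events if e["url"] in valuable_urls]
    ctr = len(valuable) / delivered if delivered else 0.0
    depth = (sum(e["engaged_seconds"] for e in valuable) / len(valuable)
             if valuable else 0.0)
    return ctr, depth

events = [{"url": "/pricing", "engaged_seconds": 40},
          {"url": "/blog", "engaged_seconds": 5},
          {"url": "/pricing", "engaged_seconds": 20}]
ctr, depth = newsletter_score(events, {"/pricing"}, delivered=1000)
print(ctr, depth)  # 0.002 clicks-per-send, 30.0s average depth
```

Note the denominator is delivered sends, not opens — the whole point is to stop routing the KPI through the open pixel.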
“The CEO still wants to see opens.”
Then that is a different project. But CEOs notice when revenue goes up; they rarely ask why after the fact.