
"95% delivered" — and still silence

Every cold-outreach dashboard reports a delivery rate north of 90%. Every founder running cold outreach reports crickets. The two facts are compatible, because "delivered" does not mean "read by a human." Here is the gap.

You log into Instantly, Smartlead, Lemlist, Apollo, or whatever you're using this quarter. The dashboard is green. 95% delivered. Maybe 97%. Bounce rate looks healthy. Sender reputation is "warm." Everything is fine — except nobody replied, nobody booked a call, and pipeline is flat.

The dashboard is not lying. It is answering a different question from the one you think you asked. The word "delivered" in outreach-tool vocabulary means the receiving SMTP server returned a 2xx response. That's it. It does not mean the message reached a human inbox. It does not mean it was read. It does not even mean it was shown to the recipient's mail client.

What "delivered" actually means

SMTP is a two-party protocol between servers. When your outreach tool says 95% delivered, it means 95% of your messages were accepted by the next-hop server — usually the recipient's mail provider. After that handoff, everything interesting happens inside a black box your tool cannot see.
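
To make that concrete, here is a minimal sketch of the only exchange your tool can observe, using Python's standard smtplib. The host, addresses, and credentials are placeholders; the point is that a successful handoff is the entire evidence behind "delivered."

    import smtplib
    from email.message import EmailMessage

    # Placeholder addresses, host, and credentials, for illustration only.
    msg = EmailMessage()
    msg["From"] = "you@yourdomain.com"
    msg["To"] = "prospect@example.com"
    msg["Subject"] = "Quick question"
    msg.set_content("Hi, saw your launch last week...")

    with smtplib.SMTP("smtp.yourdomain.com", 587) as smtp:
        smtp.starttls()
        smtp.login("you@yourdomain.com", "app-password")
        refused = smtp.send_message(msg)

    # An empty dict means the next-hop server answered 2xx ("accepted") for
    # every recipient. That single response is all that "delivered" measures.
    # It says nothing about inbox vs. Promotions vs. Spam vs. a silent drop.
    print("delivered (SMTP accepted):", refused == {})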

Inside that black box, the message can land in any of these places:

  • Primary inbox (best case)
  • Promotions or Updates tab (Gmail)
  • Junk or Spam folder
  • Quarantine (enterprise tenants)
  • Silent rejection after acceptance — the message is accepted, then dropped
  • Rate-limit deferral that eventually times out

All six outcomes count as "delivered" in your dashboard. Only the first produces replies.

The hidden 60%

In our own testing across 500+ cold-outreach domains, the typical gap between "delivered" and "inbox" was 40 to 60 percentage points. A campaign reporting 95% delivered commonly placed 35–55% in the primary inbox at Gmail and Outlook. The remainder landed in Promotions, Junk, or the void.

That gap is the real failure rate, and it is invisible in every major outreach dashboard. Not because vendors are hiding it — they genuinely don't have the data. Gmail and Outlook don't tell senders which folder a message landed in. The only way to know is to seed the campaign: send to test mailboxes you control, then inspect each one.
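
A seed test is conceptually simple: you know the subject line you sent, so you log into each seed mailbox and look for it folder by folder. Here is a rough sketch using Python's imaplib, assuming seed accounts you control; the hosts, logins, and folder names are placeholders, and note that Gmail's Promotions and Updates tabs are labels rather than IMAP folders, so detecting them needs Gmail's own search extensions or API.

    import imaplib

    # Hypothetical seed accounts; a real seed network uses 20 or more of these.
    SEEDS = [
        {"host": "imap.gmail.com", "user": "seed1@gmail.com", "pw": "app-password"},
        {"host": "outlook.office365.com", "user": "seed2@outlook.com", "pw": "app-password"},
    ]
    # Folder names vary by provider; these are common ones.
    CANDIDATE_FOLDERS = ["INBOX", "[Gmail]/Spam", "Junk"]

    def find_placement(subject: str) -> None:
        for seed in SEEDS:
            imap = imaplib.IMAP4_SSL(seed["host"])
            imap.login(seed["user"], seed["pw"])
            placement = "not found in any checked folder"
            for folder in CANDIDATE_FOLDERS:
                status, _ = imap.select(f'"{folder}"', readonly=True)
                if status != "OK":
                    continue  # this provider does not expose that folder name
                status, data = imap.search(None, "SUBJECT", f'"{subject}"')
                if status == "OK" and data[0]:
                    placement = folder
                    break
            imap.logout()
            print(f'{seed["user"]}: {placement}')

    find_placement("Quick question about your onboarding")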

See where your mail actually lands

Inbox Check seeds your campaign to 20+ real mailboxes across Gmail, Outlook, Yahoo, Mail.ru, Yandex, GMX, and ProtonMail, then shows folder placement at each. If your "95% delivered" is actually 35% inboxed, you'll see it in two minutes. Free test, no signup.

Why the dashboard can't tell you

Outreach tools measure three signals: SMTP response, open pixel, click tracker. None of these tell you folder placement.

SMTP response

2xx means accepted, nothing more. Gmail returns the same 2xx whether it files the message to the inbox or to Spam, so the tool cannot tell the two apart.

Open pixel

If Apple's Mail Privacy Protection (MPP) prefetches the pixel automatically, you get an open event even for spam-foldered mail. If the recipient reads the message in Gmail with images turned off, you get no event for an inboxed mail. Neither case is diagnostic.
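
For context on why the signal is so weak: an open "event" is nothing more than a request hitting a tracking-pixel URL embedded in the message. A toy endpoint (hypothetical, standard library only) shows that the server has no way to tell a human open from an automated image prefetch:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class PixelHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Any fetch of the pixel URL gets counted as an "open", whether it
            # came from a human reading the message or from a privacy proxy
            # prefetching images for mail sitting unread in the Spam folder.
            print("open recorded:", self.path, self.headers.get("User-Agent", "unknown"))
            self.send_response(204)  # real trackers return a 1x1 transparent GIF
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8080), PixelHandler).serve_forever()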

Click tracker

Security scanners at enterprise tenants open every link in a sandbox before the message ever reaches a human, so you get click events from messages nobody saw. Meanwhile, spam-foldered messages produce no clicks because nobody sees them.

The sales math that breaks

Founders plan pipeline off the wrong number. Here is the chain:

  1. Send 1,000 messages.
  2. Dashboard reports 950 delivered (95%).
  3. Founder expects roughly 950 humans to see something.
  4. Reality: 400 land in primary inbox. 350 go to Promotions. 150 to spam. 50 dropped silently.
  5. Of the 400 inboxed, 240 are actually opened by a human (60% of the inboxable base).
  6. Of those 240, 3–5% reply: 7–12 replies.
  7. Founder sees 7–12 replies off 1,000 sent and concludes the copy is broken, or the market is dead.

The copy might be fine. The market might be alive. The campaign never reached half the people it was supposed to.
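
If you want to run the same chain on your own numbers, it is a few lines of arithmetic. The placement and open-rate figures below are the illustrative ones from the list above; substitute your own seed-test results.

    sent = 1_000
    delivered = 950          # what the dashboard reports (95%)

    inboxed = 400            # from a seed test, not from the dashboard
    promotions = 350
    spam = 150
    dropped = 50

    human_open_rate = 0.60   # human opens on the inboxed base only
    reply_rate = 0.04        # replies per human open, midpoint of 3-5%

    human_opens = inboxed * human_open_rate   # 240
    replies = human_opens * reply_rate        # roughly 10

    print(f"dashboard 'reach':  {delivered}")
    print(f"actually inboxed:   {inboxed} ({inboxed / sent:.0%} of sent)")
    print(f"expected replies:   {replies:.0f}")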

What to measure instead

Swap "delivery rate" for "inbox placement rate" as your top-of-funnel metric. Specifically:

  • Inbox placement at Gmail: target 70%+. Below 50% — you have a deliverability problem, not a copy problem.
  • Inbox placement at Outlook: target 60%+. Outlook is stricter; below 40% you are being filtered by SmartScreen.
  • Authentication alignment: SPF pass + DKIM pass + DMARC aligned. If any fails, fix before optimising copy.
  • Reply rate on the inboxable base: replies divided by estimated inboxed count, not total sent. This is your real copy metric.
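
To turn the placement and reply-rate bullets into something you can compute, here is a minimal sketch; the seed counts, send volume, and reply count are placeholders.

    def inbox_placement_rate(inboxed_seeds: int, total_seeds: int) -> float:
        # Share of seed mailboxes where the message landed in the primary inbox.
        return inboxed_seeds / total_seeds

    def reply_rate_on_inboxable(replies: int, total_sent: int, placement: float) -> float:
        # Replies divided by the estimated number of messages that reached an
        # inbox. Dividing by total sent instead would punish the copy for
        # deliverability failures it had nothing to do with.
        return replies / (total_sent * placement)

    placement = inbox_placement_rate(inboxed_seeds=7, total_seeds=20)
    print(f"inbox placement:        {placement:.0%}")
    print(f"reply rate (inboxable): {reply_rate_on_inboxable(10, 1_000, placement):.1%}")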

What the tool vendors will eventually do

The better outreach tools have started surfacing "inbox placement score" pulled from third-party seed networks. Treat those with appropriate skepticism — the vendor has an interest in your score looking good — but the direction is correct. A dashboard that only reports SMTP acceptance is leaving you blind to the main failure mode of cold outreach.

A 10-minute diagnosis

  1. Pull the last campaign's content.
  2. Run it through a seed test against Gmail, Outlook, Yahoo, a few regional providers.
  3. Compare inbox placement % to the "delivery rate" on your dashboard.
  4. If the gap is more than 20 points, your reply-rate problem is a deliverability problem dressed up as a copy problem.
  5. Fix deliverability first. Copy iterations on spam-foldered mail are a waste of time.
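
Step 5 usually starts with authentication. Before worrying about pass/fail alignment on live sends (that shows up in the Authentication-Results header of a seed-test message), check that the records exist at all. A sketch using the third-party dnspython package; the domain and DKIM selector are placeholders, and your sending provider publishes the real selector.

    import dns.resolver  # third-party package: pip install dnspython

    DOMAIN = "yourdomain.com"   # placeholder sending domain
    DKIM_SELECTOR = "s1"        # placeholder; your provider tells you the real one

    def txt_records(name: str) -> list[str]:
        try:
            return [r.to_text().strip('"') for r in dns.resolver.resolve(name, "TXT")]
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            return []

    spf = [r for r in txt_records(DOMAIN) if r.startswith("v=spf1")]
    dkim = txt_records(f"{DKIM_SELECTOR}._domainkey.{DOMAIN}")
    dmarc = [r for r in txt_records(f"_dmarc.{DOMAIN}") if r.startswith("v=DMARC1")]

    print("SPF record:  ", spf or "MISSING")
    print("DKIM record: ", dkim or "MISSING (check the selector)")
    print("DMARC record:", dmarc or "MISSING")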

FAQ

Is 95% delivered ever meaningful?

It is a necessary condition — if you're below 95% delivered you have a clear list-hygiene or blacklist problem. Above 95% delivered, the number stops being informative, because it can't distinguish inbox from spam folder.

Why don't Gmail and Outlook tell senders the folder?

Because telling senders would let spammers optimise against the filter. The opacity is a feature of the system, not a bug. Senders have to rely on seed mailboxes or postmaster tools (aggregate signals only) to infer placement.

My Apollo/Instantly/Lemlist dashboard shows a "spam rate." Isn't that the same thing?

No. Most tool-reported spam rates are bounce-style metrics (messages rejected at SMTP for policy reasons) or user-complaint rates from feedback loops. Neither measures inbox folder placement. They under-count spam-foldered mail by roughly an order of magnitude.

How often should I re-test inbox placement?

Before each new campaign, and weekly if you're running continuous outreach. Placement drifts as providers update filters, your sending domain ages, and your engagement signals shift. A campaign that placed 70% last month can place 40% this month without you changing anything on your side.

Check your deliverability across 20+ providers

Gmail, Outlook, Yahoo, Mail.ru, Yandex, GMX, ProtonMail and more. Real inbox screenshots, SPF/DKIM/DMARC, spam engine verdicts. Free, no signup.

Run Free Test →

Unlimited tests · 20+ seed mailboxes · Live results · No account required