Zombie Stats: The Made-Up Numbers Living in Your Slides

That "65% of people are visual learners" stat in your deck? It is made up. Here are 10 zombie statistics that have been circulating for years with no real source, and why AI tools are now spreading them faster than ever.

You've seen the slide. You've probably used the slide.

"Only 7% of communication is words. 38% is tone. 55% is body language."

It shows up in communication workshops, sales training decks, TEDx talks, LinkedIn carousels. It sounds credible because someone named Albert Mehrabian is attached to it, and he's a UCLA professor, so obviously he knows what he's talking about.

Here's the thing. Mehrabian's research was about one specific scenario: judging whether a single spoken word was liked or disliked when tone and expression contradicted each other. That's it. A single word. In a controlled lab setting.

He has literally gone on record saying that applying his percentages to general communication is a misinterpretation of his work. The man whose name is on this stat doesn't even agree with how it's being used.

But it lives on. Slide after slide after slide.

Welcome to the world of zombie statistics presentations.


What makes a stat a zombie?

A zombie stat isn't just wrong. Wrong is forgivable. A zombie stat is wrong, unsourced, emotionally compelling, and completely unkillable.

It survives because:

  1. It's a round number (55%, 90%, 7%, 65%)
  2. It confirms something people already believe
  3. Nobody in the room knows the source
  4. The person citing it got it from someone who got it from someone who screenshotted an infographic

And here's the kill shot: once a stat is in enough slide decks, it becomes self-referencing. Blog cites blog, which cites blog, which eventually cites the first blog. The paper trail forms a circle and points directly at nothing.

I've spent some time digging into this, and the pattern is depressingly consistent. Let me walk you through some of the ones you've almost certainly seen.


Debunking the Usual Suspects: Zombie Statistics in Presentations

"65% of people are visual learners"

Where it comes from: nobody knows. It likely originated from a self-reported preference survey, not cognitive science research. The whole concept of fixed "learning styles" (Visual, Auditory, Kinesthetic) was substantially debunked by the Association for Psychological Science in 2008. The 65% figure has no traceable peer-reviewed origin. It circulates almost entirely through marketing infographics, one citing the next.

"People remember 65% of information with visuals vs. 10% with text alone"

This one was traced by researcher Jonathan Schwabish back to a 1996 OSHA memo. An internal guidance document. Not a study. The numbers have since been copy-pasted across the internet by people who never found the memo and wouldn't know what to do with it if they did. (There IS real research showing visuals improve retention. Just not these specific numbers.)

"The average human attention span is 8 seconds, shorter than a goldfish"

TIME magazine published this in 2015. The source was a Microsoft Canada report, which was produced by Microsoft's advertising division. Not peer-reviewed research. And the goldfish comparison within that report had no citation at all. Literally no source for the goldfish number inside the document that became the source for the human number. Also: humans clearly sustain attention for hours. You're reading this right now. Binge-watching is a thing.

"Content marketing costs 62% less and generates 3x more leads"

This one comes from a Demand Metric infographic. When Brafton investigated in 2020, three of the four source links at the bottom of the infographic were dead (404 errors). The one that worked pointed back to Demand Metric's own resource center, which cited the same infographic. It's circular. The original methodology, sample size, and definition of "content marketing" are simply not available. This stat has been on every content marketing pitch deck for over a decade.

"90% of startups fail"

When you trace the source chain: Forbes article cites Startup Genome, which cites Small Biz Trends, which cites nothing. The actual US Bureau of Labor Statistics data shows around 20% of businesses fail in year one, roughly 45% by year five. The 90% figure may apply to a specific subset of VC-backed tech startups in specific conditions, but it gets applied universally to all startups everywhere, which is just not what the data shows.

"90% of restaurants fail in their first year"

This one is wild. According to an investigation by Revenue Memo, the figure appears to have originated from an American Express TV commercial in 2003. Not a study. An ad. Academic research tells a completely different story: a 2005 Cornell study found about 26% first-year failure. A 2014 Ohio State study found 17%. The number that has terrified every aspiring restaurateur for two decades came from an advertisement.

"The average person sees 5,000 ads per day"

The Yankelovich consultant behind this estimate was describing a rough count of brand impressions (logos on packaging, signage, clothing, etc.), not ads consciously processed. Researchers who have actually tried to measure this put the number of actively processed advertising messages in the low hundreds. The Drum's investigation in 2023 concluded the thousands-range figures were "conversational items" never intended as hard data.

"The Marketing Rule of 7"

The story goes that 1930s Hollywood studios proved through research that customers need to see your message seven times before buying. Nobody can identify the study. Nobody names the studio. Nobody cites the researchers. Multiple sources attribute it to "the 1930s" and hedge immediately with "according to some sources." It's passed down like oral tradition, except with worse accuracy. And depending on who you ask, the number is now 13, or 20, or 27, which is how you know it was never empirical to begin with.

"We only use 10% of our brain"

Neuroscience has definitively buried this one. Every brain region is active over the course of a day. Damage to any part of the brain, even small parts, causes measurable functional loss. The Einstein attribution is fabricated. His archive has no record of it. But the myth persists because it's a good story: it implies unlimited untapped potential, which is exactly what self-help content wants to sell you.


Why Zombie Statistics in Presentations Matter More Now

Here's where it gets worse.

We recently tested six AI-powered presentation tools to see how they handle facts and sources. The results were not great. You can read the full breakdown here, but the short version is: these tools hallucinate, they misattribute, and they confidently present wrong information with no disclaimers.

But here's the thing that surprised me most when digging into zombie stats: AI tools aren't always inventing new bad statistics. A lot of the time, they're amplifying the zombie stats that are already everywhere online.

These stats have been on thousands of blog posts and slide decks for years. They're in training data. They look authoritative because they're everywhere. When an AI tool generates a slide about communication, the Mehrabian rule is right there, ready to be inserted. When it builds a slide about marketing, the 62% content marketing stat shows up like clockwork. When it creates a slide about startups, the 90% failure rate appears with complete confidence.

AI accelerates the citation problem because it doesn't know the source is broken. It doesn't know the infographic had dead links. It doesn't know the study was an OSHA memo or an AmEx commercial. It just knows the number has appeared in a lot of places, so it must be true. This is part of a broader trust crisis in how we use AI-generated content.

Lazy citation culture has existed for decades. AI just found the afterburner button.


What you can actually do about it

I'm not going to pretend the solution is easy. These stats are everywhere, they sound right, and digging into source chains is genuinely tedious.

But a few things help:

Follow the citation to the actual source. Not the blog post that cited the blog post. The original study, report, or data. If you can't find it after two clicks, that's a red flag.

Be suspicious of round numbers. Real research rarely produces exactly 65% or exactly 90%. Real data is messier. When a stat is suspiciously clean, check it.

Watch for circular citations. If three sources cite each other and nobody cites primary research, the primary research probably doesn't exist.
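If you want to see why circular citations are so easy to detect once you map them out, here's a minimal sketch in Python. The citation graph and all the names are hypothetical, loosely modeled on the Demand Metric chain described earlier: follow each "cites" link, and if you ever revisit a source, the chain has no primary research at the bottom.

```python
# A minimal sketch of circular-citation detection. The "who cites whom"
# mapping and all source names here are hypothetical illustrations.

def find_primary_source(citations, start):
    """Follow a citation chain from `start`. Returns the terminal source
    (something that cites nothing further), or None if the chain loops
    back on itself, i.e. a circular citation."""
    seen = set()
    current = start
    while current in citations:
        if current in seen:
            return None  # cycle: blog cites blog cites the first blog
        seen.add(current)
        current = citations[current]
    return current  # cites nothing further: a candidate primary source

# The "62% content marketing" chain, modeled as data:
chain = {
    "pitch_deck": "marketing_blog",
    "marketing_blog": "demand_metric_infographic",
    "demand_metric_infographic": "demand_metric_resource_center",
    "demand_metric_resource_center": "demand_metric_infographic",  # loops!
}

print(find_primary_source(chain, "pitch_deck"))  # None: no primary source
```

A healthy chain terminates: `find_primary_source({"blog": "peer_reviewed_study"}, "blog")` returns the study. A zombie chain returns None, which is exactly what "the paper trail forms a circle and points at nothing" looks like as data.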

Use AI tools with healthy skepticism. If an AI presentation tool drops a statistic into your slide, treat it as a starting point, not a finish line. Go find the source before you stand up in front of a room and say it out loud.

Replace zombie stats with honest framing. Instead of "65% of people are visual learners," say "most people find visuals easier to process" and move on. You don't need a fake number to make a true point.


The bigger issue

Most people spreading zombie stats aren't trying to deceive anyone. They saw the number somewhere that looked reputable. They assumed someone upstream had done the checking. They used it.

That's the whole problem. Everyone assumes someone upstream checked it. Nobody checked it. The original upstream source was an ad, or an infographic with dead links, or a professor who has spent years correcting how his research is being used.

Every time a zombie stat shows up in an investor pitch, a keynote, a training deck, or a viral LinkedIn post, it erodes something quietly. Not dramatically. Just a little bit at a time.

The moment someone in your audience knows the stat is wrong, they stop listening to you quite as carefully. They start wondering what else you haven't verified. You lose the room in a way that's almost invisible, and almost impossible to recover.

Credibility in presentations is hard to build and easy to wreck. A made-up number from a TV commercial, spoken confidently from a stage, does the wrecking for free.

Check your slides. Find the actual study. If you can't find it, cut the stat or frame it honestly.

The graveyard is full. You don't have to keep raising the dead.


The line worth sharing:

"Everyone assumed someone upstream checked it. Nobody checked it. That's how a 2003 AmEx commercial became a business statistic that lives in PowerPoint decks to this day."


Neel Bhatt is the author of this piece. LayerProof is building presentation tools that help you create decks you can actually stand behind. Learn more about verifying AI data for presentations.

Want to get on the LayerProof waitlist early? Contact us.