Medialyst_

Founder Letter

The Medialyst Manifesto: Why Everyone Should Use AI in PR

Most AI in PR was built to spam harder. What if it was pointed in the opposite direction?

Elvis Sun, Founder at Medialyst

I know how that title sounds.

Every AI tool you've seen in PR was built to spam harder—"relentless follow-ups," fake expert quotes, journalists drowning in 300+ pitches a day.

The AI you've been exposed to was designed to increase volume, not quality.

But what if AI was pointed in the opposite direction? Not "send more pitches faster"—but "find the 10 journalists who'd actually care."

I spent the last eight years at Google building AI systems for millions of users. So instead of arguing theory, I ran two experiments to find out if this was even possible.

Experiment 1: IsMyPitchShit.com

At the start of 2026, I built a free tool called IsMyPitchShit.com. You paste in your pitch and the journalist you're targeting, and AI roasts you like a fed-up editor who's had enough.

I launched it on Reddit—where the industry's most skeptical professionals hang out.

You'd think people would hate yet another AI tool for PR.

Instead, they loved it. Because AI was finally giving the brutally honest feedback that PR people have needed for years—but that nobody had the guts to give.

Reddit post about IsMyPitchShit.com

But here's what actually mattered: after every roast, we asked users one question—"Was this accurate?"

828 pitches later:

  • 81% of pitches scored as "garbage" (59/100 or lower)
  • The average pitch score was 32/100
  • At scores of 80+, users agreed with the AI 100% of the time

Pitch score statistics

Pitch score distribution

The AI wasn't just being harsh. PR pros agreed with its judgment.

Here's the crazy part. If every PR team ran their pitches through this filter before hitting send, journalists would receive 81% fewer garbage pitches.

They wouldn't hate AI for that—they'd thank you for using it.

The same algorithm that roasted those pitches? It's what gives you the relevance score you see in Medialyst.

Experiment 2: The Coverage Prediction Test

Okay, anyone can build an AI that criticizes. The harder question: Can AI predict which journalists will actually care about a story?

I needed to prove this before writing a single line of Medialyst code.

So I took 5,000+ real PR campaigns from 2024-2025, using a list from PRinsider (the "pr campaign masterlist" under its resources section), and ran a blind test.

The list of 5,000+ PR campaigns across all industries

Here's what I did:

  1. Extract the coverage - find the actual article that ran
  2. Extract the journalist - who actually wrote the piece
  3. Simulate the pitch - have AI write a press release based on the coverage (what a PR person would have sent)
  4. Block the AI - prevent AI from seeing that specific coverage article, that URL, or finding it via Google News
  5. Limit the time window - AI can only analyze the journalist's previous work

The question: Given just the press release and the journalist's past articles, can AI predict this journalist would cover this story?
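Steps 4 and 5 above amount to a date-and-URL cutoff on the journalist's archive before the model sees anything. A minimal sketch of that blinding logic (all names and data here are hypothetical illustrations, not Medialyst's actual pipeline):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Article:
    url: str
    journalist: str
    published: date
    text: str

def build_blind_history(articles: list[Article], target: Article) -> list[Article]:
    """Return the journalist's prior work only: exclude the target
    coverage itself (by URL) and anything published on or after it."""
    return [
        a for a in articles
        if a.journalist == target.journalist
        and a.url != target.url
        and a.published < target.published
    ]

# Tiny hypothetical archive for one journalist
archive = [
    Article("https://example.com/ai-chips", "J. Doe", date(2024, 3, 1), "AI chip coverage"),
    Article("https://example.com/target",   "J. Doe", date(2024, 6, 1), "The covered story"),
    Article("https://example.com/later",    "J. Doe", date(2024, 9, 1), "A later piece"),
]
target = archive[1]
blind = build_blind_history(archive, target)
print([a.url for a in blind])  # only the pre-cutoff, non-target article
```

The prediction model then sees just the simulated press release plus this blinded history, never the coverage it is being asked to predict.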

One week of testing results across different models and configurations

After testing dozens of model configurations, the best performer:

  • 93.2% precision—when AI says "would cover it," it's right
  • 82.1% recall—catches 4 out of 5 real matches
  • Tested on 5,000+ campaigns from 2024-2025

(Why not 100%? There's always a tradeoff: catch every possible journalist and you risk including more who don't fit. We focus on giving you 10 great matches instead of 15 with 5 duds.)
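Precision and recall here follow the standard definitions: precision = true positives / (true positives + false positives), recall = true positives / (true positives + false negatives). A quick sketch of the arithmetic, using illustrative counts (not the actual test data):

```python
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Standard precision/recall from confusion-matrix counts."""
    precision = tp / (tp + fp)  # of everything flagged "would cover", how much was right
    recall = tp / (tp + fn)     # of all real matches, how many were caught
    return precision, recall

# Hypothetical counts chosen to mirror the reported figures:
# 1,000 true journalist-story matches; the model flags 880, of which 820 are correct.
p, r = precision_recall(tp=820, fp=60, fn=180)
print(round(p, 3), round(r, 2))  # 0.932 0.82
```

The tradeoff in the parenthetical is visible in these two formulas: flagging more journalists raises recall (fewer missed matches) but lowers precision (more duds in the list).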

What this proved:

AI can understand what journalists actually want to cover—not just what their bio says, but what they've demonstrated through their recent writing.

That coverage prediction AI eventually evolved into the Medialyst you see today.

The Real Problem

The problem isn't AI. It's how AI has been deployed.

Tools like Olivia Brown optimize for spam. Legacy databases like Cision and Muck Rack charge $10,000+/year for glorified phone books that don't tell you what a journalist wrote about this week. Neither is solving the actual problem.

We're building for a different world—where AI does 90% of the research and humans take it through the last mile:

  • Journalists get fewer, better pitches—and actually have time to respond
  • PR professionals stop burning hours on research and start focusing on relationships
  • Small voices finally get access to media—because AI closes the "craft gap" that used to require a $15K/month agency retainer
  • The metric that matters isn't send volume; it's journalist reply rate

That's the world we're building toward.


February 4, 2026

— Elvis

Signature