Ismat Samadov

Why Most Tech Blogs Are Dead (And How to Beat Them with SEO)

Tech publications have lost 58% of their Google traffic since 2024. Here's why the old model broke, and the SEO playbook that still works.

Tags: SEO, Blogging, Content Marketing, Web Development




© 2026 Ismat Samadov


The Verge lost 85% of its Google traffic. ZDNet lost 90%. Digital Trends lost 97%. These aren't random blogs run by hobbyists. These are publications with full editorial teams, millions of pages of content, and decades of domain authority. And Google just... stopped sending them readers.

Tech publications have collectively lost 58% of their Google traffic since 2024. Google traffic to publishers overall dropped by a third in 2025 alone. Fifty-nine percent of websites saw their traffic decline last year.

If you run a tech blog — or you're thinking about starting one — you're probably wondering if there's any point. The big publishers are bleeding out. AI is eating search results. Content mills are drowning in AI-generated slop.

Here's my take: most tech blogs deserve to die. They've been publishing the same rewritten press releases and surface-level tutorials for years. The ones that survive this shakeout will be smaller, more opinionated, and better at SEO than the incumbents ever were. And that's actually great news for individual developers who write from experience.

Let me show you exactly why the old model broke, what's replacing it, and how to build a blog that thrives in 2026's search reality.


The Numbers: How Bad Is It, Really?

Bad. But not uniformly bad. The data tells a more interesting story than "everything is dying."

Here's what happened to some of the biggest tech publications:

| Publication | Google Traffic Loss (Since 2024) | Source |
| --- | --- | --- |
| Digital Trends | 97% | Nieman Lab |
| ZDNet | 90% | Nieman Lab |
| The Verge | 85% | Nieman Lab |

Meanwhile, there are 600 million blogs worldwide pushing out 7.5 million posts every single day. The content marketing industry is projected to reach $107.5 billion by 2026. More content is being produced than ever before, yet the channels to distribute it are shrinking.

So where did all that traffic go? Three places.

1. AI Overviews ate the clicks. Zero-click searches grew from 56% to 69% between May 2024 and May 2025. When Google answers the question right on the search results page, nobody clicks through. Organic click-through rates plummeted 61% for queries where AI Overviews appear. The top organic link's CTR drops roughly 79% when there's an AI Overview sitting above it.

2. AI Overviews are everywhere now. They appeared on roughly 48% of tracked queries by February 2026. Nearly half of all searches. That's not a test anymore. That's the new default.

3. People started going straight to AI. ChatGPT, Perplexity, Claude — these aren't just novelty tools anymore. They're becoming primary research tools. Why read five blog posts about "how to set up ESLint" when you can ask an AI and get a working config in 10 seconds?

But here's the thing the doom-and-gloom articles miss: 77% of internet users still read blogs regularly. Marketers who prioritize blogging are 13x more likely to see positive ROI. B2B companies with blogs generate 67% more leads than those without.

The audience didn't disappear. The distribution changed. And that means the strategy has to change too.


Why the Big Publications Are Dying (And You're Not Them)

The Verge didn't lose 85% of its traffic because blogging is dead. It lost traffic because its content model — high volume, broad coverage, SEO-optimized for informational queries — is exactly the kind of content AI Overviews replace.

Think about what these publications publish: product roundups, spec comparisons, "what is X" explainers, news rewrites. Every one of those query types is now answered directly by Google's AI before the user ever scrolls to a blue link.

Large publications have another structural problem: they optimized for Google's old algorithm. Hundreds of writers producing thousands of articles per month, each targeting a specific keyword cluster. That worked when Google's primary function was pointing people to web pages. It doesn't work when Google's primary function is answering questions itself.

This is actually an advantage for small, independent blogs. Here's why:

You can write from real experience. Google's E-E-A-T framework (Experience, Expertise, Authoritativeness, Trustworthiness) now rewards smaller blogs with genuine lived experience over faceless corporate content. A staff writer at Digital Trends reviewing their 47th laptop this month can't compete on "experience" with an engineer who actually uses that laptop daily for development work. Google knows the difference now.

You can hold opinions. AI Overviews summarize consensus. They can't replicate a strong, well-argued opinion. When I write "PostgreSQL is the only database you need" and back it with production experience, that's content AI can't generate from averaging other sources. The big publications trained their writers to be neutral and objective. In 2026, that neutrality is a death sentence because it's exactly what AI already does.

You can go deep. A publication chasing pageviews needs broad appeal. You don't. You can write 4,000 words about a specific Drizzle ORM migration pattern because your readers are the exact people who need that information. That specificity is what AI Overviews struggle to replace.

You can move fast. No editorial approval process. No legal review. No waiting for the SEO team to approve your keyword targeting. You see a gap, you write about it, you publish it. By the time a publication's editorial calendar catches up, you've already been indexed.

You can build an email list. This is the insurance policy nobody talks about. When Google changes the algorithm (and they will, again), your email subscribers don't disappear. The big publications never invested in direct audience relationships because they didn't need to — Google sent them traffic for free. Now that the faucet is turning off, they have no fallback. You can build yours from day one.


The SEO Playbook That Works in 2026

Everything I'm about to describe, I run on this blog. This isn't theory. These are the techniques generating real organic traffic right now.

Write Content That AI Can't Summarize

The single most important shift in SEO strategy: stop writing content that can be reduced to a two-sentence answer.

Content that gets killed by AI Overviews:

  • "What is Docker?"
  • "JavaScript vs TypeScript comparison"
  • "How to center a div in CSS"

Content that survives:

  • "I Migrated 50 Docker Containers to Podman — Here's What Broke"
  • "Why I Switched Back to JavaScript After 2 Years of TypeScript"
  • "The CSS Layout Strategy I Use on Every Project (With Code)"

The difference is experience. The first category answers factual questions. AI does that better than any blog post. The second category shares specific, lived experience that AI can't fabricate. Every piece you publish should include something only you know from doing the work.

The data backs this up: the average page ranking #1 on Google has 1,447 words. Content over 3,000 words gets significantly more backlinks. Long-form content between 1,000 and 2,000 words gets 56% more social shares. Depth wins. Surface-level answers lose.

Get Your Technical SEO Right

I run this blog on Next.js. Here's the exact metadata pattern I use for every article:

// src/app/blog/[slug]/page.tsx
import type { Metadata } from 'next'

interface Props {
  params: Promise<{ slug: string }>
}

export async function generateMetadata({ params }: Props): Promise<Metadata> {
  const { slug } = await params
  // getPostBySlug is this blog's own data-access helper (not shown here)
  const post = await getPostBySlug(slug)

  if (!post) return { title: 'Not Found' }

  return {
    title: post.seoTitle || post.title,
    description: post.seoDescription || post.excerpt,
    openGraph: {
      title: post.seoTitle || post.title,
      description: post.seoDescription || post.excerpt,
      type: 'article',
      publishedTime: post.publishedAt?.toISOString(),
      authors: ['Ismat Samadov'],
      tags: post.tags,
    },
    twitter: {
      card: 'summary_large_image',
      title: post.seoTitle || post.title,
      description: post.seoDescription || post.excerpt,
    },
    alternates: {
      canonical: post.canonicalUrl
        || `https://ismatsamadov.com/blog/${post.slug}`,
    },
  }
}

Every page needs a unique title tag, meta description, canonical URL, and Open Graph data. Next.js makes this dead simple with the generateMetadata function. If you're using a different framework, the specifics change but the requirements don't.

Add Structured Data (JSON-LD)

Schema markup increases CTR by 20-40% through rich results. Rich results are those enhanced search listings with star ratings, FAQ dropdowns, breadcrumbs, and author information. They take up more visual space in search results and get more clicks. Google recommends JSON-LD as the preferred format.

Here's the JSON-LD I add to every blog post:

// components/ArticleJsonLd.tsx
// `Post` is this blog's own content type (title, excerpt, dates, slug, etc.)
export function ArticleJsonLd({ post }: { post: Post }) {
  const jsonLd = {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline: post.title,
    description: post.excerpt,
    author: {
      '@type': 'Person',
      name: 'Ismat Samadov',
      url: 'https://ismatsamadov.com',
    },
    datePublished: post.publishedAt,
    dateModified: post.updatedAt || post.publishedAt,
    publisher: {
      '@type': 'Person',
      name: 'Ismat Samadov',
    },
    mainEntityOfPage: {
      '@type': 'WebPage',
      '@id': `https://ismatsamadov.com/blog/${post.slug}`,
    },
    wordCount: post.content.split(/\s+/).length,
    timeRequired: `PT${post.readingTime}M`,
  }

  return (
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
    />
  )
}
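The `timeRequired` field above expects a `readingTime` number on the post. If you don't already store one, a minimal helper might look like this; the 200 words-per-minute figure is a common convention, not something measured for this article:

```typescript
// Estimate reading time in whole minutes from raw post content.
// 200 wpm is a rough average adult reading speed; tune to taste.
export function readingTimeMinutes(content: string, wpm = 200): number {
  const words = content.trim().split(/\s+/).length
  return Math.max(1, Math.round(words / wpm))
}
```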

And here's an FAQ schema for posts that answer specific questions:

// components/FaqJsonLd.tsx
interface FaqItem {
  question: string
  answer: string
}

export function FaqJsonLd({ items }: { items: FaqItem[] }) {
  const jsonLd = {
    '@context': 'https://schema.org',
    '@type': 'FAQPage',
    mainEntity: items.map((item) => ({
      '@type': 'Question',
      name: item.question,
      acceptedAnswer: {
        '@type': 'Answer',
        text: item.answer,
      },
    })),
  }

  return (
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
    />
  )
}

Generate a Proper Sitemap

A sitemap tells search engines exactly what pages exist and when they were last updated. Here's the sitemap.ts I use with Next.js App Router:

// src/app/sitemap.ts
import type { MetadataRoute } from 'next'

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  // getAllPublishedPosts is this blog's own data-access helper (not shown here)
  const posts = await getAllPublishedPosts()

  const blogEntries = posts.map((post) => ({
    url: `https://ismatsamadov.com/blog/${post.slug}`,
    lastModified: post.updatedAt || post.publishedAt,
    changeFrequency: 'monthly' as const,
    priority: 0.8,
  }))

  const staticPages = [
    { url: 'https://ismatsamadov.com', priority: 1.0 },
    { url: 'https://ismatsamadov.com/blog', priority: 0.9 },
    { url: 'https://ismatsamadov.com/about', priority: 0.7 },
  ].map((page) => ({
    ...page,
    lastModified: new Date(),
    changeFrequency: 'weekly' as const,
  }))

  return [...staticPages, ...blogEntries]
}

No XML files to maintain manually. No plugins. Next.js generates it from your data at build time (or on demand, depending on how the route is rendered). Submit this URL to Google Search Console once and forget about it.

robots.txt: Don't Forget the Basics

While you're handling sitemaps, make sure your robots.txt isn't blocking anything important. I've seen developers accidentally block their blog directory or their API routes. Here's a minimal robots.txt for a Next.js blog:

# public/robots.txt
User-agent: *
Allow: /

Sitemap: https://ismatsamadov.com/sitemap.xml

If you're using the App Router, you can also generate this dynamically with a robots.ts file, but for most blogs a static file works fine. The key is making sure your sitemap URL is declared here so crawlers find it automatically.
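For reference, the dynamic version is a sketch like the following. The `MetadataRoute.Robots` return-type annotation from `next` is left off so the snippet stands alone; in a real project you'd add it back:

```typescript
// src/app/robots.ts: App Router alternative to a static public/robots.txt.
// In a real Next.js project, annotate the return type as MetadataRoute.Robots.
export default function robots() {
  return {
    rules: [{ userAgent: '*', allow: '/' }],
    sitemap: 'https://ismatsamadov.com/sitemap.xml',
  }
}
```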

Target AI Search Engines, Not Just Google

Here's a strategy almost nobody talks about yet: optimizing for AI search engines. ChatGPT with browsing, Perplexity, and other AI tools are becoming meaningful traffic sources. When someone asks Perplexity "what's the best way to set up a Next.js blog in 2026," it pulls from indexed web pages and cites them with links.

How to get cited by AI search:

  • Write clear, factual claims with sources. AI search engines prefer content they can verify and cite confidently.
  • Use descriptive headings. AI parses document structure heavily. "Step 3: Configure PostgreSQL Connection Pooling" is better than "The Database Part."
  • Include comparison tables. AI search engines love structured data they can reference directly.
  • Publish original data. If you benchmark something, measure something, or survey people, AI tools will cite your original findings.
  • Be the primary source. Don't summarize other people's content. Create the content others reference.

And here's a stat that makes this tangible: brands cited in AI Overviews earn 35% more organic clicks than those that aren't. Being the source AI quotes is the new SEO gold standard.


Your Blog vs. The Platforms: Where Should You Actually Write?

This is a question I get asked constantly: "Should I write on Medium? Dev.to? Hashnode? Or my own blog?"

The answer depends on what you want. Here's the honest comparison:

| Factor | Personal Blog (Next.js/WordPress) | Medium | Dev.to |
| --- | --- | --- | --- |
| SEO control | Full (title tags, meta, schema, sitemap) | None (can't customize title tags or meta descriptions) | Limited |
| Long-term organic traffic | 2.5x more than platforms | Declining (Medium's domain authority helps, but you don't own rankings) | Good for immediate reach |
| Audience ownership | You own the email list, the domain, the data | Medium owns the audience | Dev.to owns the audience |
| Monetization | Full control | Medium Partner Program (declining payouts) | None built-in |
| Setup effort | Hours to days | Minutes | Minutes |
| Customization | Unlimited | Almost none | Limited |

WordPress sites see 2.5x more organic traffic long-term compared to platform-hosted content. The reason is simple: you control every SEO lever. Title tags, meta descriptions, schema markup, internal linking, sitemap structure, canonical URLs, page speed — all of it.

Medium is the worst option for SEO because you can't customize title tags or meta descriptions. Dev.to is better for immediate reach within the developer community, but you're building on rented land. If Dev.to changes their algorithm or shuts down, your content goes with it.

My recommendation: own your blog, cross-post to platforms. Write the canonical version on your domain first, wait for Google to index it (usually 24-48 hours), then cross-post to Dev.to and Medium with a canonical URL pointing back to your site. You get the platform audience and the SEO benefits. Best of both worlds.
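On Dev.to, for example, the canonical URL goes in the post's front matter (the slug below is illustrative):

```yaml
---
title: "Why Most Tech Blogs Are Dead (And How to Beat Them with SEO)"
published: true
canonical_url: https://ismatsamadov.com/blog/why-most-tech-blogs-are-dead
---
```

With that in place, search engines credit your domain for the content even when the Dev.to copy gets read more.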

One more thing about platforms: Substack is interesting but has the same fundamental problem as Medium — you don't control the domain or the SEO. The audience you build lives on their infrastructure. If Substack changes monetization rules, raises prices, or goes under, you lose everything. An email list on your own domain with something like Buttondown or ConvertKit gives you the same newsletter functionality without the platform risk.


The Mistakes I See Developers Make

Before I lay out the content strategy, let me save you from the pitfalls I see constantly.

Publishing without a plan. A random blog post every few months does nothing. Google rewards consistency. One post per week for six months beats thirty posts in one week followed by silence.

Writing for other developers when you should write for searchers. Your Twitter mutuals already know what Kubernetes is. The person Googling "why is my Kubernetes pod stuck in CrashLoopBackOff" doesn't. Write for the second person. That's where the search traffic is.

Ignoring internal linking. Every new post should link to 3-5 existing posts on your blog. This distributes page authority across your site and helps Google understand your content structure. It's free. It takes 2 minutes per post. Almost nobody does it.
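If you want to enforce that habit, a quick check is easy to script. A sketch, assuming markdown content and this site's `/blog/` URL scheme (adjust both for your own setup):

```typescript
// Count links pointing at this blog's own posts inside a markdown body.
// Matches both relative (/blog/...) and absolute forms of the link target.
const INTERNAL_LINK = /\]\(((?:https?:\/\/ismatsamadov\.com)?\/blog\/[^)]+)\)/g

export function internalLinkCount(markdown: string): number {
  return [...markdown.matchAll(INTERNAL_LINK)].length
}
```

Run it over a draft before publishing; anything below three is a reminder to go add links.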

Skipping structured data. I showed you the JSON-LD code earlier. Most developer blogs don't add any structured data at all. That means they're leaving 20-40% CTR improvement on the table. Rich results (star ratings, FAQ dropdowns, author info) take up more visual space in search results. More space means more clicks.

Optimizing for the wrong keywords. Don't target "JavaScript tutorial" — you'll never outrank MDN or freeCodeCamp. Target "how I debugged a memory leak in my Node.js production server." Long-tail, experience-based queries are where small blogs win. The search volume is lower per query, but the competition is near zero and the intent is high.

Not tracking anything. You need Google Search Console (free) and some form of analytics. Without data, you're guessing. Search Console tells you which queries bring traffic, which pages are indexed, and where you rank. That data drives every content decision.


The Content Strategy That Actually Compounds

Posting randomly doesn't build traffic. Here's the system I follow:

Week 1-2: Keyword research from the source. I don't use expensive SEO tools. Google Search Console shows me exactly what queries people use to find my existing content. I look for queries where I rank positions 8-20 — that's "striking distance." A content update or a new targeted post can push those to page 1.
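That filter is easy to automate. A sketch, assuming you've exported the Queries report from Search Console and parsed it into rows (the field names here are my own, not GSC's exact CSV headers):

```typescript
// One row of a parsed Search Console "Queries" performance export.
interface QueryRow {
  query: string
  clicks: number
  impressions: number
  position: number // average position in search results
}

// Keep queries sitting just off page 1, highest-impression first;
// these are the cheapest wins for an update or a targeted post.
export function strikingDistance(rows: QueryRow[]): QueryRow[] {
  return rows
    .filter((r) => r.position >= 8 && r.position <= 20)
    .sort((a, b) => b.impressions - a.impressions)
}
```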

Week 3-4: Write one long-form post targeting a striking-distance keyword. Not a 500-word quickie. A 2,000-4,000 word piece with original code, comparison tables, real data, and a strong opinion. The kind of post that takes 3-4 hours to write but earns traffic for years.

Ongoing: One post per week, minimum. Consistency matters more than volume. Each new post internally links to 3-5 existing posts, spreading authority across the site.

Monthly: Update the top 5 performing posts. Add fresh data. Fix broken links. Expand sections that could go deeper. Google rewards freshness — an updated post often jumps in rankings within days.

Here's a realistic timeline for what to expect:

| Month | What Happens |
| --- | --- |
| 1-2 | Almost nothing. Google is indexing your content. Traffic is near zero. |
| 3-4 | First trickle of organic traffic. A few posts start appearing on page 2. |
| 5-6 | Some posts reach page 1. Traffic starts compounding. |
| 7-9 | The flywheel is turning. Each new post gets indexed faster and ranks quicker. |
| 10-12 | Meaningful traffic. Some posts generate hundreds of visits per month. |

The mistake most people make: they quit in month 2. They publish 10 posts, see no traffic, and conclude blogging is dead. It's not dead. It's slow. Those are different things.


The Blogs That Will Survive (Pattern Recognition)

After watching this shakeout play out over the past two years, I see clear patterns in which blogs survive and which don't.

Survivors share these traits:

  • First-person experience ("I built this, here's what happened")
  • Strong technical depth (code that actually works, not pseudocode)
  • Opinionated takes backed by data
  • Consistent publishing schedule (weekly or biweekly)
  • Proper technical SEO (structured data, fast load times, mobile-optimized)
  • Email lists (owning the audience, not renting it from Google)

The dead blogs share these traits:

  • Rewritten documentation (AI does this better)
  • News aggregation without analysis (Google News and AI do this better)
  • Listicles with no original insight ("Top 10 JavaScript Frameworks")
  • No structured data, slow load times, no mobile optimization
  • Published sporadically with no strategy

The survivors aren't necessarily the ones with the biggest audiences. They're the ones with the most specific audiences. A blog that 500 Rust developers read every week is worth more than a blog that 50,000 random people glance at once.


What I Actually Think

The death of tech blogs is the best thing that could happen to tech blogging.

For the past decade, the tech blogosphere has been dominated by content farms optimizing for Google's algorithm. They published thousands of articles per month, each targeting a specific keyword, each written to hit a word count, each as generic and inoffensive as possible. They weren't writing for humans. They were writing for crawlers.

Google's AI Overviews killed that model. Good. It deserved to die.

What's replacing it is better: individual voices with real expertise, writing about things they actually do, for audiences that actually care. The economics changed. When Google sent millions of clicks to anyone who could stuff keywords into 2,000 words, there was no incentive to be good. Now there is. The only content that survives AI Overviews is content that's genuinely better than what AI can generate from its training data.

That bar is higher than most writers realize. AI can write a perfectly adequate "Getting Started with Docker" tutorial. It can't write "I Ran Docker in Production for 3 Years and Here Are the 11 Things That Broke." The first is information. The second is knowledge. Information is free now. Knowledge still has value.

My prediction: by 2027, the tech blogs that matter will be small — run by one or two people who work in the field. They'll publish less frequently but with more depth. They'll make money from consulting, courses, or sponsorships rather than ad impressions. And they'll get more organic traffic than the publications they replaced, because Google's algorithm now genuinely rewards quality over quantity.

If you're thinking about starting a tech blog in 2026, the window is actually wide open. The incumbents are collapsing. AI Overviews killed the content farms. The field is clearing. But you have to play the new game: real experience, technical depth, proper SEO, and patience.

The blogs aren't dead. The lazy ones are. Don't be lazy.


Sources

  1. Tech Media Collapse: 58% Traffic Loss — Growtika
  2. Traffic to Top Tech Publications Has Plummeted — Nieman Lab
  3. Google Traffic Down 2025: Trends Report — Press Gazette
  4. 2025 Organic Traffic Crisis Analysis Report — Digital Bloom
  5. Google AI Overviews: The End of Traditional CTR — Dataslayer
  6. AI Overviews Surge Across 9 Industries — ALM Corp
  7. Number of Blogs Worldwide — Blog Herald
  8. Content Marketing Statistics 2026 — Typeface
  9. Blogging Statistics 2026 — OptinMonster
  10. Business Blogging Statistics — DemandSage
  11. SEO Trends 2026: EEAT and Personal Blogs — Marketer Milk
  12. Schema Markup Detailed Guide 2026 — ALM Corp
  13. Content Length and SEO Rankings — SEO.co
  14. Structured Data and JSON-LD Implementation Guide — Cosmic
  15. Personal Blog vs Medium: Where to Write — High Approach