Is Javascript Bad For SEO?

Ian Lurie

Does a bear poop in the woods? With javascript and SEO, the answer is just as clear, if a little more complicated.

Javascript-driven sites aren’t bad for indexation. Google can crawl a site that populates content client-side.

Javascript-driven client-side content is bad for SEO. Javascript-driven sites make Google work harder. At the very least, Google renders them more slowly. In the SERPs, that’s a competitive disadvantage.

To demonstrate (Ian rubs his hands together) I get to use quadrant diagrams.

If you already know how javascript works, what client-side rendering is, and how Google handles client-side rendering, print this, stick it to your forehead, and move on:

When Javascript Is Bad For SEO

The javascript/SEO quadrant

For us mere mortals, here’s a fuller explanation:

Two Types Of Javascript

There are two ways client-side javascript—javascript executed by a web browser—can interact with web content:

UI enhancement changes the way the browser interacts with content rendered on the server. Examples include tabbed content, drop-down navigation, and (sigh) carousels.

Client-side rendering delivers pages and content separately. Your web browser uses javascript to merge the two.

This post talks about client-side javascript rendering and why it’s bad for SEO.

Client vs. Server-Side

Every web page is an interaction between a client (a browser, like Chrome, or a bot, like Google) and a server.

Generating a web page involves three steps:

  1. Fetch the page template (the layout)
  2. Fetch the content
  3. Merge the content with the template

Server- and client-side rendering perform these three steps differently.

Server-side rendering does all three steps on the server, then sends the result to the client. The client has everything it needs, renders the full page, and goes on its merry way.

Client-side rendering uses javascript to split the labor: The server sends the template to the browser or bot, then sends the content separately. The browser uses javascript to merge the content and the template. Client-side rendering has advantages: It’s speedy (if you do it right). It’s a great way to build interactive applications. But it requires more work by the client.
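To make the three steps concrete, here’s a minimal, framework-free sketch of the merge step that a client-side rendered site pushes onto the browser. The `{{placeholder}}` template syntax is invented for this illustration, not any particular library’s format:

```javascript
// Step 3 (merge) done client-side: the browser has fetched a template
// and a content payload separately, and javascript combines them.
// The {{key}} placeholder syntax is an assumption for this sketch.
function mergeTemplate(template, content) {
  // Swap each {{key}} for the matching content value; leave unknown keys alone.
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in content ? String(content[key]) : match
  );
}

// What the browser would do after its two fetches:
const template = "<h1>{{title}}</h1><p>{{body}}</p>";
const content = { title: "Is Javascript Bad For SEO?", body: "It depends." };
const page = mergeTemplate(template, content);
// page === "<h1>Is Javascript Bad For SEO?</h1><p>It depends.</p>"
```

A server-side rendered site runs this same merge before anything leaves the server; the client just receives the finished HTML.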

Here’s our quadrant diagram so far:

Server And Client Rendering

Server and client rendering

Static Content vs. Dynamic Interface

Some pages are just stuff: Words and pictures and links and buttons. Clicking those links and buttons sends me to another page or displays a form. They don’t profoundly modify the page itself. That’s static content, and it’s what you browse 90% of the time: Articles, product pages, blog posts, news, etc.

Other pages change a lot depending on my actions: A text editor, a multi-faceted search, or a page where content continually updates. A page like this is a dynamic interface. The Portent Title Generator, built by some incredible agency (cough) is an example:

The Title Generator Uses Javascript for a Dynamic Interface

A dynamic interface using javascript

Hopefully, your SEO strategy doesn’t hinge on dynamic content. If you’re going to succeed in SEO, you need to get your static content indexed and optimized.

Static vs. dynamic is the next part of the quadrant diagram:

Static, dynamic, client- and server-side

Static, dynamic, client- and server-side

When you combine static/dynamic and server-side/client-side, you get a feel for where and how javascript can make SEO more difficult.

When Javascript Hurts SEO

Javascript is terrible for SEO when you use client-side rendering for static content:

When Javascript Is Bad For SEO

The javascript/SEO quadrant

Here’s why:

Static content is what you need indexed. If you can’t get a key product into the rankings, if your blog post is invisible, you’re hosed. Fortunately, Google crawls and indexes javascript-driven static content. All good.

You also need static content optimized: You need higher rankings, and that content is how you’ll get there. The trouble starts here. Google uses two-stage rendering on javascript-powered websites: It crawls the site now, renders content later. Here’s how Google’s engineers put it:

“The rendering of JavaScript powered websites in Google Search is deferred until Googlebot has resources available to process that content.”

That’s in Google’s own words at Google I/O 2018. Check the video at 14:11.

Two takeaways:

  • Google needs extra resources to fully crawl, render and index javascript-powered, client-side rendered pages
  • Google felt it necessary to point out that fact

Client-side rendering doesn’t hurt indexation. It hurts SEO. There’s a difference. As I said, Google can crawl javascript content, and it does. But two-step rendering puts client-side content at a competitive disadvantage. All these quadrant diagrams are making me giddy:

Indexing vs. SEO and javascript

Indexing vs. SEO

If you’re doing SEO, you can’t afford to end up in the bottom-right box.

If you must use client-side rendering on static content, here are two ways to reduce the damage:

Mitigation

If you must use javascript, mitigate it using prerendering or hybrid rendering.

Prerendering and user-agent detection

Prerendering works like this:

  1. Render a server-side version of each page on your site
  2. Store that
  3. When a client visits, check the user agent
  4. If the client is a search bot, deliver the prerendered content instead of the javascript-rendered content

The logic is licking-your-own-eyeball-from-the-inside tortured: If you can deliver prerendered content, why not just do that from the start? But, if you must, try Puppeteer to do prerendering, or a service like prerender.io, which does all the work for you.
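The user-agent check in step 3 can be sketched in a few lines. The bot pattern below is illustrative, not exhaustive (services like prerender.io maintain the real list for you), and the handler name is hypothetical:

```javascript
// Step 3: decide, from the user agent, whether to serve the stored
// prerendered HTML or the normal javascript-rendered page.
// This bot pattern is an assumption; real lists are longer.
const BOT_PATTERN = /googlebot|bingbot|yandexbot|duckduckbot|baiduspider/i;

function isSearchBot(userAgent) {
  return BOT_PATTERN.test(userAgent || "");
}

// A hypothetical request handler would branch on the result:
function chooseResponse(userAgent) {
  return isSearchBot(userAgent) ? "serve-prerendered-html" : "serve-js-app";
}
```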

Hybrid rendering

Hybrid rendering generates the first page/content server-side, then delivers remaining content client-side. Sort of. Most javascript frameworks, such as Angular, support this. I think.

If you search for “hybrid rendering,” you’ll find seven million pages, each with a slightly different definition of “hybrid rendering.” For our purposes, assume it means “Deliver the most important content, then the other stuff.”

For example, you could use it for filtering. Coursera lets you filter courses without javascript:

But the interface gets speedier, and the results richer, if your browser supports javascript:

That’s not the best example. TRUST ME that hybrid rendering mixes javascript-driven and static content, delivering static content first.
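The “most important content first” idea can be sketched as a split: render the first page of results to plain HTML on the server, and ship the rest as a payload for javascript to render later. The item shape and page size here are assumptions for illustration:

```javascript
// Hybrid rendering sketch: crawlable HTML for the first page of items,
// plus a JSON payload for javascript to render the rest client-side.
function hybridSplit(items, pageSize) {
  const firstPage = items.slice(0, pageSize);
  const rest = items.slice(pageSize);
  return {
    // Server-rendered HTML: indexable without executing javascript.
    serverHtml: firstPage.map((item) => `<li>${item.title}</li>`).join(""),
    // Client payload: merged into the page later by javascript.
    clientPayload: JSON.stringify(rest),
  };
}

const courses = [
  { title: "SEO Basics" },
  { title: "Javascript 101" },
  { title: "Faceted Search" },
];
const { serverHtml, clientPayload } = hybridSplit(courses, 2);
// serverHtml === "<li>SEO Basics</li><li>Javascript 101</li>"
// clientPayload === '[{"title":"Faceted Search"}]'
```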

When To Use Which

For static content, use server-side rendering or, if you must, prerendering. If you want to optimize content that’s in a dynamic interface (like Coursera’s course list), use hybrid rendering.

ONE LAST QUADRANT DIAGRAM:

Why Mitigation Sucks

My rule: If Google gives you ways to mitigate a thing, don’t do that thing at all.

You know your doctor can set a bone. That doesn’t mean you go out of your way to break your leg for giggles.

Google can handle javascript-driven sites. That doesn’t mean you go out of your way to render content using javascript.

If nothing else, remember that Google changes their mind.

But I am not a javascript hater. In some cases, javascript-driven pages make a ton of sense.

When You Should Use Javascript Rendering

Build a client-side javascript-driven website when interactivity is more important than rankings. Apps and app-like websites, aggregators, and filters require client-side javascript rendering. Then use hybrid rendering to deliver critical content to Google.

When You Shouldn’t Use Javascript Rendering

Don’t use javascript for static content. If content isn’t interactive—a basic product page, a blog post, news articles, and any other content that doesn’t have to instantly respond to user input—it doesn’t need client-side javascript.

That doesn’t apply to carousels and similar UI widgets. That’s UI enhancement, not content delivery. Done right, it’s perfectly OK.

Testing

This will bunch up the undergarments of many SEOs, developers, search scientists, and engineers: Don’t test.

Tests make you feel better. They show you that Google can indeed render the content. Great! Hooray for you!

No. Boo for you! Because testing verifies indexing and rendering. It does not verify that you’re competitive.

If you’re using client-side javascript to deliver static content you’ve failed the test. Stop. Change it.

Ask Yourself Why

There are two lessons here:

  1. Javascript can be bad for SEO
  2. There’s a difference between SEO and indexation

If you want to compete in the rankings, don’t use client-side rendering to deliver static content, or any content for which you want to rank. Use javascript to drive app-like experiences. When you’re considering using javascript to deliver content, do a very honest assessment of the pluses and minuses.

Then remember this handy quadrant diagram. I put a lot of time into this:

When Javascript Is Bad For SEO

The javascript/SEO quadrant

Ian Lurie
Founder

Ian Lurie is the founder of Portent. He's been a digital marketer since the days of AOL and Compuserve (that's more than 25 years, if you're counting). Ian's recorded training for Lynda.com, writes regularly for the Portent Blog and has been published on AllThingsD, Smashing Magazine, and TechCrunch. Ian speaks at conferences around the world, including SearchLove, MozCon, Seattle Interactive Conference and ad:Tech. He has published several books about business and marketing: One Trick Ponies Get Shot, available on Kindle, The Web Marketing All-In-One Desk Reference for Dummies, and Conversation Marketing. Ian is now an independent consultant and continues to work with the Portent team, training the agency group on all things digital. You can find him at www.ianlurie.com


Comments

  1. Is there a tool that can detect whether client-side or server-side is being used to deliver content? Or something I could look for in the source code?

