SEO from Behind the Paywall

How to get Google to index content behind your paywall

The question of optimizing content that sits behind a paywall has never played a big role in SEO practice: most clients are businesses that trade on things other than their content, and so have no incentive to charge for access to it. However, when one of my developers came to me this morning asking how to get Google's spiders past a paywall, my initial reaction was "if you want it indexed, don't put up a paywall". That's a fairly good answer, but I knew it wasn't 100% true, and it got me thinking.

Reasons to have a paywall

Paywalls are being used on more and more traditional publishers' websites. The Wall Street Journal was one of the first sites to have one, and Harper's has never given its content away for free, preferring instead to "preserve the value of its content". The goal, of course, is to monetize the publisher's online presence. The problem is that a paywall not only severely limits visibility in Google's search results, it also makes the content difficult to share on social media, which cripples new reader acquisition. That said, neither the Wall Street Journal nor Harper's is hurting because of it: the former posted one of the only circulation increases among newspapers in 2013, with a big proportion of readers coming from online.

However, not everyone is the Wall Street Journal, so if you monetize you still need some kind of plan for new reader acquisition, and that plan involves people finding your articles through Google.

How to get Google's spiders past paywalls

Seeing this as a problem, Google worked with publishers to develop a workaround called "First Click Free" (FCF), which allows restricted content to appear in the search index but only lets searchers read the article they clicked through to before being prompted to create an account (and pay). And since Google's spiders can't complete registration forms, your web server has to be configured not to serve the registration page to the spiders, so that they can crawl the full articles.
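As a rough illustration, here is a minimal sketch of how that server-side check might look, assuming a small Flask app. The route, helper names, and user-agent/referrer tests are my own placeholders, not part of any official FCF implementation, and a production setup would verify Googlebot via reverse DNS rather than trusting the header.

```python
# Minimal sketch of a First Click Free check, assuming a small Flask app.
# The crawler and referrer tests are illustrative only.
from flask import Flask, request

app = Flask(__name__)

def is_googlebot(user_agent: str) -> bool:
    # Naive user-agent check so the spider never sees the registration page.
    return "Googlebot" in (user_agent or "")

def came_from_google(referrer: str) -> bool:
    # First Click Free: a reader arriving from a Google results page gets
    # the article they searched for without hitting the paywall first.
    return "google." in (referrer or "")

def render_full_article(slug: str) -> str:
    return f"<article>Full text of {slug}</article>"    # placeholder article body

def render_paywall(slug: str) -> str:
    return f"<p>Subscribe to keep reading {slug}.</p>"   # registration/pay prompt

@app.route("/articles/<slug>")
def article(slug):
    user_agent = request.headers.get("User-Agent", "")
    referrer = request.headers.get("Referer", "")
    if is_googlebot(user_agent) or came_from_google(referrer):
        return render_full_article(slug)
    return render_paywall(slug)
```

The design point is simply that the spider and the reader's first click from Google both land on the full article, while direct or repeat visits hit the paywall.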

If, for whatever reason, FCF isn't workable on your website, you can make snippets available to Google News (at least) and have Google display a "subscription" tag next to your publication's name in the SERPs. However, if you display the 90-word snippet to readers while serving the whole article to Google's spiders, you will be in violation of Google's webmaster guidelines and penalized accordingly. So if you choose this route, Google will only be able to index your snippet: make it good.
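Keeping crawlers and readers on the same footing is simple enough. Below is a tiny illustrative helper; the function name and the 90-word cutoff echo the snippet length mentioned above, but nothing here is an official Google requirement.

```python
# Illustrative snippet-only helper: crawlers and readers receive the same
# truncated excerpt, so nothing is cloaked and Google indexes the snippet.
def article_snippet(full_text: str, max_words: int = 90) -> str:
    words = full_text.split()
    if len(words) <= max_words:
        return full_text
    return " ".join(words[:max_words]) + "…"

if __name__ == "__main__":
    body = "lorem ipsum " * 200            # stand-in for the full article text
    print(article_snippet(body))           # this excerpt is all that gets indexed
```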

Finally, you can have Google's spiders crawl your sitemap only, specifying an access level for each article according to its subscription-only status.
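Here is a sketch of what such a sitemap entry might look like, assuming you use the Google News sitemap extension, whose <news:access> element can be set to "Subscription" or "Registration". The URLs, publication name, and generator function are placeholders of my own.

```python
# Sketch: generate a Google News sitemap that marks articles as
# subscription-only via the <news:access> element.
from datetime import date

NEWS_SITEMAP_ENTRY = """\
<url>
  <loc>{loc}</loc>
  <news:news>
    <news:publication>
      <news:name>{publication}</news:name>
      <news:language>en</news:language>
    </news:publication>
    <news:access>Subscription</news:access>
    <news:publication_date>{published}</news:publication_date>
    <news:title>{title}</news:title>
  </news:news>
</url>"""

def news_sitemap(entries):
    # Wrap the per-article entries in a urlset that declares the News namespace.
    urls = "\n".join(NEWS_SITEMAP_ENTRY.format(**entry) for entry in entries)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
        '        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">\n'
        f"{urls}\n"
        "</urlset>"
    )

if __name__ == "__main__":
    print(news_sitemap([{
        "loc": "https://example.com/articles/paywalled-story",
        "publication": "Example Gazette",
        "published": date.today().isoformat(),
        "title": "A subscriber-only story",
    }]))
```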

This was a short blog post but a longish answer for my developer, who wanted to know whether there was a workaround that lets subscription sites be crawled by Google's spiders: kinda, but not really.
