I want Google to be able to index my protected content to increase traffic from organic search. It would be easy enough to allow the Google spider into the protected folder, but I've heard that if Google indexes a page and users then see different content than what the spider saw, Google penalizes you. How are other aMember users setting things up so their site gets indexed by spiders, visitors following those organic search results get access to some content, but the protection kicks in at a certain point? I'd love to see some good examples. Andrew
While it is easy to do technically, it is against Google policy: the content they crawl must be the same content that is available to public users. A compromise is teaser text, where you show a few sentences or a paragraph of content with a call to action to click through and view the rest. You can do this with the WordPress -> aMember integration with the help of a few WordPress plugins.
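To make the teaser-text idea concrete, here is a minimal sketch of the logic a plugin would apply: keep the first couple of sentences public and replace the rest with a sign-up call to action. All names and the `/signup` URL are illustrative, not part of aMember or any specific WordPress plugin.

```python
import re

def make_teaser(full_text, max_sentences=2, signup_url="/signup"):
    """Return the first few sentences of an article plus a call-to-action link."""
    # Split on sentence-ending punctuation followed by whitespace.
    sentences = re.split(r'(?<=[.!?])\s+', full_text.strip())
    teaser = " ".join(sentences[:max_sentences])
    cta = ' <a href="{}">Sign up to read the rest...</a>'.format(signup_url)
    return teaser + cta

print(make_teaser("First point. Second point. Members-only detail here."))
```

The key point is that this same truncated version is what both Googlebot and anonymous visitors see, so there is no cloaking involved.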
Skippy is right. What you want is teaser text, which I call the Magazine Model. You can create a walled garden where 100% of the content is protected and do some coding to let Google in, but people will fake the Google robot and see your content. David
Thanks for the replies. I'm thinking of setting up a parallel version of my content that allows access for a limited period of time, using a combination of cookies and a log of the user's IP address. On first visit, a date cookie would be set and the IP address logged. At the end of a defined period (e.g. 3-7 days), the visitor would be denied access and forwarded to a sign-up page instead. If they deleted their cookies, the IP address log would still block their access after the time period had elapsed. Would this violate Google's policy? What do you think of this approach? Re: the Walled Garden strategy: I think the extra business generated by a site fully indexed by Google, and the resulting visitors from organic search, would outweigh the few who would get themselves free access by dressing up as the Googlebot. (Hey, Halloween's coming up. What does the Googlebot look like, anyway?) Andrew
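The metering idea above can be sketched in a few lines: record a first-visit timestamp per visitor (keyed by cookie ID or IP address) and deny access once the free window elapses. This is a toy in-memory sketch of the concept only; a real setup would persist the log in a database and hook into aMember's access checks, and none of these names are aMember APIs.

```python
import time

FREE_WINDOW = 7 * 24 * 3600  # 7-day free window, in seconds
first_seen = {}              # visitor key (cookie ID or IP) -> first-visit timestamp

def allow_access(visitor_key, now=None):
    """True while the visitor is still inside their free window."""
    if now is None:
        now = time.time()
    # Record the first visit if we haven't seen this visitor before.
    start = first_seen.setdefault(visitor_key, now)
    return (now - start) <= FREE_WINDOW
```

Note the weakness Andrew anticipates: keying on IP address alone blocks shared IPs (offices, mobile carriers) too aggressively, while cookies alone are trivially cleared, which is why the post proposes combining both.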
A couple of thoughts: 1) Google does not crawl all of your content in one sweep, so you run the risk of getting a partial index. That said, when they return to re-index your site, you may see all of your indexed content go away, replaced with the sign-up page. Since that content will be the same for all of your pages, you then run into issues with duplicate content, etc. 2) While the content is indexed, can't folks just view the cached copy to see it? I still recommend a mix of free content and teaser text to get organic traffic. It works for nearly all the big players in this kind of market.
My site is akin to a dictionary that folks come to for reference. It's hard to have just teaser text for a dictionary, because people don't browse a dictionary, they search in it. It would be great if Google would index the whole site and point people to a page that is initially displayed unprotected, but then starts a countdown for that individual until they can no longer get unrestricted access. I think there's a way (a tag, perhaps) to tell search engines not to save a cached copy. A.
<META NAME="ROBOTS" CONTENT="NOARCHIVE"> <META NAME="GOOGLEBOT" CONTENT="NOARCHIVE"> But again, people could still spoof the Google robot to access the site... David
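On the spoofing concern: checking the User-Agent string alone is easy to fake, but Google documents a stronger check, a double DNS lookup. Reverse-resolve the visiting IP, confirm the hostname ends in googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP. A minimal sketch (a production version would cache results and handle DNS timeouts):

```python
import socket

def is_real_googlebot(ip):
    """Verify a claimed Googlebot IP via reverse-then-forward DNS lookup."""
    try:
        # Reverse lookup: what hostname does this IP claim to be?
        host, _aliases, _addrs = socket.gethostbyaddr(ip)
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward lookup: does that hostname actually map back to this IP?
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        # DNS failure or no reverse record: treat as not Googlebot.
        return False
```

A spoofer can set the Googlebot User-Agent, but they cannot make their IP reverse-resolve into Google's domains, so this check closes the hole David describes.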