Your Web Analytics, Your Duplicate Content
I managed to skip most of the analytics seminars at SES Chicago (after all, Brett Crosby and Matt Belkin didn’t even show up to the analytics vendors seminar as advertised), so I didn’t hear much about analytics (except here). Still, every once in a while, someone would say, “Check your analytics before you decide.”
The really interesting “check your analytics” comment came up in a discussion of duplicate content. Adam Lasnik from Google Webmaster Tools (formerly known as Google Sitemaps) insisted that Google doesn’t punish site owners for duplicate content, since most of it is created innocently. But I think what he really meant was, “We don’t hand out a minus-30 penalty for duplicate pages.” In fact, Google indirectly punishes you by putting duplicate pages in the supplemental results, where, in my experience, they rank poorly.
His comment about duplicate pages being created innocently is a good one. Just think: we have www.mysite.com and mysite.com. Then there are “print only” versions of pages that are basically dupes of the real thing. We even end up with secure and non-secure versions of the same page by accident (I’ve seen that twice in the last week). For most of these problems, the answer is the same: check your analytics, see which version of the page is the most popular, and make sure the search engines don’t try to index the less popular one. You can 301 redirect the less popular version to the more popular version, or use a canonical tag to tell the search engines which one to index.
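As a sketch of the redirect approach: assuming an Apache server with mod_rewrite enabled, and assuming the www version is the more popular one, a site-wide 301 from the bare domain might look like this in .htaccess (the domain here is just the example from above):

```apache
# Hypothetical .htaccess rules for mysite.com (requires mod_rewrite)
RewriteEngine On
# If the request came in on the bare domain...
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
# ...send a permanent (301) redirect to the www version, preserving the path
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```

For pages that have to stay live at both URLs (like a print-only version), the alternative is a canonical tag in the duplicate page’s head, e.g. `<link rel="canonical" href="http://www.mysite.com/the-real-page.html">`, which tells the engines which version to index without redirecting visitors.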
updated January 2015