- Jun. 15th, 2006
- 2 comments
Suppose a webmaster excludes a duplicate page on his site using robots.txt or a meta robots exclusion, but then a user proceeds to link to it anyway. This is one of the problems with excluding duplicate content on a site. Exclusion is, in fact, the method I typically use to eliminate the duplicate content that results from breadcrumb navigation (see this blog entry for more information on that).
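For reference, the two exclusion methods mentioned above might look like the following. The path is hypothetical, just for illustration; robots.txt blocks crawling of the URL entirely, while the meta tag lets the page be crawled but keeps it out of the index:

```
# robots.txt — block crawling of the hypothetical duplicate path
User-agent: *
Disallow: /category/widgets/item.html
```

```
<!-- or, placed in the <head> of the duplicate page itself -->
<meta name="robots" content="noindex, follow">
```

The "follow" value lets the engine still follow links on the excluded page even though the page itself stays unindexed.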
Redirecting duplicate content is possible in cases where the duplication is more of an error than anything else (URL canonicalization, etc.); in this case, however, the page is essential to navigation (and thus cannot be 301-redirected), but it also should not be indexed because it is, in fact, a duplicate. Ideally, people would link only to the version that is not excluded. But how is the user supposed to know which version that is, and why should he care anyway?
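For the error-style cases where a 301 does apply, here is a sketch of the usual fix using Apache mod_rewrite in .htaccess, assuming (hypothetically) that www.example.com is the canonical host and the non-www host is the duplicate:

```
# Redirect example.com to the canonical www.example.com with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Because the redirect is permanent (301), engines should consolidate the duplicate URLs onto the canonical one, which is exactly what exclusion cannot do.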
Supposing someone does link to the excluded version, though, does it still count as a vote for your site as a whole? It would make sense to me, but I'm not sure. Anyone care to comment?
"2 Wise Comments Banged Out Somewhere On The Internet ..."