- Feb. 13th, 2007
- 14 comments
A few weeks ago (before I got sick), Wikipedia announced that it is "nofollowing" all links contained in articles — effectively telling search engines the links are untrusted and should not be counted in their ranking algorithms.
I think this is patently ridiculous!
If Wikipedia tells the world its links are not trustworthy enough to count as a vote, what does that say about the reliability of its content as a whole?
Isn't the point of collaborative editing that, by and large, the aggregate product of many edits, some good and some bad, will collectively be high in quality?
Rand Fishkin asserts that it's a good choice. I respectfully disagree. Comment spam has not been stopped by the nofollow attribute. Most spammers do it regardless just to see what sticks, and to get something indexed in the first place. I believe that this policy, a knee-jerk reaction to a few silly SEO contests, is actually harmful.
It removes a set of high-quality votes that power the link-equity-based ranking algorithms of modern search engines, and it further blurs the already hazy definition of what nofollow really means, cheapening its purpose entirely.
It is my belief that nofollow should only be applied to links that are unedited or merely glanced at, such as forum posts and blog comments. Wikipedia is mercilessly edited. It does not fit this profile at all.
Perhaps they could nofollow all links younger than 90 days, giving editors time to nuke the bad ones while removing some of the spammers' motivation, since the gains would no longer be immediate. That would make sense, achieve just as much (whatever extent that is), and it's entirely doable with some clever programming. What do you think?
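The age-based scheme above could be sketched roughly as follows. This is a minimal illustration, not Wikipedia's actual rendering code: the function name, signature, and 90-day threshold are all assumptions made for the example.

```python
from datetime import datetime, timedelta

# Hypothetical grace period: links stay nofollowed until editors
# have had 90 days to vet (or nuke) them.
NOFOLLOW_WINDOW = timedelta(days=90)

def render_external_link(url, text, first_added, now=None):
    """Render an external link as HTML, adding rel="nofollow" only
    while the link is younger than NOFOLLOW_WINDOW. Once a link has
    survived editorial scrutiny for the full window, it is rendered
    as a normal, followable link."""
    now = now or datetime.utcnow()
    rel = ' rel="nofollow"' if (now - first_added) < NOFOLLOW_WINDOW else ''
    return '<a href="%s"%s>%s</a>' % (url, rel, text)
```

A wiki engine already stores when each revision was made, so the date a link first appeared is recoverable from page history; the only new cost is checking that age at render time.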
"14 Wise Comments Banged Out Somewhere On The Internet ..."