By way of this post on Search Engine Roundtable, I found out that Yahoo! now supports wildcard matching for rules in Robots.txt.  This is great news. Why?  With all of the "big three" finally supporting it, wildcard matching is now a de facto part of the standard.

This makes wildcard matching much more useful, since my exclusions usually apply to all search engines.  What good is such a rule if one of the "big three" doesn't honor it?  In those cases I have historically fallen back on meta tag exclusion.  Now, as long as I don't care about the second-tier search engines, I'm less inclined to bother.
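For those who haven't used it, meta tag exclusion means putting a robots meta tag in the <head> of each page you want kept out of the index, along these lines:

<meta name="robots" content="noindex">

It works in every major engine, but you have to emit it page by page, which is exactly the chore a single wildcard rule avoids (and it's no help at all for non-HTML files like images).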

I'd expect Ask.com to follow suit quickly.  Meanwhile, this rule now works in Google, MSN, and Yahoo!:

User-Agent: *
# Block every crawler from any URL ending in .gif
Disallow: /*.gif$
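
In case the pattern semantics aren't obvious: * matches any run of characters and $ anchors the end of the URL, so the rule above blocks any URL ending in .gif.  A quick way to sanity-check a pattern is to translate it into a regular expression.  Here's a minimal Python sketch (the helper name is mine, and it ignores spec corners like percent-encoding):

import re

def robots_pattern_to_regex(pattern):
    # Escape everything, then restore the two robots.txt metacharacters:
    # '*' becomes '.*' (any run of characters), '$' becomes an end anchor.
    regex = re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return re.compile(regex)

rule = robots_pattern_to_regex("/*.gif$")
print(bool(rule.match("/images/logo.gif")))      # True  -- blocked
print(bool(rule.match("/images/logo.gif?v=2")))  # False -- doesn't end in .gif

As far as I can tell, those two metacharacters are all Google and Yahoo! document, so this covers the interesting cases.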

So Ask.com may choke on this for now, and that may matter to some.  But beyond the big three, nothing else carries much weight in US markets anyway.  Yahoo! has more information over here.

PS: Hello from Israel!  I'll post some pictures sometime this week :)
