Accelerate your eCommerce ambitions with adeptCommerce Suite

SEO Egghead Consulting Group is a web development firm dedicated to creating custom, search-engine-optimized web site applications.

We specialize in eCommerce and content management web sites that not only render information beautifully to the human, but also satisfy the "third browser" - the search engine. To us, search engines are people too.

Jul 24
Author: Jaimie Sirovich

Archived; click post to view.
Excerpt: I said a while back in my post, Google Violates Computer Science, that people have too much faith in Google.  I said that even light obfuscation of Javascript redirect code, such as rot13ing the offending code, would likely trick even the formidable Google.  I may have changed my mind.  This stuff may work near-term, but I have my doubts as to the future.  Interestingly enough, Microsoft is experimenting with a new technique whereby they target certain areas known to be spam "paradises," and use actual redirection as an indication of spam.  The project is called Strider Search Defender.  To implement this, they actually…
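
For readers unfamiliar with the term, here is a minimal PHP sketch of what "rot13ing" a redirect means, using PHP's built-in str_rot13(); the redirect string is a made-up example, not code from the original post.

<?php
// A hypothetical JavaScript redirect snippet, used only to illustrate the idea.
$redirect = 'window.location.href = "http://example.com/";';

// str_rot13() rotates each letter 13 places, so the result no longer contains
// obvious tokens such as "location" or "href" for a naive scanner to match.
$obfuscated = str_rot13($redirect);

echo $obfuscated;  // jvaqbj.ybpngvba.uers = "uggc://rknzcyr.pbz/";
?>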

Jul 23
Author: Jaimie Sirovich

Archived; click post to view.
Excerpt: Should the need exist to change hosting companies, the process must be completed in the proper order.  Not doing so may result in a time window where your site is unreachable, and this is clearly not desirable — from both a general and an SEO perspective.  The focus of this elaborate process is to prevent both users and search engines from perceiving that the site is gone — or in the case of virtual hosting, seeing the wrong site.  Search engines do have heuristics for recognizing that these problems exist, but it is better not to rely on them.  Note: Virtual…

Jul 21
Author: Jaimie Sirovich

Archived; click post to view.
Excerpt: This class can be used to detect where a site visitor is located and tailor their page content accordingly.  For example, you may want to say "We surrender!" for French users, instead of the usual "Hello."  No offense to Frenchies, of course, but I had to poke some fun to make this post less dry :)  To do this, we need to include this class in our application; the code below the class definition implements the aforementioned example.  Many thanks to MaxMind for providing their free database; additionally, they have a better, more accurate version for a fee.  Keep in mind…
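
The class itself is not reproduced in the excerpt; as a rough sketch of the same idea, the snippet below uses the PECL geoip extension (an assumption of this sketch, not the original class) to pick a greeting by country.

<?php
// Minimal sketch: greet French visitors differently, assuming the PECL geoip
// extension and a MaxMind country database are installed.
$ip      = $_SERVER['REMOTE_ADDR'];
$country = @geoip_country_code_by_name($ip);  // two-letter code, or false

if ($country === 'FR') {
    echo "We surrender!";
} else {
    echo "Hello.";
}
?>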

Jul 21
Author: Jaimie Sirovich

Archived; click post to view.
Excerpt: This function, consisting of a simple regular expression, will remove most of the bloat from larger CSS files.  Not that the effects are very substantial or groundbreaking, but it does save quite a few kilobytes to run things like this over your CSS and HTML.  Using mod_gzip also has a favorable effect, but this cannot hurt either.  If you find any real bugs, let me know; if it's something completely pathological and contrived, don't let me know.  I have another filter for HTML, but I'll post it another day.  Here it is: <? function trimCSS($str) {…
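
Since the original trimCSS() is cut off above, the following is only a rough sketch of what such a regex-based whitespace stripper might look like; it is not the author's implementation.

<?php
// Sketch of a regex-based CSS "trimmer": strips comments, collapses runs of
// whitespace, and removes spaces around punctuation.
function trimCssSketch($css)
{
    $css = preg_replace('!/\*.*?\*/!s', '', $css);          // drop /* comments */
    $css = preg_replace('/\s+/', ' ', $css);                 // collapse whitespace
    $css = preg_replace('/\s*([{}:;,])\s*/', '$1', $css);    // tighten punctuation
    return trim($css);
}

echo trimCssSketch("body {\n    color: #333;  /* text */\n    margin: 0;\n}");
// body{color:#333;margin:0;}
?>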

Jul 21
Author: Jaimie Sirovich

Archived; click post to view.
Excerpt: I decided that I would test what I think is an inconsistency in the interpretation of the robots.txt specification by various implementors cited here.  I created a robots.txt file for this site that is contrived to test how various spiders interpret the specification.  Here it is:

User-agent: *
Disallow: /blog/seo/automatically-highlighting-internal-links-p51.html
Disallow: /blog/seo/msn-search-p5.html

User-agent: googlebot
Disallow: /blog/seo/msn-search-p5.html

User-agent: msnbot
Disallow: /blog/seo/using-referers-http_referer-to-increase-conversions-and-perceived-relevance-p9.html

User-agent: slurp
Disallow: /blog/seo/yahoo-hostings-lack-of-htaccess-support-p8.html

Since I'd like to make sure all these pages are indexed well in the first place (and because I'm a link-whore), please link to this post for me :)  We already know that Google will only exclude the page on MSN search, but I'm curious how Yahoo and…

Jul 21
Author: Jaimie Sirovich

Archived; click post to view.
Excerpt: Microsoft applications have this nasty habit of replacing both your single and double quotes with "smarter" versions.  They curve inwards and look really snazzy in Microsoft Word.  When you cut and paste them, they're unencoded, as Windows assumes that everyone is using windows-1252 or something.  Unfortunately, that's pretty annoying if you're not using a Windows character set.  So you may want to alter them using the regular expressions that follow.  ROTD stands for "regex of the day," in case you're wondering.  And I was going for clarity here — not efficiency, so don't point out that this could be written…
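
As a rough sketch of the cleanup being described (not the original ROTD regexes), the mapping below swaps the Windows-1252 "smart" quote bytes for plain ASCII quotes.

<?php
// Sketch: replace Windows-1252 curly quotes with plain ASCII equivalents.
// The input is assumed to be raw windows-1252 text pasted from Word.
function dumbQuotes($str)
{
    $map = array(
        "\x91" => "'",   // left single quote
        "\x92" => "'",   // right single quote
        "\x93" => '"',   // left double quote
        "\x94" => '"',   // right double quote
    );
    return strtr($str, $map);
}
?>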

Jul 20
Author: Jaimie Sirovich

Archived; click post to view.
Excerpt: Some people may know about this already, but it's worth discussing since it has been pertinent to me a few times:  In theory, according to my interpretation of the robots.txt specification, if a Disallow: under User-agent: "*" exists, as well as a Disallow: under a specific robot's User-agent:, and that robot accesses the web site, both sets of rules should be applied, and the pages listed under both should be excluded.  However, Google does not interpret it this way, and only applies the rules for the specific robot User-agent:, "googlebot."  For Google, it is necessary to repeat all rules in "*" under googlebot's User-agent: as well to get…
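
As a hypothetical illustration of that workaround (the paths are invented, not taken from the original post), the "*" rules are simply repeated inside the googlebot record:

User-agent: *
Disallow: /private/

User-agent: googlebot
Disallow: /private/
Disallow: /googlebot-only/

With this layout Googlebot still skips /private/, even though it ignores the "*" record once a record of its own exists.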

Jul 20
Author: Jaimie Sirovich

Archived; click post to view.
Excerpt: Someone commented on my last post that there is a way to achieve some of what I stated as a goal in the previous post without cloaking.  He said: "Only create the session IDs in the URL when either one already exists, or when the user does something to prompt it (and make sure robots don't do this).  For example, if you're making an ecommerce site, don't make the session for the shopping cart when the user enters the site; create it when they add their first item to their basket."  I'll admit a partial defeat here.  His suggestion does mostly work,…
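
A minimal PHP sketch of the commenter's suggestion (the function and parameter names here are assumptions, not code from the post): only start a session when a session cookie already exists or the visitor performs an action that needs one, so crawlers never receive session IDs.

<?php
// Start a session only when one already exists, or when the request is an
// action (such as "add to cart") that genuinely requires one.
function startSessionIfNeeded($userAction = false)
{
    if ($userAction || isset($_COOKIE[session_name()])) {
        session_start();
    }
}

// Example: an add-to-cart request triggers the session; a crawler fetching
// product pages never does.
startSessionIfNeeded(isset($_POST['add_to_cart']));
?>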
