Accelerate your eCommerce ambitions with adeptCommerce Suite

SEO Egghead Consulting Group is a web development firm dedicated to creating custom, search-engine-optimized web site applications.

We specialize in eCommerce and content management web sites that not only render information beautifully for human visitors, but also satisfy the "third browser": the search engine. To us, search engines are people too.

Nov 18
Author: Jaimie Sirovich

Excerpt: So you pay your $0.10–$1.00 AdWords tithe, someone wanders into your online store, and then Google Related shows anyone with the Google Toolbar installed your competitors' products? Do they return the tithe if they wander out? Heck no — thought not! So if you haven't heard, let me explain what Google Related is. Google Related is a pretty do-no-evil, adware-like 'feature' of Google Toolbar that shows competitor products on your web site without your permission or consent. It actually modifies your web site's HTML and injects a bar with lots of competing content (and products) right in at…
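
Since the bar is injected client-side into the page's own DOM, a site can at least notice that kind of tampering in a modern browser. This is a purely illustrative sketch: nothing below is a documented Google Related hook, and the data-own-content attribute is a hypothetical convention a site might use to tag its own dynamically added elements.

    // Purely illustrative sketch: notice client-side DOM injection.
    // The data-own-content attribute is a hypothetical convention for
    // tagging the page's own dynamically added elements; Google never
    // published what its bar actually inserts.
    const observer = new MutationObserver((mutations) => {
      for (const mutation of mutations) {
        for (const node of Array.from(mutation.addedNodes)) {
          // Flag any inserted element the page didn't tag itself.
          if (node instanceof HTMLElement && node.dataset.ownContent === undefined) {
            console.warn('Unexpected element injected into page:', node.tagName);
          }
        }
      }
    });

    // Watch the whole document for inserted nodes.
    observer.observe(document.body, { childList: true, subtree: true });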



Oct 6
Author: Jaimie Sirovich

I guess Google's adult filter went on vacation with me this weekend, because there was a buxom blonde baring it all for everyone on the Google News homepage:

[screenshot of the Google News homepage]

Barry Schwartz of Search Engine Roundtable says:
"I see it myself, I took a screen capture and blacked out the offensive part."

I don't get it. What does he find offensive? :)



Sep 28
Author: Jaimie Sirovich

Excerpt: What's more annoying than knowing your site has 1000s of supplemental search results? Not being able to conveniently view them to see what's wrong… The Google hack mentioned here enumerates the supplemental pages of a domain, and seemed quite useful to this end. Perhaps Google doesn't want to make our lives too easy, though, because it appears that it no longer works as before. Then I recalled that someone had also created a great tool that reports the number of supplemental results across data centers, and I wondered if his tool broke. It didn't. Why? He simply appends a "-this_is_a_random_string" to the…
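
As a sketch of the trick described above: appending a random negated term makes every query unique, which reportedly kept the report working. Scoping the domain with the site: operator is my assumption; only the public q= parameter of the search URL is taken as given.

    // Sketch: build a query with a random negated term appended, per the
    // trick described above. The site: scoping is an assumption about how
    // the tool restricts results to one domain.
    function supplementalQueryUrl(domain: string): string {
      const randomString = Math.random().toString(36).slice(2, 12); // e.g. "k3j9x0q2zd"
      const query = `site:${domain} -${randomString}`;
      return `https://www.google.com/search?q=${encodeURIComponent(query)}`;
    }

    console.log(supplementalQueryUrl('example.com'));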



Sep 25
Author: Jaimie Sirovich

Excerpt: I think paid links are just dandy! In every other advertising venue, reputation is purchased. Branding campaigns are ubiquitous. It does not take long to spot paper advertisements that have no real call to action; rather, they aim to increase a product or brand's "ranking" in the human mind. This is what a paid link is to Google. It is the analog of a branding campaign. I'm not saying I think link networks are cool. I'm saying I see nothing wrong with a live link on a prominently placed advertisement. Google apparently has a problem with that. Remarkably, Google…



Sep 13
Author: Jaimie Sirovich

Excerpt: This is a compilation of stuff Matt Cutts has said historically, minus some of the more recent stuff here, here, and here. I decided I'd dig backwards and document some of the older stuff. I dated it accordingly. Here it is:

1. http://www.mattcutts.com/blog/dashes-vs-underscores/
Matt recommends using dashes over underscores to delimit words in URLs. (2005)
Google does not algorithmically penalize for dashes in the URL, despite the fact that some have raised it as a possible heuristic for spam detection. I think WordPress pretty much precludes this anyway. (2005)

2. http://www.mattcutts.com/blog/seo-mistakes-sneaky-javascript/
Google takes action on individual instances of spam when they find it, but they…
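
Point 1 is the advice people act on most often, so here is a minimal sketch of a dash-delimiting slug helper. The function is my illustration, not anything from Matt's post, and it assumes simple ASCII titles.

    // Minimal sketch: turn a post title into a dash-delimited URL slug,
    // per the dashes-over-underscores recommendation above.
    function slugify(title: string): string {
      return title
        .toLowerCase()
        .replace(/[^a-z0-9]+/g, '-') // collapse runs of non-alphanumerics to dashes
        .replace(/^-+|-+$/g, '');    // trim leading/trailing dashes
    }

    console.log(slugify('Dashes vs. Underscores')); // "dashes-vs-underscores"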



Aug 22
Author: Jaimie Sirovich

Excerpt: I used to assume that content behind forms was never spidered. This does not seem to be the case, as one particular form on this blog made me aware. It appears that if Google sees a form consisting only of one pulldown (select), it will spider the links created by submitting the form request with the various values in the pulldown. This has a few implications:

1. Google may also spider a form consisting of any control with a finite domain, such as a group of radio buttons. It could also decide to spider forms with multiple controls having finite domains –…
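
To make that crawling behavior concrete, here is a sketch of the URL set a spider could synthesize from a GET form with a single select. The action and field names are hypothetical.

    // Sketch: the URLs a spider could derive from a GET form with one
    // <select>. For <form action="/products" method="get"> containing
    // <select name="category">, each option value yields one URL.
    function enumerateFormUrls(action: string, field: string, options: string[]): string[] {
      return options.map(
        (value) => `${action}?${encodeURIComponent(field)}=${encodeURIComponent(value)}`
      );
    }

    console.log(enumerateFormUrls('/products', 'category', ['books', 'music', 'video']));
    // ["/products?category=books", "/products?category=music", "/products?category=video"]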



Aug 22
Author: Jaimie Sirovich

Excerpt: I've never assumed that the "Allow:" directive was supported by all search engine spiders. From what I know, only Google supports it. The draft mentions it, but that's the problem — it's just a draft. Officially, the directive does not exist. Admittedly, it has been in the draft state since 1997! I guess someone should do something about it, but nobody cares enough to do so. Anyway, Google's own robots.txt assumes that all spiders support this draft directive, and uses "Allow:" under "User-agent: *":

    User-agent: *
    Allow: /searchhistory/
    Disallow: /news?output=xhtml&
    Allow: /news?output=xhtml
    Disallow: /search

It's not such a big deal, but interesting nonetheless. I simply…
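
For what supporting "Allow:" actually implies, here is a sketch of how an Allow-aware matcher could resolve those rules against a path. It assumes the longest-match semantics Google has described (the most specific matching rule wins, with Allow winning ties); the 1997 draft itself is vaguer, so treat this as illustrative.

    // Sketch: resolve a path against Allow/Disallow rules, assuming
    // longest-match semantics: the most specific (longest) matching rule
    // wins, and Allow beats Disallow on a tie.
    type Rule = { allow: boolean; path: string };

    function isAllowed(rules: Rule[], path: string): boolean {
      let best: Rule | undefined;
      for (const rule of rules) {
        if (path.startsWith(rule.path)) {
          if (!best || rule.path.length > best.path.length ||
              (rule.path.length === best.path.length && rule.allow)) {
            best = rule;
          }
        }
      }
      return best ? best.allow : true; // no matching rule means allowed
    }

    const rules: Rule[] = [
      { allow: true,  path: '/searchhistory/' },
      { allow: false, path: '/news?output=xhtml&' },
      { allow: true,  path: '/news?output=xhtml' },
      { allow: false, path: '/search' },
    ];

    console.log(isAllowed(rules, '/searchhistory/'));          // true
    console.log(isAllowed(rules, '/search?q=foo'));            // false
    console.log(isAllowed(rules, '/news?output=xhtml'));       // true
    console.log(isAllowed(rules, '/news?output=xhtml&hl=en')); // false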



Aug 16
Author: Jaimie Sirovich

Excerpt: We finally have a conclusion on how exactly to interpret a robots.txt file for the edge cases mentioned here. Someone started a WebmasterWorld thread on the subject of contention. Indeed, according to the specification, the rules for a specific matching user agent entirely override the "User-agent: *" rules. Therefore, any rule under "User-agent: *" that should also apply to a specific bot must be repeated under the "User-agent:" line for that specific bot. In other words, the more specific set of directives takes precedence over the default, and only one set is applied. Googleguy says in the thread that he "…
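
As a sketch of that selection rule (the substring matching and group layout are simplifications of real robots.txt parsing):

    // Sketch: pick which robots.txt group applies to a bot, per the
    // interpretation above: a group matching the specific user-agent
    // entirely replaces the "User-agent: *" group; they are never merged.
    function selectGroup(groups: Map<string, string[]>, userAgent: string): string[] {
      for (const [agent, rules] of groups) {
        if (agent !== '*' && userAgent.toLowerCase().includes(agent.toLowerCase())) {
          return rules; // the specific group wins outright
        }
      }
      return groups.get('*') ?? []; // fall back to the default group
    }

    const groups = new Map<string, string[]>([
      ['*',         ['Disallow: /private/']],
      ['Googlebot', ['Disallow: /tmp/']],
    ]);

    // Googlebot gets only its own group: /private/ is NOT blocked for it
    // unless the rule is repeated under "User-agent: Googlebot".
    console.log(selectGroup(groups, 'Googlebot/2.1')); // ["Disallow: /tmp/"]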
