Accelerate your eCommerce ambitions with adeptCommerce Suite

SEO Egghead Consulting Group is a web development firm dedicated to creating custom, search-engine-optimized web site applications.

We specialize in eCommerce and content management web sites that not only render information beautifully to the human, but also satisfy the "third browser" - the search engine. To us, search engines are people too.

Jun 20
Author: Jaimie Sirovich

Excerpt: If you actually read my little bio over on the left up there (you didn't, and I'm not so pompous as to think you care :)), you'd know I'm a white hat SEO who is currently doing some serious SEO consulting for a law firm.  I'd like to share a few things I've noticed in this area, and complain about how horrible the scene really is.  Link spam: Many successful law firm web sites engage in bulk link spam.  By that I don't mean comment spam.  I mean massive numbers of scraper sites with links pointing to the law firm.  I'm surprised at how well…


Jun 20
Author: Jaimie Sirovich

Excerpt: Since Google has decided that cloaking, in fact, is not against the rules (not really; make sure you click that link), I have decided to post my simple cloaking toolkit written in PHP.  I cannot support this thing, so please don't bombard me with emails.  If you're very proficient in PHP, but never bothered to learn about cloaking, then this is for you.  If you want to report a bug, email me at jsirovic AT gmail dot com.  It's quite likely there are bugs.  I implore you to email Google and let them know how much you appreciate that they have revised their draconian policies…
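The toolkit itself isn't reproduced in this excerpt, but the core move behind user-agent cloaking is simple enough to sketch. The following is a hypothetical illustration, not the toolkit's actual code; the function and bot names are mine:

```php
<?php
// Hypothetical sketch of the idea behind user-agent cloaking: detect a
// search engine crawler from the User-Agent header and branch on it.
// These signature strings are illustrative, not an exhaustive list.
function isSearchEngineBot(string $userAgent): bool
{
    $botSignatures = ['Googlebot', 'Slurp', 'msnbot', 'Teoma'];
    foreach ($botSignatures as $signature) {
        // Case-insensitive substring match against the User-Agent value
        if (stripos($userAgent, $signature) !== false) {
            return true;
        }
    }
    return false;
}
```

A real toolkit would do more than this, since User-Agent strings are trivially spoofed; verifying the crawler's IP (for example, via reverse DNS) is the usual companion check.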


Jun 18
Author: Jaimie Sirovich

Excerpt: I know there is some consensus on at least the "no link juice awarded" aspect of the nofollow attribute among all of the major search engines.  However, there are a few differences, apparently, in other implementation details.  I guess I should be impressed that they embraced the same concept at all, and even used the same syntax to denote it.  I've noted the following.  I believe Google was the search engine that first advocated the attribute for the purpose of combating link spam, and they seem to be the only one that truly follows the gist of the phrase "nofollow."  To Google,…
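For reference, the attribute in question is applied per-link, using the standard `rel` mechanism (the URL and anchor text here are placeholders):

```html
<!-- A link whose endorsement is withheld via rel="nofollow" -->
<a href="http://example.com/" rel="nofollow">some untrusted link</a>
```

The disagreements the post describes are about what the engines do with such a link (discard it, follow but not credit it, etc.), not about the markup itself.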


Jun 18
Author: Jaimie Sirovich

Excerpt: I've been digesting this for a while.  Barry Schwartz of Search Engine Roundtable writes that the New York Times is cloaking content in the interest of facilitating Google "to access, crawl, index and rank content that would require a username and password by a normal Web user."  This may sound OK to most, but I fail to see the fairness in this, and it implies that, like the BMW affair, Google is once again proving that they provide preferential treatment to large companies.  BMW.de was reincluded in what, a few days?  Good luck, mom and pop, with getting that type of service from…


Jun 15
Author: Jaimie Sirovich

Excerpt: Suppose a webmaster excludes a duplicated page on his site using robots.txt or meta exclusion, but then a user proceeds to link to it anyway.  This is one of the problems with excluding the duplicate content on a site.  More specifically, this is the method I typically use to eliminate the duplicate content that results from breadcrumb navigation; see this blog entry for more information on that.  Redirecting duplicate content is possible in cases where the duplication is more of an error than anything else (URL canonicalization, etc.); in this case, the page is essential to navigation (and…
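The two exclusion mechanisms mentioned above look like this in practice (the path is hypothetical, standing in for a breadcrumb-generated duplicate URL):

```
# robots.txt -- hypothetical example: block a duplicate URL reached
# via an alternate breadcrumb path
User-agent: *
Disallow: /category-b/widget.html
```

The meta-exclusion alternative goes on the duplicate page itself: `<meta name="robots" content="noindex, follow">`. Either way, the post's point stands: exclusion doesn't stop users from linking to the excluded URL, so any link value pointed there goes to waste.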


Jun 14
Author: Jaimie Sirovich

Excerpt: I was just thinking that it really bothers me that some SEOs are avowed "white hats" who don't bother studying black hat techniques.  Not to do so is a crime of ignorance, and it overlaps with site security as well.  If those honkies don't know about spamming, they won't know the context with regard to, for example, link condoms and XSS-related security.  They also won't know about the backlink exploit in Movable Type blogs.  This means they're not fully aware, and can potentially have their sites ambushed by those "evil" black hatters.  So why are people ashamed to put on a black hat, if…


Jun 14
Author: Jaimie Sirovich

Excerpt: I was reading SEO Black Hat during my lunch break, and it pointed me to RSnake's article on using GreaseMonkey to sniff out XSS attack vulnerabilities.  Since I'm a white hat SEO, I'll pretend I'm only interested in this stuff to the extent of attack prevention, so I added a few things to his proof of concept to make it more usable for that purpose (or any purpose, really).  First, we create a script that utilizes the last code snippet I posted here, which parses out the response codes from an HTTP document (LinkChecker.php), located here.  <? include('LinkChecker.php'); $header_result = LinkChecker::getHeader($_REQUEST['text']);…
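The excerpt's LinkChecker.php isn't shown here, but the parsing step it describes, pulling a response code out of raw HTTP headers, can be sketched standalone. This is a hypothetical illustration, not the actual LinkChecker implementation:

```php
<?php
// Hypothetical sketch: extract the numeric status code from a raw HTTP
// response header block. The status line looks like
// "HTTP/1.1 301 Moved Permanently". Not the actual LinkChecker.php code.
function parseStatusCode(string $rawHeaders): ?int
{
    if (preg_match('#^HTTP/\d+(?:\.\d+)?\s+(\d{3})#', $rawHeaders, $matches)) {
        return (int) $matches[1];
    }
    return null; // input did not start with a valid HTTP status line
}
```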


Jun 13
Author: Jaimie Sirovich

Excerpt: To be honest, I'm not even sure this matters much anymore, but I thought I'd mention it.  Like the issue with parameter ordering (?a=1&b=2 vs. ?b=2&a=1) I mentioned here, a slash at the end of a URL can pose a similar ambiguity problem.  Fortunately, at least for non-rewritten pages, Apache takes care of this issue.  If the resource is a directory, the URL without the trailing slash gets 301-redirected to the version with it, and vice versa.  But when mod_rewrite is used, we're not dealing with a file system, and nothing is done automatically.  Ideally, we'd perform the same thing manually in our scripts and 301-redirect a URL "missing"…
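The manual version of that 301 decision can be sketched as a small pure function (a hypothetical illustration; the name and the caller's conventions are mine, not the post's):

```php
<?php
// Hypothetical sketch: decide whether a rewritten URL path needs a
// trailing-slash 301 redirect. Returns the canonical path to redirect
// to, or null if the path is already in its canonical form.
function trailingSlashRedirect(string $path, bool $canonicalHasSlash): ?string
{
    $hasSlash = substr($path, -1) === '/';
    if ($canonicalHasSlash && !$hasSlash) {
        return $path . '/';          // add the "missing" slash
    }
    if (!$canonicalHasSlash && $hasSlash && $path !== '/') {
        return rtrim($path, '/');    // strip the extraneous slash
    }
    return null;                     // already canonical; serve normally
}
```

When a non-null path comes back, the calling script would emit `header('HTTP/1.1 301 Moved Permanently')` plus a `Location:` header and exit, mirroring what Apache does on its own for real directories.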
