Someone commented on my last post that there is a way to achieve some of what I stated as a goal there without cloaking.  He said:

"Only create the session IDs in the URL when either one already exists, or when the user does something to prompt it (and make sure robots don't do this). For example, if you're making an ecommerce site, don't make the session for the shopping cart when the user enters the site; create it when they add their first item to their basket."

I'll admit a partial defeat here.  His suggestion does mostly work, but it breaks tracking for Cookiephobic users until they hit the shopping cart page.  I've never seen anyone do this in PHP, and the documentation is ambiguous about whether changing the value of "use_trans_sid" at runtime should even work (it does, but read the documentation; it makes little sense).  The idea is sane, though, and I sketched out what I think he suggested.  Here it is:



<?php
// If a session ID arrived via the URL, turn transparent session ID
// rewriting on for this request so the ID keeps propagating.
// This must happen before session_start().
if (isset($_GET[ini_get('session.name')])) {
    ini_set('session.use_trans_sid', '1');
}
session_start();
?>
<a href='some_page.php'>some page</a>
<a href='another_page.php'>another page</a>

How this works: 

0) All pages call session_start(), but session.use_trans_sid is '0' (off) by default in php.ini; it is turned on only when the conditions below are met.
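As a sketch, the php.ini baseline that step 0 assumes could look like the following; these are PHP's stock defaults, with PHPSESSID as the default session name:

```ini
; Assumed php.ini baseline: sessions use cookies when possible,
; and no automatic URL rewriting unless a page opts in at runtime.
session.use_trans_sid = 0
session.use_cookies = 1
session.name = PHPSESSID
```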

1) Hardcode session.use_trans_sid to '1' (on) on the cart pages, via ini_set() before session_start().

2) If no session cookie is set when the user reaches a cart page, use_trans_sid rewrites all the URLs to contain the session ID (a new one in this case, since sessions were broken until now as a result of user Cookiephobia).  If there is a cookie, nothing happens; use_trans_sid only rewrites URLs and forms when the cookie isn't set.  Because sessions are started on all pages, a user with cookies enabled would already have a session cookie from the other pages.  If not, the URLs and forms are rewritten.

3) On all pages, if $_GET['PHPSESSID'] (or whatever session.name is set to) is set, turn on session.use_trans_sid.  The URL rewriter then kicks in if and only if a session ID is already being propagated as a result of visiting a cart page.  Thus, once the user has hit the cart page at least once, the session ID continues to propagate across all pages.

4) Exclude the search engines from the cart pages using robots.txt.  Pray they listen, or else they'll enter a gigantic spider trap.
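Step 4 might look like this in robots.txt; the /cart/ prefix is a hypothetical location for the cart pages (adjust it to wherever they actually live):

```text
# Keep well-behaved crawlers out of the URL-rewriting cart pages,
# so they never receive (and index) session IDs in links.
User-agent: *
Disallow: /cart/
```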

I haven't actually deployed this on a live site, and it has its limitations; but it does work in theory.  This lets you turn sessions on for most users while deferring them for the Cookiephobics.

