Optimizing session handling for bots

If your site uses sessions and undergoes PCI compliance testing, or is frequently crawled by search engine bots, a really quick optimization is to shorten the session timeout based on the User Agent.

In ColdFusion, you can do something like:
if (REFindNoCase('Slurp|Googlebot|BecomeBot|bingbot|Mediapartners-Google|ZyBorg|RufusBot|EMonitor', cgi.http_user_agent)) {
    // Known bot: let the session expire after 10 seconds
    this.SessionTimeout = CreateTimeSpan(0, 0, 0, 10);
}
else {
    // Regular visitor: keep the session for 1 hour
    this.SessionTimeout = CreateTimeSpan(0, 1, 0, 0);
}

which limits the session timeout to 10 seconds for bots and 1 hour for normal site visitors. Since most crawlers don't store cookies, every bot request spins up a new session; the short timeout lets those sessions expire quickly instead of needlessly consuming server memory for a full hour each.
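For context, here's a minimal sketch of where that check would typically live in a script-based Application.cfc. The application name and the bot pattern are just placeholders for illustration; adjust them for your own site.

component {
    this.name = "mySite";               // hypothetical application name
    this.sessionManagement = true;

    // Known crawler User Agents (not an exhaustive list)
    botPattern = 'Slurp|Googlebot|BecomeBot|bingbot|Mediapartners-Google|ZyBorg|RufusBot|EMonitor';

    if (REFindNoCase(botPattern, cgi.http_user_agent)) {
        // Bot: let the session expire after 10 seconds
        this.SessionTimeout = CreateTimeSpan(0, 0, 0, 10);
    }
    else {
        // Regular visitor: keep the session for 1 hour
        this.SessionTimeout = CreateTimeSpan(0, 1, 0, 0);
    }
}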

P.S. If a bot is capable of handling cookies, this could create issues, since its session won't be maintained for more than 10 seconds, so be sure to test before making changes. And to be clear: since you are not being deceptive by presenting different content to users vs. bots, you won't be penalized by search engines for this.
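One rough way to do that testing is to log whether bot requests actually send ColdFusion's session cookie back. Something along these lines could go in the same Application.cfc; the log file name is arbitrary, and if you use J2EE sessions the cookie will be JSESSIONID rather than CFID.

public boolean function onRequestStart(required string targetPage) {
    if (REFindNoCase('Slurp|Googlebot|bingbot', cgi.http_user_agent)) {
        // CFID/CFTOKEN are ColdFusion's default session cookies
        var hasSessionCookie = structKeyExists(cookie, "CFID");
        writeLog(file="botSessions", text="#cgi.http_user_agent# sent session cookie: #hasSessionCookie#");
    }
    return true;
}

If the log shows a given bot consistently returning the cookie, that bot is maintaining sessions and the 10-second timeout may be too aggressive for it.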
