Crawl Rate Tracker has been in my WordPress toolbox for a while now. It’s one of those plugins that gets installed on every WordPress site I put up, and despite being written for WordPress 2.5, it still works.
Recently, however, a client wanted a multisite installation. After the initial setup, I put in the usual plugins, including Crawl Rate Tracker.
Unfortunately, Crawl Rate Tracker does not work on multisite. It behaves fine for normal visitors, but it causes fatal errors for bots. Install and activate the plugin, change your user agent to Googlebot (or something similar), and visit the site; you’ll get a white screen of death that just says a table can’t be found. Crawl Rate Tracker doesn’t add its tables on a per-blog basis; it only adds one. The result is a MySQL error because WordPress can’t find the table for the Crawl Rate Tracker logs. This only happens for bots because Crawl Rate Tracker checks $_SERVER['HTTP_USER_AGENT'] for Google, Yahoo, or Technorati bots and logs a hit only if one of the three is found.
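A rough sketch of what’s going on, and the multisite-friendly alternative: the function and table names below are illustrative, not the plugin’s actual code, but the pattern is the same. The bot check is a simple substring match on the user agent, and the fix for the table problem is to build the table name from $wpdb->prefix, which is per-blog on multisite (e.g. wp_2_), so each site reads and writes its own table instead of one shared table that only exists for the main site.

```php
<?php
// Hypothetical sketch of Crawl Rate Tracker's logging gate, with the
// multisite fix. Names (crt_*, crawl_rate_log) are made up for illustration.

function crt_is_tracked_bot( $user_agent ) {
    // The plugin only logs hits whose user agent matches one of these bots;
    // everyone else skips the database entirely, which is why normal
    // visitors never see the missing-table error.
    foreach ( array( 'googlebot', 'yahoo', 'technorati' ) as $bot ) {
        if ( false !== stripos( $user_agent, $bot ) ) {
            return true;
        }
    }
    return false;
}

function crt_log_table_name() {
    global $wpdb;
    // $wpdb->prefix includes the blog ID on multisite, so this resolves to
    // a per-blog table (wp_crawl_rate_log, wp_2_crawl_rate_log, ...)
    // rather than one table shared across the whole network.
    return $wpdb->prefix . 'crawl_rate_log';
}
```

The table would still need to be created per blog (on activation and when a new site is added), but naming it through $wpdb->prefix is the piece Crawl Rate Tracker is missing.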
Why We’re Not Using Crawl Rate Tracker Any More
After poking around the Crawl Rate Tracker code a bit, I can safely say it’s a plugin PMG will no longer be using. It works on single-site WordPress installs, but the code is old: it relies on many deprecated WordPress functions and doesn’t take advantage of any of WordPress’s data validation functions.
Perhaps an updated Crawl Rate Tracker from PMG is in the future?