I rely on MagpieRSS to run one of my websites. (If you'd like to see the basic code for the site, see my GitHub profile.)
One of the drawbacks of Magpie, and of dynamic websites in general, is that they can be bottlenecked by external sources – in Magpie's case, those sources are the myriad RSS feeds that Datente draws from.
To overcome some of this sluggishness, and to take better advantage of Magpie's caching feature, I recently set up a simple cron job that loads every page on the site every X minutes – this refreshes the cache and keeps the reader experience snappy. By scheduling a background refresh of every page, I cut average page load times by nearly a factor of 10! Dramatic as that is, my worst-performing page was still taking upwards of 10 seconds to load a not-insignificant percentage of the time (sometimes more than a minute!) 🙁
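As a sketch, the crontab entries for that background refresh might look like the following. The 15-minute interval is an assumption (the post only says "every X minutes"), and the real job has one line per page on the site:

```shell
# Illustrative crontab sketch – interval and page list are assumptions.
# Each line loads one page so Magpie rebuilds its feed cache in the
# background; -f fails on HTTP errors, -s silences progress output,
# and the response is discarded since only the cache warm-up matters.
*/15 * * * * /bin/curl -fs http://datente.com/index.php -o /dev/null
# ...one entry like the above for every other page on the site
```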
Enter last week's epiphany – since RSS content doesn't change all that often (even crazy-frequent-updating feeds rarely exceed 4 updates per hour), I could take advantage of a "trick" and make the displayed pages nearly static (I still have an Amazon sidebar that's dynamically loaded). With this stupidly-simple hack, I cut the slowest page load time from ~10-12 seconds to under 1 – another 10x improvement!
"What is the 'trick'?" you ask. Simple – I copied every page, prefixed each copy with a short character sequence, and then modified my cron job to still run every X minutes, but now call the "build" pages, redirecting each response (which is a web page, of course) into the corresponding "display" page. In other words, the display pages become static files that are rebuilt in the background every so often.
If you'd like to see exactly how I'm doing this for one page (the rest are similar), check out this stupidly-short shell script:
(time (/bin/curl -f http://datente.com/genindex.php > ~/temp.out)) 2>&1 | grep real
time is in there for my own benchmarking – piping through grep real leaves just the wall-clock time the page took to build.
Compare the run time to the [nearly] static version:
(time (/bin/curl -f http://datente.com/index.php > ~/temp.out)) 2>&1 | grep real