Google Pagespeed Not a Good Choice For High-traffic Sites
So I have just finished optimizing my web stack to the nth degree.
Host, web server, application server, database, caching, DNS, CDN—everything.
One of the components I used was Google’s Pagespeed, which comes in two forms. Here I’m speaking of the local version, i.e. the module you compile into Nginx and then configure. From there it optimizes content for delivery to your users.
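For context, enabling the module looks something like this in the Nginx config (a minimal sketch; the cache path is an example, and the filter list is illustrative rather than my exact setup):

```nginx
# Turn the PageSpeed module on (ngx_pagespeed must be compiled into Nginx)
pagespeed on;

# On-disk cache for rewritten resources (path is an example)
pagespeed FileCachePath /var/ngx_pagespeed_cache;

# Illustrative filters: image recompression plus whitespace/CSS rewriting
pagespeed EnableFilters rewrite_images,collapse_whitespace,combine_css;
```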
And it did that well. It was great at image optimization, and specifically at minification (something Nginx doesn’t do natively).
So I thought everything was great.
That was, until I threw some traffic at it.
I use Blitz.io to test server load, and I expected to get even faster results once I had Pagespeed enabled.
Unfortunately it was the opposite: It got melted.
I hit my box with a 60 second run of 100-500 users (like being on the front page of Reddit or Hacker News), and I was stunned at the results.
With Pagespeed enabled, the box choked to death. My CPU spiked. I got mass timeouts. I got mass page errors.
I wanted to see what was wrong, so I started disabling things one by one, starting with Pagespeed.
Well, running just Nginx (with my highly tweaked native config) plus my FastCGI caching served out of tmpfs (memory), I served pages faster than with Pagespeed enabled, AND my server completely laughed at the same incoming traffic.
Zero errors. Zero timeouts. And lower response times.
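For reference, the FastCGI-cache-from-tmpfs setup I mentioned looks roughly like this (a sketch, not my exact config; the `/dev/shm` path, zone sizes, and socket path are example values):

```nginx
# Cache lives on tmpfs (/dev/shm), so cache hits are served from memory
fastcgi_cache_path /dev/shm/nginx-cache levels=1:2 keys_zone=FCACHE:100m inactive=60m;
fastcgi_cache_key "$scheme$request_method$host$request_uri";

server {
    location ~ \.php$ {
        fastcgi_pass unix:/var/run/php-fpm.sock;  # example PHP-FPM socket
        include fastcgi_params;

        fastcgi_cache FCACHE;
        fastcgi_cache_valid 200 60m;              # cache successful responses
        fastcgi_cache_use_stale error timeout updating;
    }
}
```

With this in place, Nginx answers repeat requests straight out of memory without ever touching the application server, which is why it shrugged off the load test.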
Bottom line: Pagespeed is a great project for cutting-edge content optimization, but if you plan on taking any significant amount of traffic, it probably isn’t a good choice.
Lesson learned.