Scaling WordPress & WooCommerce For Dragons' Den
How we prepare your website for a major TV screening
We successfully hosted The Sweet Beet during their appearance on BBC's Dragons' Den this year, which saw the site receive 133,000 visits over the course of just fifteen minutes. This post explains how we kept their website online and able to take orders throughout.
If you're going on live TV, it's a great idea to tell your web host and give them plenty of warning; a lot of preparation is needed to make a website function under significant load. If you've watched Dragons' Den before, you may have noticed that contestants' websites usually go offline within moments of them stepping in front of the Dragons. With careful planning and the right technology, though, you can turn the appearance into a big opportunity rather than a wasted chance to get your website in front of the masses!
The Sweet Beet is a Texan-inspired condiment producer who approached us to prepare for their appearance on Dragons' Den. They have a typical WordPress website and use WooCommerce for their ecommerce.
For a technical discussion of the actual configs we employed, refer to our technical article on High Availability with HAProxy, Apache2 and MariaDB Galera Clustering.
Scaling WordPress and WooCommerce
Scaling refers to how we changed the standard WordPress/WooCommerce set-up into one able to manage vast amounts of visits (~130,000, or around 140 every second). A typical WordPress website would simply become overwhelmed within seconds, so we were grateful to The Sweet Beet for their forewarning of the TV appearance.
Typically, a WordPress website is hosted on a single 'virtual' server. This won't do for scalability, as you have no facility to increase capacity once your server is overloaded with visits. We therefore split the website's components onto their own servers.
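To give a flavour of what this split means for WordPress itself, here is a minimal, hypothetical wp-config.php excerpt (the host name is a placeholder, not The Sweet Beet's actual configuration): the database connection points at the cluster rather than the local machine, and WordPress is told to trust the load balancer's forwarded-protocol header so HTTPS detection keeps working behind a proxy.

```php
<?php
// Illustrative wp-config.php excerpt only; the host name is a
// hypothetical placeholder.

// Point WordPress at the database cluster rather than a database
// running on the same machine.
define( 'DB_HOST', 'db-cluster.internal:3306' );

// When a load balancer terminates TLS in front of the web servers,
// tell WordPress the original request was HTTPS so it doesn't
// produce redirect loops or mixed-content URLs.
if ( isset( $_SERVER['HTTP_X_FORWARDED_PROTO'] )
    && 'https' === $_SERVER['HTTP_X_FORWARDED_PROTO'] ) {
    $_SERVER['HTTPS'] = 'on';
}
```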
- How do you prepare your website for thousands of visits?
- How do you scale WordPress?
- How do you scale WooCommerce?
Databases, Web Servers, Load Balancer
We put each component of the website onto specialised instances:
- Multiple web servers (we used Apache, a popular web server)
- Multiple database servers (we used MariaDB)
- A load balancer to direct visitors to available web servers (we used HAProxy)
The load balancer was placed as the single point of entry and directed visitors amongst the available web servers (if one web server became overworked, we could dynamically direct visitors to a less busy one). If we required more web servers, we could add them live without disrupting existing visitors.
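As a rough sketch, the relevant part of an HAProxy configuration for this kind of set-up looks like the following (server names and addresses are placeholders, not our production config):

```
# haproxy.cfg (illustrative excerpt; names and addresses are placeholders)
frontend www
    bind *:80
    default_backend web_servers

backend web_servers
    # Send each visitor to the web server with the fewest active connections.
    balance leastconn
    # 'check' enables health checks: a failing server is removed from the
    # pool automatically, and new 'server' lines can be added via a reload
    # without disrupting existing visitors.
    server web1 10.0.0.11:80 check
    server web2 10.0.0.12:80 check
    server web3 10.0.0.13:80 check
```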
How it worked
Lizzy's pitch was great fun to watch, and the response from the Dragons was extremely positive. We must admit it was thrilling to see the website receive so much traffic in such a short space of time, and we're proud that it stayed up for the whole duration. Over the days that followed, we were able to scale down the servers (and costs) as visits cooled off.
- Requests came into our load balancer
- Visitors were directed to available web servers
- The three database servers kept each other in sync (see the configuration sketch below)
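For the database layer, the synchronisation came from MariaDB Galera clustering. A minimal sketch of the Galera section of each node's configuration might look like this (the cluster name and addresses are hypothetical; see our technical article linked above for the real details):

```
# /etc/mysql/conf.d/galera.cnf (illustrative only; cluster name and
# addresses are hypothetical placeholders)
[galera]
wsrep_on                 = ON
wsrep_provider           = /usr/lib/galera/libgalera_smm.so
wsrep_cluster_name       = "dragons_den_cluster"
wsrep_cluster_address    = "gcomm://10.0.0.21,10.0.0.22,10.0.0.23"
# Galera requires row-based replication and InnoDB.
binlog_format            = row
default_storage_engine   = InnoDB
innodb_autoinc_lock_mode = 2
```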
If any one of the servers failed, whether a database or a web server, it would automatically be removed from the pool of available servers, meaning the availability of the website remained good throughout.
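The same health-check idea applies to the databases: HAProxy can probe each Galera node and only route queries to healthy ones. A hedged example, again with placeholder addresses, and assuming a password-less 'haproxy_check' MySQL user exists purely for the probe:

```
# haproxy.cfg (illustrative excerpt; assumes a password-less
# 'haproxy_check' MySQL user exists for the health probe)
listen galera_cluster
    bind 127.0.0.1:3306
    mode tcp
    balance leastconn
    # Probe each node with a MySQL handshake; unhealthy nodes are
    # dropped from the pool automatically.
    option mysql-check user haproxy_check
    server db1 10.0.0.21:3306 check
    server db2 10.0.0.22:3306 check
    server db3 10.0.0.23:3306 check
```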
What we learned
We knew this already, but can now appreciate it fully: traffic estimation is hard. Whilst we knew from hearsay roughly how much traffic to expect, every website is different and the whims of television audiences are impossible to predict. We had, of course, no idea just how many people would decide to visit the website.
We found that one cluster of three databases was more than enough for our situation, but we preferred to err on the side of caution and provide more web server instances. All of this required balancing cost against benefit: it's very easy to throw money (more servers) at a problem, but this is a startup, so we had to strike a balance.
Overall it was a great success, and we look forward to hosting more high-availability sites in the future.
Contact Karma Computing for serious scalability projects & consultancy.