Fyre Festival
Working in partnership with Matte, we helped to create the website for Fyre Festival. Built responsively, with a third-party ticketing integration, the site was designed to withstand the millions of hits generated by influencer and media promotion.

Aye we know, we know.
We debated publishing this case study for a while (two years, actually). But the truth is, the partners involved were sold on the FYRE story as much as investors and festival-goers. In the end, whether you were eating soggy Kraft cheese butties on an island in the rain, or writing off a load of invoices for work completed, everybody lost out, including your humble narrators.
We may or may not take this back down at some point, but for now we’ll just focus on our technical solution: a bit about how we built the site, and how the infrastructure was designed to shrug off millions of hits in a single day once the celebrity influencers and global media sent traffic our way.
If you’ve been keeping your head down for a few years and need any more info on the festival itself, you can watch the Netflix doc here.
We were told to expect more than a million visitors an hour following launch. A lot of clients wildly over-estimate this kind of thing, but FYRE did at least deliver on this promise.
This level of traffic requires some proper technical thinking and firepower, so we decided on a solution using Amazon Web Services. One thing in our favour was that the site was largely static in nature, meaning our first port of call was to push as much caching as possible away from the core platform itself. We utilised CloudFront to cache content at edge locations all around the world, meaning low latency (i.e. snappy load times). One thing NOT in our favour: due to the profile of the festival (and the clientele), only HD, full-screen video would do.
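To give a flavour of that, here’s a minimal sketch (using boto3, with placeholder domain names and TTLs rather than the production values) of the kind of CloudFront distribution that sits in front of an origin and caches aggressively at the edge:

```python
# A minimal sketch of a CloudFront distribution fronting a web origin,
# with long TTLs so most requests never reach the web servers.
# The origin hostname and TTL values are illustrative placeholders.
import time
import boto3

cloudfront = boto3.client("cloudfront")

response = cloudfront.create_distribution(
    DistributionConfig={
        "CallerReference": str(time.time()),  # any unique string per request
        "Comment": "Festival site - edge caching in front of the origin",
        "Enabled": True,
        "Origins": {
            "Quantity": 1,
            "Items": [
                {
                    "Id": "site-origin",
                    # Hypothetical load balancer hostname
                    "DomainName": "origin-elb.example.com",
                    "CustomOriginConfig": {
                        "HTTPPort": 80,
                        "HTTPSPort": 443,
                        "OriginProtocolPolicy": "https-only",
                    },
                }
            ],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": "site-origin",
            "ViewerProtocolPolicy": "redirect-to-https",
            # Cache aggressively at the edge; the origin only sees misses.
            "MinTTL": 0,
            "DefaultTTL": 3600,   # 1 hour
            "MaxTTL": 86400,      # 1 day
            "ForwardedValues": {
                "QueryString": False,
                "Cookies": {"Forward": "none"},
            },
            "TrustedSigners": {"Enabled": False, "Quantity": 0},
        },
    }
)

print(response["Distribution"]["DomainName"])  # the *.cloudfront.net hostname
```

With sensible TTLs, the vast majority of requests are answered from an edge location and never touch the origin at all.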
Caching to all of these locations needed a robust underlying solution. This was hosted in the AWS US-EAST-1 region, where Elastic Load Balancers shared the load across multiple EC2 web server instances, which in turn pulled database content from a Multi-AZ database cluster. The EC2 instances auto-scaled, so as load increased, so did the number of web servers. Images were served to the CloudFront CDN via a rule set within CloudFront that pulled them from Amazon S3’s highly durable, mission-critical data storage.
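The auto-scaling piece boils down to a target-tracking policy on the web tier: keep average CPU around a threshold and let AWS add or remove instances as traffic rises and falls. A rough boto3 sketch, with a made-up group name, sizes and target value:

```python
# A rough sketch of a target-tracking scaling policy on the web tier:
# keep average CPU around a target and let the group grow and shrink.
# Group name, sizes and target value are illustrative, not the real ones.
import boto3

autoscaling = boto3.client("autoscaling")

# Cap the group so it can grow well beyond its steady-state size.
autoscaling.update_auto_scaling_group(
    AutoScalingGroupName="fyre-web-asg",  # hypothetical group name
    MinSize=2,
    MaxSize=40,
)

autoscaling.put_scaling_policy(
    AutoScalingGroupName="fyre-web-asg",
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,  # aim for ~50% average CPU across the fleet
    },
)
```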
Ticket processing and payments were handled by a third-party platform integrated via an API, but content still needed to be updated to reflect availability (and other things the promoters decided to add to the site at certain points), which meant purging the edge caches.
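Conceptually, that purge is a CloudFront invalidation against the paths that changed. A minimal boto3 sketch, with a hypothetical distribution ID and example paths:

```python
# A minimal sketch of purging updated pages from the edge caches after a
# content change. The distribution ID and paths are placeholders.
import time
import boto3

cloudfront = boto3.client("cloudfront")

cloudfront.create_invalidation(
    DistributionId="E1EXAMPLE123",  # hypothetical distribution ID
    InvalidationBatch={
        "Paths": {
            "Quantity": 2,
            "Items": ["/tickets*", "/index.html"],  # only the paths that changed
        },
        "CallerReference": str(time.time()),  # any unique string per request
    },
)
```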
Following launch, with influencers and global media driving traffic, the FYRE site received more than 6 million visitors in the first 3 hours, with zero downtime.
Glasto, Coachella, LFC Ticketing Department etc - get in touch!