The Challenge: Bot Attacks Straining Web Performance & Resources
As a popular online retailer, Coop became a prime target for bot attacks. Relentless scraping bots placed a heavy load on the company's servers, significantly slowing page load times and degrading the experience of genuine customers. Because Google penalizes slow websites, the slowdowns also hurt Coop's SEO rankings, reducing its visibility in search results.
Tobias Schläpfer, one of Coop's Backend Developers for web applications, had to tackle the problem head-on. "We tried multiple low-level protections, such as a WAF and reCAPTCHA, but it became clear that traditional solutions were no longer enough," Tobias shares.
In addition to the scraping, Tobias and his team discovered that bots were exploiting some of the website's features, such as "find in stock" and "find a store", which relied on the Google API. This excessive use overloaded the API, resulting in extra operational costs of $5,000-$10,000 a month, and legitimate users often found the service unavailable, leading to a poor user experience.
"Our IT teams were burdened with the manual task of analyzing traffic to identify and block bad IP addresses," Tobias recalls, "which was time-consuming and inefficient, as blocking an IP only provided temporary relief before bots would reappear using new addresses."
Together, these problems hampered the website's performance and drained internal resources, prompting Coop to look for a robust solution capable of effectively detecting and mitigating malicious bots.