“Did you read about the 300 Gbps DDoS attack last week? We must test our protection against an attack of that size.”
Indeed, DDoS simulation testing, which launches a controlled DDoS attack against your own environment, is the best way to evaluate whether you are prepared to defend against a real distributed denial of service attack.
While most organizations are confident they’re protected, simulation testing almost always uncovers a gap between perceived and actual protection levels. The first time we run a DDoS attack simulation for a customer, the results nearly always come as a surprise.
Nevertheless, DDoS simulation testing at very high traffic volumes is not necessary. You can effectively validate your protection with much smaller attack simulations. Here are the reasons why.
It’s Not Just About Gbps. There’s Also PPS and RPS
Gbps is used to measure volumetric attacks, also known as floods, which are straightforward attempts to overwhelm a website’s bandwidth with as much traffic as the attacker can generate. Because these are the best-known type of DDoS attack, many enterprise customers focus all their attention on Gbps.
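The arithmetic behind a flood is blunt: once attack traffic exceeds the capacity of the link in front of your site, legitimate users are squeezed out no matter how powerful the infrastructure behind it is. A minimal sketch, using a hypothetical 10 Gbps uplink rather than figures from any real test:

```python
# Back-of-the-envelope view of a volumetric flood: how far it
# oversubscribes the internet-facing link. All figures are hypothetical.

UPLINK_GBPS = 10                      # assumed uplink capacity
attack_sizes_gbps = [5, 10, 50, 300]  # illustrative attack volumes

for attack in attack_sizes_gbps:
    ratio = attack / UPLINK_GBPS
    status = "link saturated" if ratio >= 1 else "headroom remains"
    print(f"{attack:>4} Gbps flood vs {UPLINK_GBPS} Gbps uplink: "
          f"{ratio:.1f}x capacity ({status})")
```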
However, protocol attacks (on layers 3 and 4), which are no less destructive, are measured differently: in packets per second (PPS). PPS measures the intensity of the packet stream flooding network-level protocols or devices. A SYN flood is one common protocol attack measured in PPS.
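The reason PPS is a separate dimension from Gbps is packet size: a SYN flood uses very small packets, so even a modest-bandwidth attack translates into millions of packets per second that state tables and network devices must absorb. The sketch below assumes roughly 64-byte frames, an illustrative minimum rather than a measured value:

```python
# Convert attack bandwidth (Gbps) into packets per second (PPS),
# assuming small SYN-flood packets of ~64 bytes, an illustrative figure.

def gbps_to_pps(gbps: float, frame_bytes: int = 64) -> float:
    bits_per_packet = frame_bytes * 8
    return gbps * 1e9 / bits_per_packet

for gbps in (1, 10, 100):
    millions = gbps_to_pps(gbps) / 1e6
    print(f"{gbps:>3} Gbps of 64-byte SYN packets ≈ {millions:,.1f} million PPS")
```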
Then there are application-level attacks, which are measured in requests per second (RPS) or connections per second, indicating the number of requests sent to a single online application. In this case, the attack is highly focused and intended to overwhelm the CPU and memory on which the application depends.
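To make the RPS unit concrete, here is a minimal sketch of a controlled request generator that reports the rate it actually achieves against a test endpoint you own and are authorized to test. The URL, request count, and concurrency are placeholders, and this is an illustration of the measurement, not a template for attack tooling:

```python
# Minimal RPS illustration using only the standard library: send a small,
# fixed number of concurrent GET requests to an endpoint you control and
# report the achieved request rate. All parameters are placeholders.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TEST_URL = "https://test.example.com/health"  # hypothetical endpoint you own
TOTAL_REQUESTS = 200
CONCURRENCY = 20

def fetch(_):
    try:
        with urllib.request.urlopen(TEST_URL, timeout=5) as resp:
            return resp.status
    except Exception:
        return None  # failures still count toward elapsed time

start = time.monotonic()
with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    results = list(pool.map(fetch, range(TOTAL_REQUESTS)))
elapsed = time.monotonic() - start

ok = sum(1 for status in results if status == 200)
print(f"{TOTAL_REQUESTS} requests in {elapsed:.1f}s "
      f"≈ {TOTAL_REQUESTS / elapsed:.0f} RPS ({ok} returned 200)")
```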
In our DDoS testing, we methodically use each of the three kinds of attack. We often discover that while a company can protect against volumetric attacks, it fails to detect or mitigate the other types.
Your Protection Architecture and Configuration Are the Most Critical Factors
The impact of less-than-optimal configurations in your DDoS protection tools shows up right away in lower-volume attack simulations. Misconfigured rate-limit thresholds, web challenges, bot protection, geo-protection, and the like do not depend on the size of the DDoS attack, but they are absolutely critical to effective mitigation.
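A rate limit that is set too high, for example, simply never fires, and a low-volume probe is enough to reveal that. The sketch below steps through increasing request rates against a staging endpoint you own and notes when mitigation responses begin; the endpoint, rate steps, and status codes are assumptions for illustration, not a description of any particular vendor’s behavior:

```python
# Low-volume sanity check of a rate-limit threshold: step through rising
# request rates against a staging endpoint you are authorized to test and
# record when mitigation responses (assumed here to be HTTP 429 or 403)
# start appearing. Endpoint and rate steps are hypothetical placeholders.
import time
import urllib.error
import urllib.request

STAGING_URL = "https://staging.example.com/"  # endpoint you are authorized to test
RATE_STEPS = [1, 5, 10, 25, 50]               # requests per second to attempt
WINDOW_SECONDS = 10                           # how long to hold each rate

def probe(rate: int) -> int:
    """Send roughly `rate` requests per second for WINDOW_SECONDS; count 429/403s."""
    blocked = 0
    for _ in range(rate * WINDOW_SECONDS):
        try:
            urllib.request.urlopen(STAGING_URL, timeout=5).close()
        except urllib.error.HTTPError as err:
            if err.code in (403, 429):
                blocked += 1
        except Exception:
            pass  # timeouts and connection errors are ignored in this sketch
        time.sleep(1 / rate)  # sequential on purpose: this is a probe, not a load test
    return blocked

for rate in RATE_STEPS:
    print(f"{rate:>3} req/s held for {WINDOW_SECONDS}s -> {probe(rate)} mitigated responses")
```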
In the thousands of DDoS tests we’ve run, we’ve encountered such architecture and configuration problems again and again – and they are regularly detected in 10 Gbps attack simulations.
In fact, there are only a few kinds of companies that require large-scale volumetric testing: ISPs, cloud computing providers, and DDoS mitigation vendors. These companies rely on their own protection equipment to mitigate massive flood attacks, so they must extend testing to cover volumetric attacks at the scale of 300 Gbps or above.
In short, if your organization’s mitigation measures are effective against 10 Gbps attacks, then rest assured they can also handle 200 and 300 Gbps attacks of the same kind.