This year, 2016, is set to host the "battle of the bots," as bot-generated attacks targeting web application infrastructure increase in both volume and scope, according to a recent Radware survey profiling the cybersecurity threats expected to grow in the coming year.
One important fact to note is that not all bots are bad. There are plenty of bots and computer-generated traffic programs that are essential for the daily support and maintenance of web applications. Some prominent examples include search engine bots, such as Baidu Spider and Bingbot. While these bots exist to support the infrastructure, IT managers do need to be aware of the bad bots out there, as they are also numerous, and can pose a serious threat to web application performance.
These bad bots generate a variety of web attacks, some of the most common being SQL injection, Cross-Site Request Forgery (CSRF), web scraping, and, of course, the ever-looming threat of DDoS attacks.
Every web administrator knows the fear: application performance slows to a crawl, then the application crashes entirely, all because of a massive, unforeseen influx of traffic from a botnet. Web applications simply can't absorb that volume of traffic, and performance suffers.
Since humans can be just as great a threat to web applications as bots, it's vital for organizations to be able to distinguish between human and bot activity in order to properly mitigate threats. One common form of detection is the CAPTCHA challenge, a reverse Turing test used to gauge whether a client is a human or a computer program mimicking human behavior. However, while this practice is an acceptable means of detecting simple, script-based bots, the rise of "advanced bots" has posed a challenge to the IT industry.
These newer, more sophisticated bots are based on headless browser technology and significantly complicate the detection process. Advanced bots can mimic human user behavior to a much higher degree than their script-based counterparts, using techniques such as running JavaScript and following links graphically to trick detection protocols into classifying their activity as human. These bots are also capable of passing CAPTCHA challenges and rotating through dynamic IP addresses, which allows them to maintain low rates of activity per individual IP across a bot network, thus evading IP-based detection parameters.
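To make that concrete, detection logic often starts by checking for the telltale traces that automation frameworks leave in the browser environment. The sketch below is a minimal, illustrative example of such client-side checks, not any particular vendor's method; the properties inspected are standard browser APIs, and, as noted above, advanced bots can spoof every one of them, which is exactly why the fingerprint-based correlation discussed in the next section is still needed.

```typescript
// Minimal sketch of first-pass headless-browser heuristics run in the browser.
// These are real, commonly inspected properties, but advanced bots can spoof
// each of them, so they serve only as a coarse first filter.

interface HeadlessSignals {
  webdriverFlag: boolean;       // navigator.webdriver is true under automation
  noPlugins: boolean;           // headless builds often report zero plugins
  noLanguages: boolean;         // an empty languages list is another red flag
  suspiciousUserAgent: boolean; // e.g. "HeadlessChrome" in the user agent string
}

function collectHeadlessSignals(): HeadlessSignals {
  return {
    webdriverFlag: navigator.webdriver === true,
    noPlugins: navigator.plugins.length === 0,
    noLanguages: navigator.languages.length === 0,
    suspiciousUserAgent: /HeadlessChrome|PhantomJS/i.test(navigator.userAgent),
  };
}

// A simple score: the more signals fire, the more likely the client is automated.
function headlessScore(signals: HeadlessSignals): number {
  return Object.values(signals).filter(Boolean).length;
}
```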
Defending Against the Bots
So how can organizations defend themselves against such sophisticated bots?
The first step is to ensure the use of IP-agnostic bot detection, as successful detection requires correlation across sessions. Without this correlation, it can be highly challenging to detect advanced bots jumping from IP to IP. Relying solely on IP-based detection is not sufficient and can conceal larger threats. To create this IP-agnostic system, fingerprinting is required.
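As a rough illustration of what IP-agnostic correlation can look like, the sketch below keys request activity on a device fingerprint rather than on the source IP, so a bot rotating through addresses still accumulates into a single tracked profile. The thresholds and the fingerprint value are hypothetical placeholders, not part of any specific product.

```typescript
// Minimal sketch: correlate sessions by device fingerprint instead of by IP,
// so a bot rotating through many addresses still builds up one profile.

interface RequestEvent {
  fingerprint: string; // assumed to be computed elsewhere (see the fingerprint sketch below)
  ip: string;
  timestamp: number;
}

class FingerprintTracker {
  private activity = new Map<string, { ips: Set<string>; requests: number }>();

  record(event: RequestEvent): void {
    const entry = this.activity.get(event.fingerprint) ?? { ips: new Set<string>(), requests: 0 };
    entry.ips.add(event.ip);
    entry.requests += 1;
    this.activity.set(event.fingerprint, entry);
  }

  // Hypothetical thresholds: many requests spread across many IPs is suspicious,
  // even though each individual IP stays under any per-IP rate limit.
  isSuspicious(fingerprint: string, maxRequests = 1000, maxIps = 20): boolean {
    const entry = this.activity.get(fingerprint);
    return !!entry && entry.requests > maxRequests && entry.ips.size > maxIps;
  }
}
```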
The use of device fingerprinting offers IT managers the ability to identify browsers or automated web client tools through data collection. Fingerprinting tools can collect information in various forms, such as operating system specifications, TCP/IP configuration, underlying hardware attributes, and browser attributes. Commonly, this data is collected through JavaScript processing, although some types, such as the TCP/IP configuration, can be collected passively without obvious querying.
A wide range of client-side browser attributes can be collected to form a device fingerprint. While any single attribute may seem common, it is the consolidation and combination of this information that yields sufficiently distinct device fingerprints, as the sketch below illustrates.
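As a rough sketch of that consolidation step, the example below gathers a handful of browser attributes and hashes them into a single identifier. The attribute list is deliberately small and the hash is a toy FNV-1a digest; production fingerprinting collects many more signals (canvas rendering, fonts, hardware details) and uses stronger hashing, but the principle is the same.

```typescript
// Minimal sketch of fingerprint consolidation: individually weak attributes
// are concatenated and hashed into one identifier. The attribute selection
// and the hash below are illustrative only.

function collectAttributes(): string[] {
  return [
    navigator.userAgent,
    navigator.language,
    String(screen.width) + "x" + String(screen.height),
    String(screen.colorDepth),
    Intl.DateTimeFormat().resolvedOptions().timeZone,
    String(navigator.hardwareConcurrency),
  ];
}

// Toy 32-bit FNV-1a hash; real deployments would use a stronger digest.
function hashFingerprint(attributes: string[]): string {
  let hash = 0x811c9dc5;
  for (const ch of attributes.join("||")) {
    hash ^= ch.charCodeAt(0);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash.toString(16);
}

const deviceFingerprint = hashFingerprint(collectAttributes());
```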
As attacks by advanced bots become increasingly common, the maintenance of an IP-agnostic detection environment is becoming more critical, as is the ability to track bots jumping across IPs via a single, consistent fingerprint.
Finally, it's important to gauge the threat to applications across multiple attack vectors. An application DDoS attack may target specific resources, while a data-focused scraping attack is typically aimed at specific web pages with the goal of information extraction. Be sure to apply device fingerprinting where it makes the most sense, whether that be a single point of interest within an application or a global implementation across domain resources.
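One way to picture that choice is as a question of middleware scope. The sketch below uses an Express-style app and a hypothetical fingerprintCheck middleware (neither Express nor the header name appears in the original article; both are assumptions for illustration) to show the difference between protecting a single high-value endpoint and applying the check across every route.

```typescript
// Sketch of scoping a fingerprint check in an Express-style app.
// `fingerprintCheck` and the `x-device-fingerprint` header are hypothetical;
// a real deployment would validate a server-computed fingerprint instead.
import express, { Request, Response, NextFunction } from "express";

const app = express();

function fingerprintCheck(req: Request, res: Response, next: NextFunction): void {
  const fingerprint = req.header("x-device-fingerprint"); // assumed client-supplied
  if (!fingerprint) {
    res.status(403).send("Fingerprint required");
    return;
  }
  next();
}

// Option 1: a single point of interest, e.g. a scraping-prone pricing endpoint.
app.get("/api/pricing", fingerprintCheck, (_req, res) => {
  res.json({ plans: [] });
});

// Option 2: global implementation across all application resources.
// app.use(fingerprintCheck);

app.listen(3000);
```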
Kent Alstad is VP of Acceleration at Radware.