Bots Are Getting More Sophisticated, It's Time Your Cyber Defenses Do Too

Kent Alstad

2016 is shaping up to be the year of the "battle of the bots": bot-generated attacks targeting web application infrastructure are increasing in both volume and scope, according to a recent Radware survey profiling the cybersecurity threats expected to grow in the coming year.

Not all bots are bad. Many bots and other computer-generated traffic programs are essential to the daily support and maintenance of web applications; prominent examples include search engine crawlers such as Baidu Spider and Bingbot. But IT managers also need to be aware of the bad bots, which are just as numerous and can pose a serious threat to web application performance.

Bad bots drive a range of web attacks, the most common being SQL injection, cross-site request forgery (CSRF), web scraping, and, of course, the ever-looming DDoS attack.

Every web administrator knows the fear: application performance slows to a crawl and then crashes entirely, all because of a massive, unforeseen influx of traffic from a botnet that the application was never provisioned to handle.

Since human attackers can pose just as great a threat to web applications as bots, it's vital for organizations to distinguish between human and bot activity in order to mitigate threats properly. One common form of detection is the CAPTCHA challenge, a reverse Turing test that gauges a program's ability to mimic human behavior. While this practice is an acceptable means of detecting simple, script-based bots, the rise of "advanced bots" has posed a challenge to the IT industry.

These newer, more sophisticated bots are built on headless browser technology and significantly complicate detection. Advanced bots mimic human user behavior far more convincingly than their script-based counterparts, using techniques such as executing JavaScript and following links graphically to trick detection protocols into reading their actions as human activity. They are also capable of passing CAPTCHA challenges and rotating through dynamic IP addresses, which lets them keep the rate of activity per individual IP on the botnet low, evading IP-based detection thresholds.
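The arithmetic behind that last evasion technique is worth making explicit. A minimal sketch, using hypothetical numbers (the per-IP threshold and traffic volumes below are illustrative, not from any specific product):

```python
# Why per-IP rate limits miss a distributed bot: the attack volume is
# spread across so many addresses that no single IP looks abnormal.
PER_IP_THRESHOLD = 100           # hypothetical limit: requests/minute per IP

total_requests = 50_000          # one minute of aggregate bot traffic
botnet_ips = 1_000               # bot rotates across this many addresses

per_ip_rate = total_requests / botnet_ips   # 50 requests/min per IP
flagged = per_ip_rate > PER_IP_THRESHOLD

print(per_ip_rate, flagged)      # 50.0 False — the attack passes unnoticed
```

A 50,000-request-per-minute flood would cripple most applications, yet every individual address stays comfortably under the limit, which is exactly why the article argues for detection that does not key on IP alone.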

Defending Against the Bots

So how can organizations defend themselves against such sophisticated bots?

The first step is to ensure the use of IP-agnostic bot detection, since successful detection requires correlating activity across sessions. Without this correlation, it is extremely difficult to detect advanced bots jumping from IP to IP. Relying solely on IP-based detection is insufficient and can conceal larger threats. Creating an IP-agnostic system requires fingerprinting.

Device fingerprinting lets IT managers identify browsers and automated web client tools through data collection. These tools gather information in various forms, such as operating system specifications, TCP/IP configuration, underlying hardware attributes, and browser attributes. This data is commonly collected through JavaScript processing, although some of it, such as TCP/IP characteristics, can be gathered passively without obvious querying.

Many client-side browser attributes can be collected to form a device fingerprint. Individually, most of these attributes are common across clients; it is their consolidation and combination that yields sufficiently distinct device fingerprints.
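One way to picture that consolidation is hashing the combined attribute set into a single identifier. This is only a sketch of the general idea; the attribute names below are illustrative and not any vendor's actual schema:

```python
import hashlib
import json

def device_fingerprint(attributes: dict) -> str:
    """Combine collected client attributes into one stable identifier.

    Serializing with sorted keys makes the hash deterministic for the
    same attribute set, so repeat visits by the same client correlate.
    """
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Two clients sharing every attribute but one (illustrative values):
client_a = {"os": "Windows 10", "tcp_ttl": 128, "screen": "1920x1080",
            "font_count": 212, "timezone": "UTC-5"}
client_b = dict(client_a, timezone="UTC-8")

print(device_fingerprint(client_a) == device_fingerprint(client_b))  # False
```

Even though each individual attribute here is unremarkable on its own, a single differing value changes the combined fingerprint, which is what makes the aggregate discriminating.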

As attacks by advanced bots become increasingly common, maintaining an IP-agnostic detection environment grows more critical, as does the ability to track a bot hopping across IPs via a single, consistent fingerprint.
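Concretely, the payoff of IP-agnostic tracking is that grouping a request log by fingerprint exposes a client that grouping by IP hides. A toy illustration with made-up log entries:

```python
from collections import Counter

# Hypothetical request log: (client_ip, device_fingerprint)
requests = [
    ("203.0.113.5",  "fp-bot"),
    ("198.51.100.7", "fp-bot"),
    ("192.0.2.44",   "fp-bot"),
    ("203.0.113.9",  "fp-human"),
]

by_ip = Counter(ip for ip, _ in requests)
by_fp = Counter(fp for _, fp in requests)

# Per-IP view: every address made a single request, nothing stands out.
# Per-fingerprint view: one client is behind three different addresses.
print(max(by_ip.values()))   # 1
print(by_fp["fp-bot"])       # 3
```

The same principle scales to real traffic: correlating sessions on the fingerprint rather than the source address is what lets detection follow an advanced bot through its IP rotation.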

Finally, it’s important to gauge the threat to applications across multiple attack vectors. An application-layer DDoS attack may target specific resources, while a data-focused scraping attack is typically aimed at specific web pages with the goal of extracting information. Apply device fingerprinting where it makes the most sense, whether at a single point of interest within an application or globally across domain resources.

Kent Alstad is VP of Acceleration at Radware.
