Non-Work Related Internet Use Causes IT Disruptions

Sergio Galindo

More than one-third of those surveyed (38.6 percent) said their employer had suffered a major IT disruption caused by staff visiting questionable or other non-work-related websites on work-issued hardware, resulting in malware infections and other related issues, according to a study conducted by GFI Software. The study examined the ways workers use company-provided computers and laptops for personal activities, and the direct impact that personal use can have on the organization.

The study also revealed that more than 35 percent (35.8 percent) of staff would not hesitate to take company property, including email archives, confidential documents and other valuable intellectual property, from their work-owned computer before returning it if they were to leave the company.

Furthermore, the study revealed that nearly half of those surveyed (48 percent) use a personal cloud-based file storage solution (e.g., Dropbox, OneDrive, Box) to store and share company data and documents.

Key findings from the survey include:

■ 66.9 percent of respondents use their work-provided computer for non-work activities

■ Overall, 90.9 percent have at least some understanding of their company’s policy on usage, and 94.1 percent follow it to at least some degree

■ More than a quarter (25.6 percent) of those surveyed have had to get their IT department to fix their computer after an issue caused by innocent non-work use, while almost 6 percent (5.8 percent) had to do the same due to questionable use (adult sites, torrents, etc.)

■ 10 percent have lost data and/or intellectual property as a result of such disruptions

The study underscores that data protection is a significant problem, one exacerbated by the casual use of cloud file-sharing services that IT cannot centrally manage. Content controls are critical to ensuring that data does not leak outside the organization and does not expose the business to legal and regulatory compliance penalties. It is equally important that policies and training lay down clear rules on use and reinforce the ownership of data.
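
As a simple illustration of what such a content control might look like in practice, the sketch below flags proxy log entries in which an employee reached a personal cloud-storage domain. It is written in Python against a purely hypothetical log format and domain list, and is not a description of any specific product or of the controls discussed in the study.

    # Minimal sketch: flag outbound requests to personal cloud-storage services.
    # The domain list and the "<user> <domain> <bytes_sent>" log format are
    # illustrative assumptions, not a real product configuration.
    BLOCKED_DOMAINS = {"dropbox.com", "onedrive.live.com", "box.com"}

    def flag_personal_cloud_use(proxy_log_lines):
        """Return (user, domain) pairs for requests that hit a blocked domain."""
        hits = []
        for line in proxy_log_lines:
            user, domain, _bytes_sent = line.split()
            if any(domain == d or domain.endswith("." + d) for d in BLOCKED_DOMAINS):
                hits.append((user, domain))
        return hits

    if __name__ == "__main__":
        sample = [
            "alice dropbox.com 524288",
            "bob intranet.example.com 2048",
        ]
        for user, domain in flag_personal_cloud_use(sample):
            print(f"review: {user} sent traffic to {domain}")

In a real deployment this kind of check would run on a web proxy or secure gateway and trigger a block or an alert rather than a printout, but the principle is the same: known personal file-sharing destinations are identified and handled by policy rather than left to individual judgment.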

The blind, independent study was conducted for GFI Software by Opinion Matters and surveyed 1,010 U.S. employees at companies with up to 1,000 staff, all of whom had a company-provided desktop or laptop computer.

Sergio Galindo is GM of the Infrastructure Business Unit at GFI Software.
