
SOASTA is now delivering performance insights for digital businesses with the addition of Third-Party Resource Analytics, Conversion Impact and Activity Impact scores, Session Path Analysis, and Predictive Analytics from SOASTA’s Data Science Workbench (DSWB) platform.
Brands can now grow revenue and deliver great customer experiences by improving their digital performance in real time, with precise insights based on their real user data. With this release, SOASTA is evolving how companies view their user events, enabling them to make better decisions for their digital business. SOASTA's data-driven approach creates a baseline for Digital Performance Management; from that baseline, companies can manage their digital assets more effectively.
Built on data from SOASTA mPulse, which collects all user performance data from web assets, DSWB allows enterprises to:
- Understand exactly how many third-party resources make up their pages
- Identify the most problematic third-party resources needing remediation across a website
- Prioritize which pages and resources to focus on first
- Find out which domain servers are responsible for the greatest performance issues
Third-Party Resource Analytics
Because SOASTA's DSWB now offers deep analytics and powerful visualization capabilities for third-party resources, digital businesses can easily manage resources that were previously outside their control and that have long degraded site and application performance.
“Today, most commercial web and mobile user experiences depend heavily on third-party content to deliver everything from syndicated content to video to advertising to interaction with social media,” explained SOASTA Executive Chairman and Founder Ken Gardner. In fact, sites often have 50 percent or more of their resources originating from third parties, creating persistent challenges in performance management.
“It is quite common for us to discover that customers and prospects don’t know how many third-party resources are being used and where, much less the impact of those resources on site performance and, ultimately, user outcomes,” Gardner explained. “In other words, if you’re not managing the performance of third parties, you’re not managing your company’s digital performance.”
SOASTA third-party analytics addresses this challenge by capturing detailed information for each resource loaded on a page: the domain that served it, the resource's type and size, and its performance. Resources served from third-party domains can be analyzed together with or separately from first-party domains.
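To make the idea concrete, here is a minimal sketch of that kind of per-resource, per-domain analysis. It assumes beacon data shaped loosely like browser Resource Timing entries; the field names (`url`, `transferSize`, `duration`) and helper functions are illustrative, not SOASTA's actual API.

```javascript
// Classify a resource as third-party when its host is neither the page's
// host nor a subdomain of it (a simplification of real-world rules).
function isThirdParty(resourceUrl, pageHostname) {
  const host = new URL(resourceUrl).hostname;
  return host !== pageHostname && !host.endsWith("." + pageHostname);
}

// Roll resources up by serving domain: count, total bytes, total load time.
function summarizeByDomain(entries, pageHostname) {
  const stats = {};
  for (const e of entries) {
    const host = new URL(e.url).hostname;
    if (!stats[host]) {
      stats[host] = {
        thirdParty: isThirdParty(e.url, pageHostname),
        count: 0,
        bytes: 0,
        totalMs: 0,
      };
    }
    stats[host].count += 1;
    stats[host].bytes += e.transferSize;
    stats[host].totalMs += e.duration;
  }
  return stats;
}

const page = "shop.example.com";
const entries = [
  { url: "https://shop.example.com/app.js", transferSize: 120000, duration: 80 },
  { url: "https://cdn.adnetwork.io/tag.js", transferSize: 45000, duration: 310 },
  { url: "https://cdn.adnetwork.io/pixel", transferSize: 600, duration: 95 },
];
const byDomain = summarizeByDomain(entries, page);
console.log(byDomain["cdn.adnetwork.io"]); // 2 third-party resources, 45600 bytes
```

A summary like this makes it easy to rank serving domains by resource count, payload, or cumulative load time, which is the prioritization described above.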
Conversion Impact Score and Activity Impact Score
The Conversion Impact Score and Activity Impact Score offer the industry’s most effective method to prioritize page group optimization by user sensitivity to performance, relative to conversion and session length. Both visualizations provide clear guidance to teams wanting to identify the highest-priority pages for remediation and fix the pages most important to their company’s business first.
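SOASTA does not publish the scoring formula, but the underlying idea of ranking page groups by how sensitive conversions are to performance can be sketched as follows. This toy version correlates load time with conversion outcome per page group and weights by traffic volume; every name and threshold here is an assumption for illustration only.

```javascript
// Pearson correlation between two equal-length numeric arrays.
function pearson(xs, ys) {
  const n = xs.length;
  const mx = xs.reduce((a, b) => a + b, 0) / n;
  const my = ys.reduce((a, b) => a + b, 0) / n;
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    dx += (xs[i] - mx) ** 2;
    dy += (ys[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

// Each sample: a page-group load time (ms) and whether the session converted.
// A strong negative correlation (slower => fewer conversions) drives the
// score; weighting by sample count pushes high-traffic groups to the top.
function impactScore(samples) {
  const times = samples.map(s => s.loadMs);
  const conv = samples.map(s => (s.converted ? 1 : 0));
  return Math.max(0, -pearson(times, conv)) * samples.length;
}

const checkout = [
  { loadMs: 900, converted: true },
  { loadMs: 1200, converted: true },
  { loadMs: 3800, converted: false },
  { loadMs: 4100, converted: false },
];
console.log(impactScore(checkout) > 0); // slower sessions here convert less
```

Computing such a score for each page group and sorting descending yields exactly the kind of remediation priority list the visualizations are meant to provide.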
“Marketing and eCommerce teams that look to customer data and analytics to drive better marketing, customer experiences and sale completion rates know that along with content, offers and design, app performance also drives customer success,” wrote Milan Hanson and James McCormick of Forrester Research in the February 2016 report, Brief: Take Application Performance To The Next Level With Digital Performance Management. With the Conversion Impact and Activity Impact scores, any company relying on marketing campaigns and conversion for their revenue streams has a powerful option that connects IT and business segments around their business goals.
Session Path Analysis
Session Path Analysis shows the paths users take through an application and the performance of those pathways. It helps customers understand how users enter and navigate a site while also illustrating the performance impact of each page in the session. This allows teams to understand how performance affects user sessions, for both users who convert and users who abandon the site.
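The mechanics can be sketched as a simple aggregation over session beacons: count page-to-page transitions and average per-page load times. The session shape (`{ page, loadMs }` hits in order) is a hypothetical stand-in for real beacon data.

```javascript
// Aggregate ordered sessions into (a) transition counts between pages and
// (b) average load time per page.
function aggregatePaths(sessions) {
  const edges = {};  // "A -> B" transition counts
  const pageMs = {}; // per-page totals for averaging
  for (const session of sessions) {
    for (let i = 0; i < session.length; i++) {
      const { page, loadMs } = session[i];
      if (!pageMs[page]) pageMs[page] = { total: 0, hits: 0 };
      pageMs[page].total += loadMs;
      pageMs[page].hits += 1;
      if (i > 0) {
        const edge = session[i - 1].page + " -> " + page;
        edges[edge] = (edges[edge] || 0) + 1;
      }
    }
  }
  const avgMs = {};
  for (const p in pageMs) avgMs[p] = pageMs[p].total / pageMs[p].hits;
  return { edges, avgMs };
}

const sessions = [
  [{ page: "home", loadMs: 800 }, { page: "search", loadMs: 1400 }, { page: "product", loadMs: 2100 }],
  [{ page: "home", loadMs: 750 }, { page: "search", loadMs: 1600 }],
];
const { edges, avgMs } = aggregatePaths(sessions);
console.log(edges["home -> search"]); // 2
console.log(avgMs["home"]);           // 775
```

Overlaying the `avgMs` figures on the `edges` graph is the essence of showing "the performance of those pathways": you can see both where users go and how slow each step is.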
Predictive Analytics
DSWB's predictive analytics capabilities are based on historical trends within a company's web properties. Tolerance bands are modeled from a company's entire performance analytics history and drive "smart alerting": real-time data is compared against machine learning models benchmarked from that history, regardless of industry, allowing IT Ops and Marketing teams tracking campaigns to make better real-time decisions about performance remediation and changes to campaigns as they run. According to the Forrester report cited above, "Firms want predictive analytics that bind together performance and business data to produce actionable insights for business decisions."
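SOASTA's models are machine-learned from full history, but the tolerance-band idea itself can be illustrated with a much simpler stand-in: a band of the historical mean plus or minus k standard deviations, with an alert fired when a live reading falls outside it. The data and the k = 3 threshold are assumptions for illustration.

```javascript
// Build a tolerance band (mean +/- k standard deviations) from history.
function toleranceBand(history, k = 3) {
  const n = history.length;
  const mean = history.reduce((a, b) => a + b, 0) / n;
  const variance = history.reduce((a, b) => a + (b - mean) ** 2, 0) / n;
  const sd = Math.sqrt(variance);
  return { low: mean - k * sd, high: mean + k * sd };
}

// "Smart alert": flag a live reading only when it leaves the band,
// rather than alerting on any fixed static threshold.
function smartAlert(history, current, k = 3) {
  const band = toleranceBand(history, k);
  return current < band.low || current > band.high;
}

// Median page-load times (ms) from recent history vs. two live readings.
const history = [1800, 1900, 1750, 1850, 1820, 1780, 1880, 1810];
console.log(smartAlert(history, 1900)); // false: within normal variation
console.log(smartAlert(history, 4200)); // true: far outside the band
```

The advantage over static thresholds is that the band adapts to each property's own history, so a page that is "normally slow" does not page the on-call team, while a genuine regression does.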