A Tale of Two Orgs: A Deep Dive into WWT's Product Testing Evolution
When it comes to product selection, and all the research and testing that goes with it, how can we be certain the right solutions are chosen?
With such a saturated marketplace, finding effective solutions that optimize your existing IT portfolio and complement the enterprise ecosystem can be daunting. We can no longer rely on industry reports and original equipment manufacturer (OEM)-led proof of concept (POC) testing alone. That is why it's time we reevaluate the way we evaluate our solution options. Here, I'll use two customer examples to show why a balanced, holistic approach is the way to go: get absolutely clear about desired outcomes first, then let your business goals drive the entire process until those outcomes are achieved.
The importance of business outcomes
Ideas are exciting. But what good are they unless we spin them into measurable outcomes? Our success, as leaders, teams, IT organizations, and ultimately the business at large, is measured by the goals we achieve and how efficiently and consistently we achieve them.
Determining your product procurement goals, or outcomes, is far more complicated than it used to be. Those were simpler times: during my early years in technology, I was very narrow-minded about outcomes. My goals were as simple as installing new tools, rolling out advanced features, or consolidating firewalls; they were task-oriented and easily achievable. Goal set, outcome achieved, no problem. But that was back then.
We are now in a time of rapid transformation, where cyber attackers and solution developers are in a cutthroat race to outdo each other. We are dealing with an extremely volatile threat landscape, with malicious email threats up 600 percent and ransomware attacks occurring roughly every 11 seconds, causing damages that can run to tens of millions of dollars per hit. All of this while organizations struggle with skilled labor shortages, challenging regulatory changes and constrained budgets.
Our old habit of going too deep, too fast by focusing primarily on technical capabilities of a product or solution just won't cut it anymore. We need to look at the needs of the organization as a whole, taking into account tangible business outcomes, rather than break/fix resolutions, particularly when it comes to product comparisons and solution validation.
Our problems today are vastly different than what they once were and therefore require different approaches and new solutions. We need smarter frameworks and methodologies and innovative tools to accelerate us towards achieving more sophisticated business outcomes. And the answer lies in restructuring the product evaluation methodology to include a business value component. Usually, the testing is done by the technologists who are removed from the business end of the operation, so we need to figure out how to ensure the right questions are being asked early and often throughout the entire process from idea to outcome. Some of these questions may include:
- How can we use IT transformation to save us money?
- How can new solutions increase the velocity of my CI/CD (continuous integration/continuous delivery) pipeline?
- How can a new product become a value-add rather than simply another stopgap?
- How can we reduce our risk profile, not only in terms of security but also in terms of resiliency?
- How can we ensure that the systems we are securing are becoming more resilient to both attacks and outages?
Pardon my cynicism
When customers come to WWT for product evaluation, two inputs are often mentioned during initial conversations about product efficacy: OEM-led POCs and industry reports (Gartner, Forrester, NSS Labs, MITRE, etc.). Now, don't get me wrong, both can contribute to well-informed decision-making. But your IT environment is a unique and complex landscape that simply cannot be guided by those inputs alone.
Industry reports offer valuable statistics and insights, for sure. But unfortunately, some of these sources are on the payroll of the manufacturers they cover. In any case, the reports — usually written by business analysts — can only show you facts and figures unrelated to your organization's technology requirements and business needs.
When it comes to OEM-led POCs, well, they are strategically built into the sales cycle, and buyers have become accustomed to them. But I'm coming at you from a vendor-neutral perspective, so the biased slant is hard for me to ignore. Testing products in isolation, with test plans written by the manufacturer doing the testing, may be a perfectly viable approach for validating a product's features, but how can that assure you maximum efficacy within your environment? It's a one-sided approach, to say the least.
Through experience and reflection come transformation
My first big ah-ha moment in product testing came nearly a decade ago at WWT, when an organization asked my team to help them evaluate a few endpoint detection and response (EDR) solutions to replace their current endpoint protection platform (EPP). They told us they had an incumbent product, wanted to test alternatives from two different OEMs, and asked what we could do for them. For this, we knew immediately, an OEM-led approach simply wouldn't cut it.
Now, keep in mind that this was before WWT had a malware lab, the cyber range, or its other various testbeds. So, we did what we had to do and built our own lab to perform head-to-head testing. Imagine us, rolling servers around on carts, running cables between rooms, downloading malware samples via Cradlepoint nodes, etc. It was not pretty, but we got the job done. We completed the testing several weeks sooner than the customer could have alone, and for that they were thrilled.
But my team and I did not share in the excitement. Rather, when the project wrapped, I remember feeling unfulfilled, as though we could have done so much more to support this customer's solution integration as a whole. We performed their validation testing as a mere point-in-time exercise, yet took no part in what happened before or after the product testing.
At WWT, it's not just about the sale for us; it's about stewardship. So, knowing we could have brought more to the table caused us to reevaluate our approach to product testing. We had to ask ourselves: how could we add more value for this customer before and after validation testing? The answer to that question has taken shape in the form of WWT's ever-evolving Advanced Technology Center, Cyber Range, malware lab and unparalleled Lab Hosting practice.
A new, holistic approach to product testing
At WWT, successful customer engagements are measured by variables such as expense reduction, speed to market, rapid innovation, increased resilience, improved productivity and the like. These focus areas drove our evolution towards the holistic, comprehensive approach we take now. When helping customers find new products and solutions, we think in the long term, with consideration for factors like time-to-value, level of operationalization and ease of management.
WWT has found there is so much more value to gain beyond the product selection and cutover. Let's look at a more recent customer engagement as an example…
One of the world's leading financial services organizations, with hundreds of thousands of devices across multiple continents, came to WWT for product testing ahead of a next-generation firewall platform migration. The customer arrived with a predefined testing methodology and executive advocacy for a particular OEM, which meant there wasn't a lot of leeway. However, the customer sought our industry insight and experience testing similar products for similar organizations, so there was still plenty of room for an authentic approach.
Our first step was to dissect the customer's evaluation strategy, taking into account both technical and non-technical factors. Solution efficacy is extremely important, accounting for about 40 percent of the decision, but equally important are the technical impacts a product can have on the infrastructure — think network traffic, data center operations, internet circuits, cloud computing and cross-product functionality. When it comes to indirect impacts, we factored in aspects like user experience, portfolio impact, investment and resource rationalization, framework mapping and solution gap resolution.
Once we captured these technical and non-technical factors, we assigned weights to each of them based on customer requirements and goals, using cumulative scoring to get as granular as possible. This looks something like: "Okay, solution A is actually a better technical solution, but when it comes to meeting non-technical goals it's not a great solution for your organization, and here's why..."
We then put together what we like to call a "paper POC": a numerical evaluation of a product based solely on features and functionality. This boils the field of OEM products being considered down to only the top few contenders, saving time and resources and speeding up the RFP process significantly. In this case, our SMEs performed a paper POC, cross-checking each solution's features and functions against the customer's requirements and goals, and from there we filtered the product testing count from eight choices down to only a few.
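To make the weighted, cumulative scoring and paper POC shortlist a bit more concrete, here is a minimal sketch of how such a tally might look. The criteria, weights, OEM names and scores below are hypothetical placeholders, not the actual figures from this engagement.

```python
# Illustrative weighted scoring behind a "paper POC" shortlist.
# Criteria, weights, candidate names and scores are hypothetical,
# not the real data from the engagement described above.

CRITERIA_WEIGHTS = {
    "solution_efficacy": 0.40,      # technical effectiveness of the product
    "infrastructure_impact": 0.20,  # network traffic, data center, circuits, cloud
    "user_experience": 0.15,
    "portfolio_fit": 0.15,          # investment/resource rationalization, gap resolution
    "operational_overhead": 0.10,   # ease of management, time-to-value
}

# Each candidate is scored 1-5 against every criterion by the SMEs.
candidates = {
    "OEM A": {"solution_efficacy": 5, "infrastructure_impact": 3, "user_experience": 4,
              "portfolio_fit": 2, "operational_overhead": 3},
    "OEM B": {"solution_efficacy": 4, "infrastructure_impact": 4, "user_experience": 4,
              "portfolio_fit": 5, "operational_overhead": 4},
    "OEM C": {"solution_efficacy": 3, "infrastructure_impact": 2, "user_experience": 3,
              "portfolio_fit": 3, "operational_overhead": 2},
}

def weighted_score(scores):
    """Cumulative score: sum of each criterion score times its weight."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

# Rank the field and keep only the top contenders for hands-on lab testing.
ranked = sorted(candidates.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
shortlist = [name for name, _ in ranked[:2]]

for name, scores in ranked:
    print(f"{name}: {weighted_score(scores):.2f}")
print("Shortlist for lab testing:", shortlist)
```

The point of the exercise is exactly the conversation described above: a technically superior product can still lose ground once the non-technical weights are applied.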
With WWT's paper POC results narrowing down the selection, the customer conducted their own RFP process, with WWT acting as a silent advisor when needed. The process was now focused and tactical, allowing the customer to spend more time going in depth with the selected OEMs.
Once the RFP process was complete, the customer redirected the OEMs to WWT for rigorous technical testing in our lab. We customized a reference architecture to meet the customer's requirements, with the right security policies, web proxies and foundational OS versions. OEM testing was done in parallel: we used mirrored environments sitting side by side and even ran tests at the same time so the customer could see how each solution compared against the other.
When the testing wrapped up and we delivered the report, the customer graciously thanked us, to which we replied, "Are you ready for the next step?" They looked at us, puzzled. What next step? They thought the product testing journey ended there. But after a decade of fine-tuning our process, we believed otherwise. We know that the technical winner among all the products isn't necessarily the best choice for any given environment. For a truly successful outcome, we must dig deeper, because benefit realization depends heavily on the procurement process, implementation and operationalization of the solution. We need to zoom out from the trees to see the larger forest, so to speak, moving beyond mere technical outcomes to achieve real business outcomes like faster time to market, risk reduction and optimal resiliency.
The customer quickly realized that technical testing was only part of the story. They faced logistical challenges due to the scale and global reach of procurement. Additionally, all the firewalls would need new configurations applied, whether onsite or elsewhere. Such operational challenges could have caused a cascade of problems impacting the timeline, cost and overall outcome of the product integration.
WWT was granted the opportunity to support this customer in achieving ideal outcomes through a holistic process beyond product testing. We helped them with things like solution simplicity, price transparency, measurability, budget predictability, value alignment, maximum feature extraction, and most notably, solution optionality. Many times, products can resolve a short-term issue but have long-term consequences because they don't adapt well to expansion or shifts in the environment; this causes vendor lock-in, and we helped them avoid that.
Having a comprehensive understanding of the customer's end goals, WWT was able to help the customer devise and execute a clearly defined migration strategy, saving the customer time, money, and resources, and mitigating risk. We leveraged WWT's global integration centers to streamline and automate the migration process. As their devices came online, they connected to one of our global provisioning centers, were authenticated, and then migrated into the customer's environment with the latest validated rule sets from the provisioning center. The process was swift, secure and efficient. The customer chose the right product, deployed it the right way and has a well-documented plan to ensure business goals can be easily achieved well into the future.
Lessons learned
We've come a long way in a decade's time. I'm now helping facilitate product evaluations for three of the largest banks in the world. From way back when to now, here are some of the lessons I've learned along the way…
Lesson 1. Build a real-world proving ground.
This is a place where you can get your hands dirty, throw some malware in there, and push it to its breaking point. Ultimately, you need an ecosystem with multiple solutions that can be tested head-to-head in an environment that's customized to meet your unique parameters. It's important to at least clone out a simple reference architecture. You need to be able to run the same tests against different solutions, and then compare and look for differences.
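To illustrate the head-to-head idea, here is a minimal sketch of a harness that runs a single shared test plan against each candidate and flags where the results diverge. The test cases, solution names and outcomes are placeholders for illustration, not a real product comparison.

```python
# Minimal head-to-head harness: one shared test plan, run identically
# against each candidate solution, then diffed for divergences.
# Test cases, solution names and outcomes are placeholders.

TEST_PLAN = [
    "block_known_malware_sample",
    "detect_lateral_movement",
    "quarantine_infected_host",
    "report_to_central_console",
]

def run_test(solution, test_case):
    """Placeholder: in a real lab, this would drive the actual product."""
    simulated_failures = {
        ("Solution A", "detect_lateral_movement"),
        ("Solution B", "report_to_central_console"),
    }
    return (solution, test_case) not in simulated_failures

solutions = ["Solution A", "Solution B"]
results = {s: {t: run_test(s, t) for t in TEST_PLAN} for s in solutions}

# Compare: flag any test case where the candidates diverge.
for test in TEST_PLAN:
    outcomes = {s: results[s][test] for s in solutions}
    if len(set(outcomes.values())) > 1:
        print(f"Divergence on {test}: {outcomes}")
```

The value is in the comparison step: when every solution runs the exact same plan in the same reference architecture, any difference in results points to a real gap rather than a quirk of the test setup.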
Lesson 2. Test the operating systems used in your server estate.
I can't say this enough: when you're applying security solutions to your estate, it's usually the server environment where the breakdown takes place. Unless you're running Windows 7 or another legacy workstation OS, you're probably not going to have a ton of problems with your desktop deployment. That's why we typically test two flavors of Windows desktop, two flavors of macOS and at least four Linux distros, and then focus on the servers. What you're going to find is that there's often a breakdown in feature parity across your Windows desktops, your Linux estate and your macOS fleet, and a big portion of that gap shows up in your server estate. Make sure you're performing testing not only with user workstations but also with the operating systems and configurations in your data center (where appropriate).
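One simple way to keep that coverage honest is to write the OS matrix down up front and treat it as a checklist every candidate must be exercised against. The versions below are illustrative only, not a recommended or required list.

```python
# Example OS coverage matrix for endpoint and server testing.
# The specific versions listed are illustrative, not a recommendation.

OS_MATRIX = {
    "windows_desktop": ["Windows 10", "Windows 11"],
    "macos": ["macOS 13", "macOS 14"],
    "linux": ["RHEL 9", "Ubuntu 22.04", "SLES 15", "Amazon Linux 2023"],
    "windows_server": ["Windows Server 2019", "Windows Server 2022"],
}

def coverage_gaps(tested):
    """Return every OS in the matrix that has not yet been tested."""
    return [os_name
            for family in OS_MATRIX.values()
            for os_name in family
            if os_name not in tested]

tested_so_far = {"Windows 10", "Windows 11", "macOS 14", "RHEL 9"}
print("Still untested:", coverage_gaps(tested_so_far))
```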
Lesson 3. Test with your web proxy configuration in place.
This is particularly true when testing things on your endpoints. Products function vastly differently when they have to pass traffic through a proxy before hitting the internet. SaaS-based applications often require whitelisting at the proxy level, and some of these security solutions require more than a hundred domains to be whitelisted. From an operational standpoint, that's a real challenge to implement and to manage long-term.
Lesson 4. Testing at scale is not as valuable as it may seem.
From a rigorous validation standpoint, it's extremely hard to simulate thousands of machines at scale, and adding randomized user behavior makes it even more complex. Often during scoping calls, customers will say, "Can you build me a lab with 20,000 endpoints in it?" And I tell them, "Sure, I could do that, but for the most part, that's going to be 20,000 endpoints that are all either sitting there idle or running a very similar script to emulate user behavior." Beyond the hefty cost and complexity of testing at that scale, most of the time we simply can't deploy thousands of machines in a lab environment. The real problem lies in simulating human interaction in a way that mirrors real users. We've found it's better to go small and be thorough. When needed, bring in your red teams or pen testers for stress testing, and then extrapolate the data to estimate what the values would look like at scale.
Lesson 5. Get your SOC involved in the validation.
Even better, bring in your red teams to launch attacks, get your blue teams involved in faux investigations, and validate until absolute certainty ensues!
The final word
Now it's plain to see that the product selection process, from idea to outcome, is far more sophisticated than it once was. Seeing the bigger picture means that when you approach product testing, you'll be thinking not only in terms of technical evaluation and validation, but also of the procurement, deployment and implementation aspects of the product and, most importantly, its operationalization in the business. It's crucial to ask the big questions early on because they help you make the right decisions faster. With a holistic approach to product testing, you can successfully deploy at scale, ensure an optimal user experience, fulfill wider business requirements and achieve ideal outcomes.