If you’re a marketer, chances are you’ve had to select a new technology vendor or replace a vendor that’s not meeting expectations.
Either way, it’s a complicated and time-consuming undertaking. Vendor evaluations involve months of research, stakeholder coordination, sifting through vendor websites and piles of marketing collateral, and comparing qualifications, services and technical details, not to mention a never-ending procession of mind-numbing presentations.
These decisions often involve high-dollar technology investments, and they put your reputation on the line. Given the high stakes, many experts would advise you to run an A/B test or other experiment-based evaluation to objectively measure and compare how vendors actually perform in the field.
The idea of holding a vendor “bake-off,” which lets you base the decision on actual performance data, is great in concept. But in reality, these tests almost never happen. Why not?
- They’re a massive resource commitment. Marketers can expect four to eight weeks of preparation time for every week spent in the actual testing phase. You have to select vendor participants, design the experiment, work with IT, split users into groups, perform quality checks, run the test, compare outcomes, and ensure the proper tools and rigorous processes are in place every step of the way.
- Resource-intensive processes are error-prone. The smallest discrepancy in test design or execution can invalidate the results. At best, a botched test wastes the organization’s resources; at worst, it creates the risk of making decisions based on bad data.
- Human emotions favor incumbent vendors. We become invested in the people and firms we work with, which makes change hard. Few marketers have the stomach, or the political capital, to admit that they spent months of work and tied up valuable company resources selecting the wrong supplier.
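Splitting users into groups is one of those steps where a small discrepancy can invalidate the whole test. A common approach (this is an illustrative sketch, not a description of Signal’s actual mechanism) is to assign each user deterministically by hashing a salted user ID, so a returning visitor always lands in the same vendor’s group and the split stays roughly even:

```python
import hashlib

def assign_group(user_id: str, vendors: list, salt: str = "vendor-test-1") -> str:
    """Deterministically assign a user to one vendor's test group.

    Hashing the salted user ID yields a stable, roughly uniform split:
    the same user always maps to the same group, with no state to store.
    """
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    return vendors[int(digest, 16) % len(vendors)]

vendors = ["vendor_a", "vendor_b"]
print(assign_group("user-12345", vendors))
```

Changing the salt re-randomizes the assignment for a fresh experiment without touching user data. The function and parameter names here are hypothetical.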
It all sounds pretty daunting, right? Well, it’s not if you’re using Signal’s Open Data Platform.
Customers who use Signal’s platform for collecting, connecting and acting on their cross-channel data frequently tell us how it’s empowered them to take control of their data and vendor relationships.
Our patented technology significantly streamlines the testing process for our clients, and efficiently supports their efforts across the experimental spectrum, including vendor optimization programs.
For example, SurveyMonkey, the web-based survey powerhouse, recently used Signal’s platform to compare retargeting vendors in a head-to-head competition. Choosing the vendor that delivered the highest ROI enabled SurveyMonkey to reduce its customer acquisition cost by 33 percent.
With Signal, setting up the tightly controlled experiment took just one week, compared to the months of work that otherwise would have been required. Non-technical users at SurveyMonkey were able to create tag-firing rules, establish cookie pools, collect granular data, and run a “clean” test, all without the agonizing process of altering code on the company’s site.
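Once the test data is in, the comparison itself is simple arithmetic: divide each vendor’s spend by the conversions it drove to get cost per acquisition, then pick the lowest. The figures below are made up for illustration (they are not SurveyMonkey’s actual numbers), chosen so the cheaper vendor cuts acquisition cost by about a third:

```python
# Hypothetical spend and conversion figures for two retargeting vendors
results = {
    "vendor_a": {"spend": 50_000, "conversions": 400},
    "vendor_b": {"spend": 50_000, "conversions": 600},
}

def cost_per_acquisition(spend: float, conversions: int) -> float:
    """Cost per acquisition: total spend divided by conversions driven."""
    return spend / conversions

cpa = {v: cost_per_acquisition(r["spend"], r["conversions"]) for v, r in results.items()}
winner = min(cpa, key=cpa.get)
print(winner, round(cpa[winner], 2))  # vendor_b 83.33
```

With these numbers, vendor_b’s $83.33 CPA beats vendor_a’s $125, a roughly 33 percent reduction, which is the kind of result an A/B-style vendor test is designed to surface.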
“Signal gives us an advanced tool for strategically optimizing our marketing spend and increasing ROI,” said Gallant Chen, Director of Online Marketing at SurveyMonkey.
Think about it this way. Without Signal, you can spend six to nine months wrangling the complex details of a risky vendor evaluation. Or you can skip the process entirely and have no data on which to base your decision-making.
With Signal, you can easily and efficiently run four to six successful vendor tests per year, without tying up any IT resources. Imagine the freedom you’ll have to take charge of your vendor evaluation and selection processes!
Read our case study to learn more about how Signal enabled SurveyMonkey’s successful vendor test.
Originally published July 09, 2014