Dealership owners are becoming increasingly progressive and data-driven. Like leadership at Fortune 500 companies, they make executive decisions based on internal results and external data to improve their stores’ fortunes. However, I’m issuing a stern warning before you make changes to your policies, processes, and personnel:
Beware of Companies that Massage the Data
Unlike my partner, Bill Playford, I don’t have the mind of a mathematician. I’m a marketing man, so I don’t analyze the statistics myself, but rather use already-vetted statistics to develop curricula and tactics. What I have learned from heeding the advice of the human calculator I have for a partner is that all data can be manipulated to prove a point if no outside agency is overseeing the statistical analysis.
I was warned before posting this that it was in my best interest not to start a war, and to stay somewhat anonymous in my finger-pointing. I’ll meet that warning only halfway, because what I care about are my dealer clients, not my relationships with vendors.
1) Using the Exception as a Benchmark
Two training firms were “approved” by a certain OEM to assist in BDC performance nationwide for their brands. They have conducted many workshops in different cities, both to educate the dealer attendees and, inevitably, to sell their services to them. After all, isn’t that what a nod from an OEM is supposed to yield? My clients that attended these workshops alerted me to the recommendations and benchmarks shared by these two agencies. They claimed, “If your BDC isn’t setting appointments at an 80% clip, you’re doing it wrong.”
That is a reckless statistic to throw out. Beyond the results our DealerKnows clients have achieved, which are both realistic and a point of pride, the other phone trainers, sales trainers, and consultants we associate with all find that 80% metric to be the exception, not the rule. One of the top phone trainers in the nation stated he has only ever had a few dealers reach that pinnacle. After 8+ years at DealerKnows, we’re proud to say we have had two clients reach that level, and no one stays there permanently. So many variables affect that 80% set ratio that claiming it as the rule rather than the exception frames an amazingly difficult achievement as a standard that should be easily met. No respectable training firm should set a benchmark at a level only a handful of its dealers have ever reached; it should take the aggregate median and set that as the expected rate. Instead, these firms are sending dealers back home to confront their high-volume BDCs about why they only exceed 40% appointment sets across leads and calls, and to demand they double production. It is a crime. More realistic data analysis should be performed before any OEM buys in, let alone allows such figures to be shared as a norm. These firms likely cost good BDC agents their jobs by shamefully blurting out the notion that every BDC should hit an 80% appointment set ratio, for no other reason than to make themselves look attractive to the blind.
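The median-versus-exception point is easy to see with numbers. Here is a minimal sketch using entirely made-up appointment-set ratios for a hypothetical group of BDCs; it only illustrates why the middle of the pack, not the single best performer, is the defensible benchmark:

```python
import statistics

# Hypothetical appointment-set ratios (appointments set / contacts)
# for ten BDCs -- illustrative numbers, not real dealer data.
set_ratios = [0.32, 0.41, 0.38, 0.45, 0.52, 0.36, 0.48, 0.80, 0.43, 0.39]

median = statistics.median(set_ratios)  # the realistic benchmark
best = max(set_ratios)                  # the exception, not the rule

print(f"Median set ratio: {median:.0%}")  # 42%
print(f"Best set ratio:   {best:.0%}")    # 80%
```

In this toy data, holding every store to the 80% outlier means telling nine of ten BDCs they are failing, even when they sit at or above the group median.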
2) The Skewed Data
A CRM with considerable market share released a study on how dealers should engage with their online prospects/leads. I, being a process consultant and lead management junkie by trade, became ecstatic at the prospect of learning new information. After all, we’ve graded the lifecycle performance of over 20,000 leads and analyzed over 100,000 in the last few years. One specific metric this corporation shared stated that the best time to get a customer to open an email from a dealership is in the morning. I found this interesting, as our data shows otherwise. After some digging, it turns out that their technology ONLY ALLOWS AUTOMATED EMAILS TO BE SENT IN THE MORNING! How corrupt is that data when the ONLY data point they have regarding email sent vs. read rate is broken at the outset by their antiquated technology? How many dealers have read this study and incorrectly altered their lead management process to accommodate this data, only to find it harm their contact percentages? Why? Because a CRM with older software shares information that is faulty from the get-go, just for the sake of covering their own ass. Until a vendor’s technology allows data to be analyzed in an A/B test (or a multivariate test), there is no way they should be allowed to share “option A” as the gold standard of lead management. Again… reckless.
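For what an honest A/B test would even look like here, consider this minimal sketch: two randomized groups of recipients, one emailed in the morning and one in the evening, compared with a standard two-proportion z-test. The counts are invented for illustration; the point is that without running both arms, “morning is best” is untestable:

```python
import math

# Hypothetical A/B test: the same email sent to two randomized groups.
# All counts below are made up for illustration.
morning_sent, morning_opened = 1000, 210
evening_sent, evening_opened = 1000, 260

p_morning = morning_opened / morning_sent
p_evening = evening_opened / evening_sent

# Pooled two-proportion z-test for the difference in open rates.
pooled = (morning_opened + evening_opened) / (morning_sent + evening_sent)
se = math.sqrt(pooled * (1 - pooled) * (1 / morning_sent + 1 / evening_sent))
z = (p_evening - p_morning) / se

print(f"Morning: {p_morning:.1%}  Evening: {p_evening:.1%}  z = {z:.2f}")
# |z| > 1.96 means the difference is significant at the 95% level.
```

A vendor whose platform can only send in the morning has, by construction, zero observations in the evening arm, so no comparison of this kind is even possible from their data.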
3) Re-Gifting the Data
As dealers concern themselves with web traffic, they rely on Google Analytics, but few are GA certified. So they lean on their website providers or, gulp, an SEO/SEM agency to improve their search performance. DealerKnows catches far too many of these digital marketing scoundrels playing three-card monte with the dealers’ PPC money, not even attaching their AdWords account to the Analytics account. Moreover, the “performance report” (phrase used loosely) that they send to dealers as a means to justify their pay is nothing more than the dealers’ very own Google Analytics account, or perhaps their website analytics in a slightly prettier format. They’re re-gifting factual data and sharing it with you, but not providing any details regarding work performed, ads purchased, keywords used, etc., that they could correlate to traffic. If someone is handling your SEM, they had better have sophisticated tech showing CPC, multivariate keyword strategies, retargeting, display results, and more, and follow it all the way down to the conversion. If someone is “handling” your SEO, you should require them to share with you every piece of content they created, every link they added on and off your site, and every title tag they changed to improve performance. Otherwise, they’re likely just sitting back, letting your website work as it was initially built, and propping up your own Google Analytics results as if they were a contributing factor to any positive growth, when they weren’t.
4) The Lack of Correlation
A large automotive marketplace that dealers pay to place their inventory on offers to perform a sourcing study for its dealer clients (usually as soon as the dealership asks to drop them). They use this sourcing study as a means to justify their worth to the dealership organization. While they use a third-party agency to conduct the study, garnering dealer DMS/sold data and contacting customers from your store, they don’t ask all of the right questions. To paraphrase, they’ll ask a subset of your recent sold customers,
“Did you peruse (insert automotive marketplace name here) when researching for your car?”
“Did you go into (insert dealership name here) to buy your vehicle?”
That does NOT show correlation. Merely researching vehicles on a site doesn’t prove the site delivered value to your dealership. Why wouldn’t they ask,
“Did you find the exact vehicle you purchased at (insert dealer name here) on (insert automotive marketplace name here) and choose the dealership because of this finding?”
They don’t ask, because more than likely the customer never saw your vehicle on the marketplace. They saw a lot of vehicles, but maybe not the one they bought. Maybe not even one from your store. Maybe other research led them to you. Any well built conversion goal path built in a GA account will tell you that it is rarely a direct click path for the customer. If I take an elevator from the 1st floor to the 3rd floor, get off to use the water fountain on the 3rd floor, and take the elevator back up to the 5th floor where I need to be, what deserves credit for getting me there? The 3rd floor where the water fountain is? Or the elevator?
Per Google, customers research 18.2 different resources before transacting on a vehicle. Why should you have to pay a significant percentage of your digital advertising budget to a website/inventory marketplace if they can’t correlate the customer’s unique interest in your specific vehicle with the sale of said vehicle?
You shouldn’t. It doesn’t compute.
Every decision you make should be supported by data. Every new tactic or policy change you enact should have vetted information supporting it. At the very least, make sure the company analyzing the data doesn’t have a stake in the outcome. It is far too easy for data to be massaged in an effort to sell you on an ideology, a belief, or, more likely, a service. Do your homework. Predict outcomes. Verify findings.
Everyone loves a massage, but not when what is being shifted is the data, and what is being greased is their palms.