Dunge Dispatch: Seeing Through Bad Marketing Research to Tell a Good Brand Story

Many of us have seen, read, heard, or talked about the Southwest Airlines flight 1380 emergency landing that occurred earlier this week. For those who have not, here is the story.

[Image: Southwest jet flying over downtown Dallas]

Related to the Southwest flight 1380 incident (full disclosure: I work for an airline, just not Southwest), I was contacted by a marketing research firm which sent me their "brand intelligence report" claiming to track the reputation of Southwest and other major air carriers. Their marketing research, to put it mildly, is bad... flawed, actually, is what I told them. How did I come to this conclusion, and why did I respond to them in the first place? Please allow me to explain.

There are many stories that came out of the Southwest 1380 incident: the heroism, the cooperation, the tragic loss of life, the investigation. Each story has its own ensuing narrative. In the following paragraphs I'll outline a couple of those narratives that I find interesting or relevant, and at the end, provide more detail on the problematic brand intelligence report shared with me about Southwest that prompted this article.

First, Tammie Jo Shults. If she did not have a brand before, she certainly does now. Shout out to an incredible pilot, a fellow Navy officer, a woman of faith, a wife, a mother, and from all accounts I've read, a genuinely humble person. The crew and passengers on Southwest 1380 were in extremely capable hands. I'm certain that, if and when she is ready, she can continue to build her brand, much in the same way US Airways' Chesley "Sully" Sullenberger has done.

Second, Southwest Airlines. A known and trusted brand, and not just in commercial aviation. Southwest was and still is a darling case study for business school professors, from its scrappy-startup origins on the back of a cocktail napkin to its rise as a formidable competitor to the major domestic airlines. All things considered, Southwest has a rough road ahead with the NTSB investigation, but if any brand can weather this storm well, Southwest can.

Which brings me to my (groan) axe to grind with bad marketing research. I even hate to say "marketing research," because it was really a bunch of disparate surveys Franken-packaged together in an attempt to market this company and their service. The premise on which they contacted me was that trust in Southwest dropped as a result of the incident. Can't argue with that, but when they mentioned they were tracking other air carriers I decided to delve into their slickly packaged research to find out more. 

Let me go through the major issues I found with their research, and subsequent "lessons to be learned" in case the firm chances upon this article, and in case marketing research colleagues want to join the discussion.

So, how do I know their research was bad? 

1. Inconsistent time periods and respondent counts when tracking customer sentiment. The first period was a full week, Apr. 10-16, with about 1,300 respondents; the second was a single day, Apr. 17, with about 3,500 respondents.
 -- Lesson to be learned: You gotta compare like with like: the same length of time and close to the same number of respondents. This is table stakes when dealing with marketing research. Any strategic marketer will tell you the data has to tell a consistent story, and inconsistency in how you collect your survey data is a no-no.
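To make the point concrete, here's a back-of-the-envelope sketch in Python using the report's approximate respondent counts (the margin-of-error math assumes a simple random sample, which is generous for an online panel): mismatched windows and sample sizes change both the per-day coverage and the statistical precision of each wave, so the two numbers were never directly comparable.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion estimated from a
    simple random sample of size n (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# Approximate figures from the report: a 7-day wave of ~1,300
# respondents vs. a single-day wave of ~3,500.
waves = [("Apr 10-16", 7, 1300), ("Apr 17", 1, 3500)]

for label, days, n in waves:
    moe = margin_of_error(n)
    # Per-day coverage and precision differ wave to wave, so a
    # shift between the two waves mixes real change with noise.
    print(f"{label}: n={n}, ~{n / days:.0f} respondents/day, MoE ±{moe:.1%}")
```

Run it and the mismatch is obvious: the two waves differ in daily coverage by more than an order of magnitude and carry different error bars, so any "drop in trust" between them is partly an artifact of the design.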

2. Using Twitter as a favorability measure. Favorable vs. unfavorable Twitter mentions were compared over a ten-day period, highlighting the sudden increase in unfavorable mentions on Apr. 17.
 -- Lesson to be learned: There are two here. First, DUH! Twitter erupts whenever a celebrity has a wardrobe malfunction, so of course there will be negative sentiment when an airline has an in-flight emergency. Second, take social media and any favorable/unfavorable mentions with an enormous grain of salt. Basing favorability on raw mention counts, without weeding out the outliers (such as serial complainers, infrequent users, and whiners who think their fussiness entitles them to freebies), leads to disaster. Your analysis is only as good as your data points... garbage in, garbage out, or so the saying goes.
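For illustration, here's one simple way to damp serial complainers before computing a net favorability score. This is a toy sketch with made-up mention data, not the firm's methodology: capping each account's contribution keeps one prolific account from swinging the whole metric.

```python
from collections import Counter

def net_favorability(mentions, max_per_author=3):
    """mentions: list of (author, sentiment) pairs, sentiment +1 or -1.
    Caps each author's counted mentions at max_per_author so a
    single serial complainer cannot dominate the signal."""
    per_author = Counter()
    pos = neg = 0
    for author, sentiment in mentions:
        per_author[author] += 1
        if per_author[author] > max_per_author:
            continue  # ignore this author's excess mentions
        if sentiment > 0:
            pos += 1
        else:
            neg += 1
    total = pos + neg
    return (pos - neg) / total if total else 0.0

# Toy data: one angry account posting 20 times vs. 10 distinct
# favorable mentions. Raw counting says the brand is underwater;
# capped counting says the opposite.
raw = [("angry_flyer", -1)] * 20 + [(f"user{i}", +1) for i in range(10)]
print(net_favorability(raw, max_per_author=10**9))  # effectively uncapped
print(net_favorability(raw, max_per_author=3))
```

The cap value is a judgment call, but any reasonable cap beats counting every mention as an independent voice.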

3. Not even a single attempt at regression? Really, no attempt to relate proprietary metrics (such as "awareness" and "trustworthiness") to purchase intent?
 -- Lesson to be learned: Probably the best thing the firm did was not to attempt regression, considering the lackluster quality of the rest of their efforts.

So you see why I'm skeptical here. I find it humorous that, for all the effort and charts and numbers attempting to show that one brand took a hit to its "trust," the same metrics in the same report showed the now "less trusted" brand still ahead of its competition.

In all fairness, I wish the firm that reached out to me the best of luck. As professionals, we can all strive to put forth the effort to improve our work and hopefully, our lot in life as well as that of others. If this short piece on improving marketing research methods is of benefit to the larger branding community, then I've done my part and can go home happy.

 -- Dunge
