In a big-data era, marketers who strive to be close must avoid being too close
SMITH BRAIN TRUST – The marketing was innovative. It was clever. It was bold enough to warrant coverage in The Wall Street Journal. It was also ... creepy.
Pregnant women around the country had received a greeting card, stuffed with thoughtful gift cards for the mom-to-be. The seemingly handwritten card said only, “So excited for you! Hope you like these.” It was signed with a heart and the name “Jen.”
Jen wasn’t a friend or a colleague or a relative, but part of a personalized marketing stunt by Utah-based Mothers Lounge, which sells products aimed at pregnant women and new moms.
“Technology brought us here,” says Henry C. Boyd III, clinical professor of marketing at the University of Maryland’s Robert H. Smith School of Business. “It brought us to this ‘creepy’ point. And it offers some important lessons for marketers with access to big data.”
Boyd imagines what it would be like to be on the receiving end of that envelope. “Here you are, you’re 24, you’re Rebecca, you’re a mom-to-be, and you get this note. You’re wondering, ‘Who’s Jen? Do I know Jen? Is Jen someone from work? Someone from college?’ And you want to figure it out so you know who gave it to you. You want to say thanks.”
While many people have become accustomed to overly familiar emails from marketers, this message arrived via snail mail, Boyd says, “the old way of doing things. So you think it’s really got to be a friend, because they’ve used that modality. But you ask around, and none of the Jens you know sent the card.”
Eventually you peel back the layers; you realize that the gift cards can cover the costs of certain baby products, but with suspiciously exorbitant shipping costs. “And that’s when you realize, Jen’s not a friend. Jen is a business entity,” Boyd says.
Jen is trying to sell you something, and Jen somehow knows you’re pregnant.
“And now you have this sense that washes over you and it’s uneasiness,” Boyd says. “It’s a sense that, ‘Whoa, my rights have been violated.’ Something doesn’t feel quite right here.”
In his marketing courses, Boyd talks a lot about forming relationships with customers, about offering opportunities and value. He talks about CRM – Customer Relationship Management – building profiles, collecting data, angling for loyalty.
“But at some point,” he says, “you have to ask: Are you violating the customer’s personal space? Are you getting too close with all of the data you’re collecting?”
In some ways, Boyd says, marketers have always had to walk the line between close and too close. “Take the classic example, from decades ago, when a salesperson would come to your door.”
But door-to-door salesmen followed a certain protocol, he says, while the “Jen” campaign aimed to slide past the recipient’s defenses. “And that’s why people are feeling this sense of uneasiness, this creepiness, this sense that ‘Oh, my god. They’ve gone too far.’”
In a big-data era, when algorithms can accurately predict a consumer’s next move, sensing where the boundaries lie requires a combination of data-driven intelligence and emotional intelligence.
“Marketers have so much data and are asking, ‘How can I get out in front of my competition and get an edge?’” he says. “But if you push it too hard, big brother isn’t just watching, big brother is lurking. That’s a problem.”
He says a question that marketers likely will have to face going forward is whether to offer consumers an option to opt out of targeted campaigns.
Boyd says the Federal Trade Commission Act and laws in several states seek to protect consumers from deceptive marketing tactics. “And I think those laws are going to end up being a linchpin in some of these too-personal marketing tactics,” he says. “It will likely come down to whether a reasonable person might know they are being targeted with a marketing campaign – and not being contacted by a friend.”
To his marketing students, Boyd has this suggestion: Take a deep breath, step back and ask, Are you going too far?
“We are learning this as we go along. And, of course, this is going to be an ongoing conversation. We’re only at the tip of the iceberg,” he says.
“Because with the data, the machines are tracking you, they’re monitoring you. And in some sense they’re trying to help you, but sometimes it’s a little too helpful and that’s unsettling.”