Smith Brain Trust / October 22, 2018

Ethics in a Digital Surveillance World

The Importance of Merging Ethics With Marketing Analytics

SMITH BRAIN TRUST – Tracking consumer behavior is a big job. But your smartphone has help. Additional input comes from your laptop, television, car, wallet and the voice command device in your living room. Even your kitchen appliances stay connected and alert.

“Many people don’t know the extent to which some of these datasets are combined,” says Maryland Smith professor Wendy W. Moe, dean of graduate programs and director of the Smith Analytics Consortium. “There are analysts who can aggregate all of this information, take seemingly anonymous data and ‘back out’ identities, effectively de-anonymizing the data.”
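The linkage Moe describes amounts to a database join on quasi-identifiers, fields such as ZIP code, birth date and gender that survive anonymization. The toy Python sketch below, with invented data and column names, shows how little it takes:

```python
# Toy illustration of how "anonymous" records can be re-identified by
# joining them with a second dataset on quasi-identifiers. All data and
# column names here are hypothetical.
import pandas as pd

# "Anonymized" browsing log: no names, but quasi-identifiers remain.
browsing = pd.DataFrame({
    "zip_code":   ["20742", "21044", "20742"],
    "birth_date": ["1980-03-14", "1975-07-02", "1991-11-30"],
    "gender":     ["F", "M", "F"],
    "sites_visited": [["clinic.example", "gym.example"],
                      ["news.example"],
                      ["toys.example"]],
})

# A directory-style dataset that does carry names.
directory = pd.DataFrame({
    "name":       ["Alice Adams", "Bob Brown", "Carol Clark"],
    "zip_code":   ["20742", "21044", "20742"],
    "birth_date": ["1980-03-14", "1975-07-02", "1991-11-30"],
    "gender":     ["F", "M", "F"],
})

# A single merge on the shared quasi-identifiers "backs out" identities.
deanonymized = browsing.merge(directory, on=["zip_code", "birth_date", "gender"])
print(deanonymized[["name", "sites_visited"]])
```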

That didn’t happen 15 or 20 years ago. Television and radio signals filled the air, but advertisers had no way of knowing for certain who tuned in. People concerned about privacy could simply decline telephone surveys and shut their doors to Nielsen meters.

Even if audience members chose to participate in market research, they were clumped together into demographic categories. “Audiences in the past were anonymous, aggregated groups of people,” Moe says. “That is no longer the case.”

Marketers now track individuals. They know your name, where you live and how you spend your time. They know where you went on vacation and your political ideology. “They can take that data and predict your personal profile in terms of gender, age, education,” Moe says. “Then they push relevant content and ads to you.”
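In machine-learning terms, the profiling Moe describes is ordinary supervised classification: train on users whose demographics are known, then predict everyone else's. A minimal sketch, using synthetic data and hypothetical behavioral features:

```python
# Hedged sketch of the profiling step Moe describes: a classifier trained
# on labeled users predicts demographics for everyone else from behavior.
# Features, labels, and data here are all synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
# Behavioral features, e.g., weekly hours on news, shopping, gaming sites.
X = rng.normal(size=(n, 3))
# Synthetic ground-truth segment label, correlated with behavior.
y = (X @ [1.5, -1.0, 0.5] + rng.normal(0, 1, n)) > 0

model = LogisticRegression().fit(X, y)

# A new visitor's browsing pattern yields a predicted profile...
visitor = [[0.8, -0.3, 1.2]]
print("P(segment A):", model.predict_proba(visitor)[0, 1])
# ...which then drives which ads and content get pushed to that person.
```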

Her book, “Social Media Intelligence,” explores the data science involved. Her classes at Maryland Smith also explore the ethical issues that arise.

AN ENRON MOMENT

As Facebook CEO Mark Zuckerberg learned on July 26, 2018, consumers get angry when they feel exposed or betrayed. His company lost nearly $120 billion in market value in a record-setting plunge that began the previous evening in after-hours trading.

Smith entrepreneurship professor David A. Kirsch links some of the turmoil to revelations about Cambridge Analytica, a political consulting firm that used Facebook data to influence the 2016 presidential election. But Kirsch sees a larger reckoning taking place in Silicon Valley, where public sentiment has shifted against high-growth tech darlings.

“All of these big tech companies have been living a charmed life when it comes to their public image,” Kirsch says.

Big tobacco, big oil, big pharmaceuticals, big banks and big agriculture have taken their hits over the years. But big tech has largely escaped that wrath.

“Where has big tech been in all of this?” Kirsch asks. “Just sitting with a big Cheshire Cat grin on their faces, laughing all the way to the bank.”

Recent judgments against Google, chaos at Tesla and fallout at Facebook may signal an end to the charmed life — especially for companies that broker in consumer information. Kirsch says questions surrounding data management and privacy will force long-term, sweeping changes.

“We are the product, and people may at some point get tired of being the product,” he says. “And if they do become tired of being the product, then the business model is no longer sustainable.”

Alex Yoder, analytics executive vice president at performance marketing agency Merkle, sees a watershed moment for the industry.

“The Cambridge Analytica scandal is beginning to feel for consumer privacy like the scandals at Enron, WorldCom, Global Crossing, Tyco and Arthur Andersen felt for consumer trust in accounting practices — and which resulted in the Sarbanes-Oxley Act of 2002,” he says.

As a charter member of the Smith Analytics Consortium, Columbia, Md.-based Merkle helps prepare Smith students for the disruption.

“It seems logical that entirely new companies and industries could emerge out of the recent public outcry stemming from concerns about consumer privacy, audience targeting and marketing efforts designed to influence consumer behavior,” Yoder says.

BALANCING ACT

The challenge for free or freemium platforms is balancing the interests of paying customers — the advertisers who want as much consumer information as possible — with the privacy interests of end users. Moe says solutions start with dialogue.

“There needs to be an effort to educate everyone on how this works,” she says. “Because I think there’s a big gap in our understanding of what’s going on.”

Many consumers, for example, overlook the benefits of targeted advertising. Besides customized content, they get free or subsidized services like email, video calling, photo sharing and cloud storage. Most people never pay a cent to Facebook, Google, Craigslist, LinkedIn, Snapchat or other tech services, yet they enjoy the connectivity offered.

“In a world where targeted advertising is turned off, these companies would have to charge some sort of subscription,” Moe says.

Privacy advocates might not mind paying a little extra if it meant freedom from digital surveillance.

Some of the downsides of targeted advertising are merely annoying — like when a parent goes on her personal iPad to shop for a child’s birthday gift, and related banner ads start showing up on other family devices, spoiling the surprise.

Other potential scenarios are more invasive. Health care organizations, for example, already know when you visit the doctor. If they match that information with your prescription refills, gym memberships and other key metrics, they could practice price discrimination based on the statistical probability of your next medical event.

“Are you comfortable with your health insurance company knowing whether or not you work out, and then giving you a different price based on the information?” Moe asks.
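The pricing logic behind Moe's question can be sketched in a few lines. Everything here is hypothetical, the signals, the weights and the formula alike; it only illustrates how behavioral data could flow into a personalized premium:

```python
# Hypothetical sketch of the pricing logic Moe warns about: a premium
# scaled by a predicted probability of a medical event, inferred from
# behavioral signals. Names, weights, and signals are all invented.
def predicted_event_probability(doctor_visits: int,
                                refills_per_year: int,
                                has_gym_membership: bool) -> float:
    """Crude illustrative risk score in [0, 1]; not a real model."""
    score = 0.05 * doctor_visits + 0.03 * refills_per_year
    if not has_gym_membership:
        score += 0.10                       # no gym record raises the score
    return min(score, 1.0)

def quoted_premium(base_premium: float, risk: float) -> float:
    # Price discrimination: the same product, priced per profile.
    return base_premium * (1.0 + risk)

risk = predicted_event_probability(doctor_visits=4,
                                   refills_per_year=6,
                                   has_gym_membership=False)
print(f"risk={risk:.2f}, premium=${quoted_premium(3000.0, risk):,.2f}")
```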

Students in the school’s Master of Science in Marketing Analytics program tackle such questions in a required ethics course led by Moe. They learn the general framework of ethical decision making and then delve into the data collection issue.

Moe says ethical business starts with honesty, so buyers and sellers understand the trade they are making. In data collection this means avoiding games of semantics. If a company promises not to “read” your email, for example, it should not turn around and scan it digitally for keywords.

To get the conversation started in her classroom, Moe challenges students to design an algorithm that would maximize short-term profits in a given scenario like health care. Then the students take a step back and consider the ethical implications.

“It becomes very clear that the best algorithms in terms of profitability will oftentimes step on the toes of privacy issues and data security issues,” Moe says. “We talk about that tradeoff.”

BLAME IT ON THE ALGORITHMS

Moe and her students discuss at least five dilemmas that marketing analysts confront as they move from the classroom to the corporate world. Intentionally or otherwise, available data may be used in the following ways:

1. Polarizing Communities

People don’t need social media to divide themselves into echo chambers. They have been doing that on their own for centuries. But marketing algorithms, which identify and cater to consumer preferences, accelerate and exacerbate the process.

“If you have a consumer profiled to be a certain kind of buyer, and you keep feeding this person content related to that interest, then opportunities for cross-selling are missed and exposure to new products and ideas is limited,” Moe says. “A marketing filter bubble emerges, which narrows your choices.”

Something similar happens in the realm of political discourse. “If you start clicking on left-wing content or right-wing content, you get more and more of that content,” Moe says. “You get profiled as a left-winger or right-winger, and you stop seeing the other side.”
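The feedback loop is mechanical. A minimal sketch, with invented categories and weights, shows how a handful of one-sided clicks can crowd out everything else:

```python
# Minimal sketch of the feedback loop behind a filter bubble: each click
# on one category shifts what is served next, so early preferences
# compound. Categories and update weights are hypothetical.
import random

def serve(weights: dict[str, float]) -> str:
    """Sample a content category in proportion to the user's profile weights."""
    categories = list(weights)
    return random.choices(categories, weights=[weights[c] for c in categories])[0]

weights = {"left": 1.0, "right": 1.0}   # start with no profile
for _ in range(50):
    shown = serve(weights)
    clicked = shown == "left"           # a user who clicks left-wing items
    if clicked:
        weights[shown] *= 1.2           # each click reinforces the profile...
# ...and within a few dozen rounds "right" content is rarely served.
print(weights)
```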

2. Promoting Bias

Algorithms may also inadvertently promote bias. Unlike humans, who insert emotion into the decision-making process, robots predict the future by looking impassively at the past.

That’s how machine learning works. But if the past includes discrimination based on race or some other protected identifier, then the bias gets carried forward in the code. “Not because companies are maliciously trying to discriminate,” Moe says. “But because they’re building models that mirror the human nature exhibited and reflected in the data.”

She cites one example at Airbnb, the short-term rental platform, which uses algorithms to recommend prices to hosts. If past rental decisions reflect bias against hosts of color, driving down demand in certain neighborhoods or at certain properties, then recommended prices also come down.

“All of that bias gets built into the algorithms,” Moe says.
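The mechanism is easy to reproduce. In the synthetic sketch below (not Airbnb's actual system), a price model fit on historically biased data recommends lower prices for identical listings in the disadvantaged group:

```python
# Illustrative only: a price model fit on synthetic "historical" rentals
# where demand for one group of listings was depressed by past bias. The
# model has no malicious intent, yet its recommendations reproduce the gap.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
group = rng.integers(0, 2, n)            # 0 = disadvantaged listings, 1 = others
quality = rng.normal(5, 1, n)            # identical quality distribution
# Historical prices: same quality, but group 0 earned ~20% less due to bias.
price = quality * 20 * np.where(group == 0, 0.8, 1.0) + rng.normal(0, 2, n)

# Fit price ~ quality + group with ordinary least squares.
X = np.column_stack([np.ones(n), quality, group])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)

# Recommended prices for two identical listings differ only by group:
for g in (0, 1):
    print(f"group {g}: recommended price {coef @ [1.0, 5.0, g]:.2f}")
```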

3. Widening Income Inequality

Another type of discrimination occurs when data analysts use algorithms to identify high-value consumers. “Everyone is trying to get those customers, so the competition is higher,” Moe says. “Marketers have to offer better deals and more choices.”

This works to your advantage if you have resources. But the opposite is true for consumers on the fringes of the economy.

A similar dynamic happens in traditional markets, which leads to issues like food deserts. “People in poor, urban locations have fewer shopping choices,” Moe says. “Not a lot of companies try to compete for them.”

Separating the haves from the have-nots is easier online, where marketers have access to massive amounts of consumer data. “The people who can least afford it pay higher prices for things because they’ve been profiled,” Moe says.

4. Feeding Addictions

Marketing data may also be used to squeeze extra profits from addicts. A bar will eventually close or stop serving drinks. But algorithms do not ease up when they detect unhealthy or compulsive behavior.

Tinder won’t remind you that you’re already married. Amazon won’t warn you that you can’t afford another collectible toy. And Netflix won’t stop streaming the next episode of “Breaking Bad” just because it’s 2 a.m. on a workday.

Moe, who has published research on binge watching, says the Internet is a judgment-free zone that simply detects user preferences and delivers matching content.

“If you have a gambling addiction or video gaming addiction, you’re going to exhibit behaviors that indicate high interest — which will generate more and more targeted ads toward you for those topics,” Moe says. “And that’s a problem.”

5. Exploiting Children

Children and other vulnerable populations may be especially susceptible to the power of algorithms. “You can have a healthy conversation about how much data privacy is good for most of the adult population,” Moe says. “But the same algorithms that track adults also track minors.”

CALLS FOR TRANSPARENCY

Moe takes certain steps to maintain a degree of anonymity on her own devices. She avoids the cloud as much as possible. She periodically removes cookies and clears browsing histories. And she mixes up her online viewing habits to ward off filter bubbles.

“It’s an active process for me to try and train the algorithms,” she says. “I will click on things that I might not actually be interested in, so I always have that broad swath view.”
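Her tactic resembles what recommender-system designers call exploration: follow the profile most of the time, but deliberately go off-profile a fixed fraction of the time. A hypothetical sketch, in the style of an epsilon-greedy policy:

```python
# Moe's habit of clicking outside her interests, expressed as an
# epsilon-greedy rule. A hypothetical sketch of the idea, not any
# platform's actual mechanism; topics and epsilon are invented.
import random

def choose_click(profile_top: str, all_topics: list[str],
                 epsilon: float = 0.3) -> str:
    if random.random() < epsilon:
        return random.choice(all_topics)   # off-profile click, widens the bubble
    return profile_top                     # on-profile click, narrows it

topics = ["politics-left", "politics-right", "cooking", "sports", "science"]
clicks = [choose_click("politics-left", topics) for _ in range(100)]
print({t: clicks.count(t) for t in topics})
```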

The precautions work to some extent. But Moe says the only way to stop being tracked completely is to detach from the digital world. That’s tricky considering the reach of big tech.

“I don’t think it’s feasible at this point,” Moe says. “Consumers don’t have a lot of freedom to protect their own privacy.”

Europe has intervened on behalf of consumers with new regulations. The General Data Protection Regulation (GDPR), which took effect in May 2018 following a two-year grace period, cracks down on how companies collect and share data.

U.S. policymakers may soon follow, requiring organizations to be up front about the data they collect and how they use it. If done right, Moe says, regulations could promote fair competition by establishing the same high standards for all organizations.

“Many companies want to be socially responsible,” she says. “But if they do that, they take a hit to their profits because they lose to those who are not. If they are regulated, then it’s an even playing field for everyone.”

The goal would be transparency, so consumers understand the tradeoffs they make when they download an app, participate in a store loyalty program, use a bank card or create a social media profile.

“The very first thing that regulators can do, and it’s what the Europeans have done, is to increase transparency,” she says.

DELIGHTING CUSTOMERS

Merkle already has seen the legislative impact. “Not only do we have GDPR in place in Europe, but California just passed the toughest data privacy law in the U.S., and other states are looking at similar legislation,” Yoder says.

Sarbanes-Oxley disrupted all forms of general accounting, and Yoder predicts something similar in marketing analytics. “New practices and companies that provide oversight will emerge,” he says. “Very clear processes must be in place for direct legal oversight, operational exactitude and audit capability.”

He says ethical and legal debates will revolve around who ultimately owns the data. If website owners have the same rights as real property owners, for example, does that mean they can open their property to the public and observe the behavior of individuals who choose to visit?

Regardless of what emerges in terms of regulation, Yoder says, companies that thrive in the new environment will be those that remember marketing’s oldest rule: The customer is always right.

“What only the most advanced brands are realizing is that it’s not about knowing the individual as much as it is about offering a brand experience that is relevant, coherent and valuable,” he says. “There should be a perceived exchange in value that compels the customer to engage willingly and openly.”

He says this objective can be achieved without violating consumer privacy. “The key is to focus on using data to surprise and delight consumers with messages, offers and other content that resonates with their interests,” he says. “Our expertise in this space will enable our clients to navigate new regulations with less disruption.”

CHANGEMAKERS

Yoder says business schools like Smith have an important role to play in the process.

“As the training ground for future leaders of industry, it is up to institutions like the Smith School to prepare students to tackle questions about where it is appropriate to apply big data and AI,” he says. “These topics require ethical rigor and institutional oversight to be fully exercised.”

Moe says her focus is to teach students to think for themselves and to frame their arguments persuasively, so they come to the workplace prepared to lead change.

“Laws and regulations are not going to tell them what to do right now,” she says. “The goal in our program is to teach students how to think through the issues and how to lead team discussions.”

Learning to speak up in a group is key because the modern workplace increasingly emphasizes collaboration. “None of these decisions are made by a single person,” Moe says. “They are always made in teams.”
