When Humans Act Like Machines
SMITH BRAIN TRUST – Apple got caught in 2013. Now it’s Facebook’s turn. A new report from The Center for Investigative Reporting accuses the social media giant of knowingly duping children and their parents out of money to boost revenue from online games like Angry Birds.
Internal Facebook memos, uncovered in a class-action lawsuit, refer to the underage targets as “FF-minors” — short for “friendly fraud” involving a minor.
“The fact that ‘friendly fraud’ is a term being used at all is problematic, regardless of whether it’s a child or an adult,” says Wendy W. Moe, who teaches ethics in her marketing analytics courses at the University of Maryland’s Robert H. Smith School of Business. “Friendly fraud means playing to the weaknesses of human nature. It’s not lying, but you know where the weaknesses are and you take advantage of that.”
Some children blew thousands of dollars before their parents received monthly credit card statements and realized what was going on. Facebook called these kids “whales” and sometimes refused to refund money to their families.
“That’s a casino term for high rollers,” Moe says. “Someone in the organization brought that culture over to Facebook.”
Moe, who has written a book and published multiple studies on social media intelligence, says the Apple and Facebook cases show a potential problem with anonymous data tracking.
Unlike humans, who insert emotion into the decision-making process, algorithms look impassively at vulnerable populations. It doesn’t matter who sits at the keyboard. Money from a child or an adult looks the same to a smart machine designed to maximize profit. A dollar is a dollar.
“When companies track people online with data, they’re not really distinguishing between children and adults,” Moe says. “A 5-year-old is being tracked just the same way as an adult.” This is the opposite of Las Vegas rules, where children get stopped before they sit down at a slot machine or poker table.
“Companies need to bring humanity back into the mix,” Moe says. In the Master of Science in Marketing Analytics program at Maryland Smith, she has students design algorithms with a single target objective: Maximize short-term profits. Then she has the students pause and consider the ethical implications.
“There needs to be a human level to it,” Moe says. “You need to put a check on the machine.”
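The classroom exercise and the "check on the machine" Moe describes can be sketched in a few lines of toy code. This is purely illustrative — every name, field, and threshold below is hypothetical and does not describe Facebook's actual systems. It contrasts a naive revenue-maximizing policy, to which a dollar is a dollar, with the same policy plus a human review step for accounts that appear to belong to minors:

```python
# Illustrative sketch only; all names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Purchase:
    user_id: str
    amount: float
    holder_age: int  # inferred or self-reported age of the account holder

def approve_all(purchases):
    """Naive revenue-maximizing policy: every dollar looks the same."""
    return sum(p.amount for p in purchases)

def approve_with_check(purchases, review_queue):
    """Same policy with a human check: likely minors go to review, not billing."""
    revenue = 0.0
    for p in purchases:
        if p.holder_age < 18:
            review_queue.append(p)  # route to a human instead of charging
        else:
            revenue += p.amount
    return revenue

purchases = [
    Purchase("a", 20.0, 35),
    Purchase("b", 250.0, 9),   # a child racking up in-game charges
    Purchase("c", 15.0, 42),
]

queue = []
print(approve_all(purchases))                # 285.0 — the child's money counts
print(approve_with_check(purchases, queue))  # 35.0 — the $250 charge is held
print(len(queue))                            # 1 purchase awaiting human review
```

The point of the exercise is that the first function is what an algorithm optimized only for short-term profit produces; the second is what it looks like once a human-level check is designed in.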
Humans at Facebook discovered the problems with child exploitation and spoke up, but higher-ups in the organization opted to ignore the check. “Someone caught it,” Moe says. “But the humans were acting too much like machines — looking impassively at vulnerable populations.”