Congratulations. You bought into Big Data and it’s paying off Big Time. You slice, dice, parse and process every screen-swipe, clickstream, Like, tweet and touch point that matters to your enterprise. You now know exactly who your best — and worst — customers, clients, employees and partners are. Knowledge is power. But what kind of power does all that knowledge buy?
Big Data creates Big Dilemmas. Greater knowledge of customers creates new potential and power to discriminate. Big Data — and its associated analytics — dramatically increase both the dimensionality and degrees of freedom for detailed discrimination. So, in your corporate culture and strategy, where do value-added personalization and segmentation end, and where does harmful discrimination begin?
Let’s say, for example, that your segmentation data tells you the following:
- Your most profitable customers by far are single women between the ages of 34 and 55, closely followed by “happily married” women with at least one child. Divorced women are slightly more profitable than “never marrieds.” Gay males — single and in relationships — are also disproportionately profitable. The “sweet spot” is urban and 28 to 50. These segments collectively account for roughly two-thirds of your profitability. (Unexpected factoid: Your most profitable customers are overwhelmingly Amazon Prime subscribers. What might that mean?)
Going more granular, as Big Data does, offers even sharper ethno-geographic insight into customer behavior and influence:
- Single Asian, Hispanic, and African-American women with urban postal codes are the most likely to complain to the company about product and service quality. Asian and Hispanic complainers who are happy with the resolution or refund tend to be in the top quintile of profitability; African-American women tend not to be.
- Suburban Caucasian mothers are most likely to use social media to share their complaints, followed closely by Asian and Hispanic mothers. But if resolved early, they’ll promote the firm’s responsiveness online.
- Gay urban males receiving special discounts and promotions are the most effective at driving traffic to your sites.
My point here is that these data are explicit, compelling and undeniable. But how should sophisticated marketers and merchandisers use them?
Campaigns, promotions and loyalty programs targeting women and gay males seem obvious. But should Asian, Hispanic and white females enjoy preferential treatment over African-American women when resolving complaints? After all, they tend to be both more profitable and measurably more willing to use social media effectively. Does it make more marketing sense to encourage African-American female customers to become more social media savvy? Or are resources better invested in getting more from one’s best customers? Similarly, how much effort and ingenuity should go into making more gay male customers better social media evangelists? What kinds of offers and promotions could go viral on their networks?
Conversely, an immediate way to cut costs and preserve resources might be to discourage the least profitable and most costly shoppers. Are there clever ways to raise prices, minimize access or otherwise manage those expensive older, single customers and high-maintenance, low-purchase young females? Are there fast, cheap de-markers (say, an Amazon Prime membership combined with age and ethnicity) to quickly qualify prospects worth cultivating?
Of course, the difference between price discrimination and discrimination that correlates with gender, ethnicity, geography, class, personality and/or technological fluency is vanishingly small. Indeed, the entire epistemological underpinning of Big Data for business is that it cost-effectively makes informed segmentation and personalization possible.
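To make the proxy problem concrete, consider a minimal sketch in Python. The data below are entirely invented, and the Prime-membership field (echoing the factoid above) stands in for any facially neutral signal: once that signal correlates strongly with a protected attribute, targeting on the signal is, in effect, targeting on the attribute.

```python
# Hypothetical illustration: a "neutral" targeting signal (loyalty-program
# membership) can act as a statistical proxy for a protected attribute.
# All data below are invented for demonstration only.
import pandas as pd

customers = pd.DataFrame({
    # Protected attribute the marketer claims not to use.
    "gender":       ["F", "F", "F", "F", "M", "M", "M", "M"],
    # Facially neutral signal actually used for targeting.
    "prime_member": [1,   1,   1,   0,   0,   0,   1,   0],
})

# Membership rate by gender: if these diverge sharply, targeting on
# membership quietly reproduces targeting on gender.
rates = customers.groupby("gender")["prime_member"].mean()
print(rates)
# gender
# F    0.75
# M    0.25
```

No demographic field ever appears in the targeting rule, yet a "Prime members only" promotion here reaches women at three times the rate of men. That is the vanishingly small difference in practice.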
But law, ethics and economics leave it unclear where value-added personalization and segmentation end and harmful discrimination begins. Does promotionally privileging gay male customers inherently and unfairly discriminate against their straight counterparts? Is it good business — let alone fair — to withhold special offers from African-American women because, statistically and probabilistically, they are demonstrably less profitable than Asian and Hispanic female customers?
Big Data analytics renders these questions less hypothetical than tactical, practical and strategic. In theory and practice, Big Data digitally transmutes cultural clichés and stereotypes into empirically verifiable data sets. Combine those data with the computational protocols of “Nate Silver-ian” predictive analytics, and organizations worldwide have the ability — the obligation? — to innovatively, cost-effectively and profitably segment and discriminate among their customers and clients.
Of course, many regulated industries — notably health insurance, financial services and employment — expressly forbid certain kinds of discrimination. In effect, companies and organizations are required to deliberately ignore or exclude potentially valuable customer information. In America, for example, particular attention is paid to processes and programs that have “disparate impact” on employment. (A future post will address this.) Regulators, legislators and court systems worldwide are frequently on the lookout for examples of “disparate impact” and unequal or unfair treatment of customers. There should be no doubt that the intimate customer knowledge Big Data confers guarantees greater scrutiny from governments worldwide.
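The “disparate impact” screen has a concrete arithmetic core. Under the EEOC’s well-known four-fifths (80%) rule of thumb, a selection rate for any group that falls below four-fifths of the highest group’s rate is treated as preliminary evidence of disparate impact. Here is a minimal sketch of what such an audit computes, with invented numbers:

```python
# Sketch of the EEOC "four-fifths rule" screen for disparate impact:
# a group's selection rate below 80% of the highest group's rate is
# preliminary evidence of disparate impact. Numbers are invented.

def selection_rates(outcomes):
    """outcomes maps group -> (selected, total); returns rate per group."""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def four_fifths_flags(outcomes):
    """Flag any group whose rate is below 80% of the best-treated group's."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: (r / top) < 0.8 for g, r in rates.items()}

# Invented promotion-offer counts by (anonymized) customer group:
offers = {"Group A": (90, 300), "Group B": (60, 300), "Group C": (45, 300)}

print(selection_rates(offers))
# {'Group A': 0.3, 'Group B': 0.2, 'Group C': 0.15}
print(four_fifths_flags(offers))
# {'Group A': False, 'Group B': True, 'Group C': True}
# B receives offers at 0.2/0.3 ≈ 67% of A's rate, and C at 50%: both fall
# below the four-fifths threshold and would be flagged for review.
```

The four-fifths rule is only a screening heuristic; actual disparate-impact analysis adds statistical significance testing and business-necessity defenses. But the arithmetic shows how directly a profit-maximizing offer policy can be audited.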
But the main source of concern won’t be privacy, per se — it will be whether and how companies and organizations like your own use Big Data analytics to justify their segmentation/personalization/discrimination strategies. The more effective Big Data analytics are in profitably segmenting and serving customers, the more likely those algorithms will be audited by regulators or litigators.
Tomorrow’s Big Data challenge isn’t technical; it’s whether managements have algorithms and analytics that are both fairly transparent and transparently fair. Big Data champions and practitioners had better be discriminating about how discriminating they want to be.
By Michael Schrage.