With so much attention on consumer data from mainstream media, it’s become almost impossible to talk about real-time bidding and audience targeting without finding yourself in a privacy debate. After all, the exploding availability of data is key to the adoption and success of real-time audience optimization. On the other hand, the commoditization of data also has privacy advocates on edge. Today, industry partners and marketers walk a fine line between leveraging data for more effective audience targeting and ensuring consumer privacy protection. As an industry, we should understand the lessons learned in dealing with privacy issues.
Create a High Value Exchange
Netflix and Amazon, both deeply rooted in online business, leverage data to reach consumers with relevant information. These companies provide a data-driven customer experience that users find delightful rather than intrusive. Recommending movies and products based on collected personal data could easily be seen as intrusive, and the recommendation process potentially places customers into a “box” (i.e., a category), thereby limiting consumer choices. You can read more about “boxing” in Martin Abrams’s 2009 article in the Privacy and Data Security Law Journal.
The reality is that consumers see Netflix and Amazon recommendations as valuable, relevant information. Consumer-delight features like these can be executed in a privacy-friendly form of interest-based technology. Netflix and Amazon use collaborative filtering in their recommendation engines. Because this technology works by aggregating data from large numbers of people who share similar product interests, it presents recommendations to consumers through a social-influence framing. Psychologically, this can be comforting, because consumers culturally associate it with belonging to a group.
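To make the mechanism concrete, here is a minimal sketch of user-based collaborative filtering: score items a user hasn’t seen by the similarity-weighted votes of other users. The data, names, and scoring below are invented for illustration; this is not Netflix’s or Amazon’s actual system.

```python
from math import sqrt

# Hypothetical user-item signals: 1 = watched/purchased.
ratings = {
    "alice": {"MovieA": 1, "MovieB": 1, "MovieC": 1},
    "bob":   {"MovieA": 1, "MovieB": 1, "MovieD": 1},
    "carol": {"MovieB": 1, "MovieC": 1, "MovieE": 1},
}

def cosine(u, v):
    """Cosine similarity between two sparse preference vectors."""
    shared = set(u) & set(v)
    num = sum(u[i] * v[i] for i in shared)
    den = (sqrt(sum(x * x for x in u.values()))
           * sqrt(sum(x * x for x in v.values())))
    return num / den if den else 0.0

def recommend(user, k=2):
    """Rank unseen items by similarity-weighted votes of other users."""
    scores = {}
    for other, prefs in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], prefs)
        for item in prefs:
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice"))  # items liked by users similar to alice
```

Note that the model never needs demographic attributes, only shared interests, which is what makes the “people like you also enjoyed this” framing possible.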
In digital advertising, we strive to find similar approaches that are both effective and privacy friendly. One such approach is the “act-alike model,” a close cousin of the “look-alike model.” Instead of building audience segments by simply bucketing users into demographic groups (age, gender, income, etc.), act-alike segments are built on transient data measured from a consumer’s actions and product interests. In this approach, diverse behavioral data from a large number of people are analyzed with collaborative filtering to create audience segments. The more breadth and diversity in the input data, the more effective these segments become. More importantly, act-alike segments are much less likely to “box” people in, because transient data can easily expire or be opted out of.
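As an illustration of how transient, opt-out-friendly signals might feed an act-alike segment, here is a toy sketch. The event log, user IDs, action labels, and 30-day expiry are all hypothetical, chosen only to show expiry and opt-out enforced at segment-build time.

```python
import time

TTL_SECONDS = 30 * 24 * 3600  # hypothetical 30-day expiry for transient signals
OPTED_OUT = {"user42"}        # opted-out users are excluded entirely

# Hypothetical behavioral event log: (user_id, action, unix_timestamp)
now = time.time()
events = [
    ("user1", "viewed:running-shoes", now - 3600),
    ("user2", "viewed:running-shoes", now - 7200),
    ("user3", "viewed:running-shoes", now - 90 * 24 * 3600),  # stale: dropped
    ("user42", "viewed:running-shoes", now - 60),             # opted out
]

def build_segment(events, action, now):
    """Act-alike segment: users with a fresh, matching action signal."""
    return {
        user for user, act, ts in events
        if act == action
        and now - ts <= TTL_SECONDS    # transient data expires
        and user not in OPTED_OUT      # honor opt-out
    }

print(build_segment(events, "viewed:running-shoes", now))
```

Because segment membership is recomputed from recent events rather than stored as a permanent label, a user who stops acting on an interest simply ages out of the segment.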
Rules Limiting Data Collection Are Counter-Productive
While rigid rules limiting data collection might be easy to follow and enforce, they are usually less effective. A case in point is the data privacy section of HIPAA, which many people know as the extra paperwork we fill out at the doctor’s office. HIPAA has a set of rigid rules that limit the use of the personal data it covers. For example, it curtails usage of 18 specific pieces of protected health information (PHI), such as name, Social Security number, detailed geo-location, e-mail address, URL, and IP address. Pieces of data outside these 18 no-go variables are essentially fair game. However, according to Professor Paul Ohm of the University of Colorado Law School, it is quite possible to “re-identify” people by piecing together these non-PHI data points. At the same time, the limitations make it much harder for the data to be used for scientific studies and research. As a result, HIPAA has managed to decrease the value of healthcare data and increase healthcare costs, while not doing enough to ensure consumer privacy protection. A more effective framework would be the “use and obligation” guidelines proposed by the Centre for Information Policy Leadership, which focus on data use and the associated accountability rather than simply blacklisting data types.
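Professor Ohm’s point can be made concrete with a toy example: even with all 18 PHI identifiers stripped, the quasi-identifiers that remain (here ZIP code, birth year, and sex) can link a “de-identified” record back to a name when joined against a public roster. Every record and name below is invented for illustration.

```python
# "De-identified" records: no PHI fields, but quasi-identifiers remain.
health_records = [
    {"zip": "80301", "birth_year": 1975, "sex": "F", "diagnosis": "asthma"},
    {"zip": "80301", "birth_year": 1990, "sex": "M", "diagnosis": "flu"},
]

# A hypothetical public roster (e.g. a voter file) carries the same
# quasi-identifiers alongside names.
public_roster = [
    {"name": "Jane Roe", "zip": "80301", "birth_year": 1975, "sex": "F"},
    {"name": "John Doe", "zip": "80301", "birth_year": 1990, "sex": "M"},
]

def reidentify(records, roster):
    """Link records to names where quasi-identifiers match uniquely."""
    linked = []
    for rec in records:
        matches = [p for p in roster
                   if (p["zip"], p["birth_year"], p["sex"])
                   == (rec["zip"], rec["birth_year"], rec["sex"])]
        if len(matches) == 1:  # unique match means re-identification
            linked.append((matches[0]["name"], rec["diagnosis"]))
    return linked

print(reidentify(health_records, public_roster))
```

The attack needs no blacklisted field at all, which is exactly why a use-and-accountability framework can protect people where a fixed list of forbidden data types cannot.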
Use Data to Benefit the Consumer
Some industry members think the privacy issue is largely a nuisance, and that as long as the government is not on their case they can continue to operate business as usual. This is short-sighted. Ultimately, online data belongs to consumers. With increased media focus and the online industry’s own educational efforts, the general public is becoming much more aware of new technology, different companies’ privacy practices, and the availability of privacy-protection tools. It is not only government pressure we should worry about; losing consumers’ confidence and trust would be the most devastating outcome for businesses.
The Internet and online service industry is a great economic engine based on innovation, and the recent advances in real-time bidding and audience management technologies provide clear evidence. Unfortunately, new technologies seen within the auction market and audience-buying ecosystem are also subject to misuse and abuse. When dealing with consumer data, we all need to walk the privacy line and take up the responsibility of safeguarding it, analyze the potential privacy harm, and design innovative solutions with privacy considerations built in.