As we look ahead to the new year, one thing is clear: In 2016, advertisers and marketers will need to up their game when it comes to connecting with the end user. The challenge? Creating experiences that respect individual privacy rights and keep the end user engaged.
So it’s a good time to take a step back and think about whether your existing data practices deepen or erode trust with your users. And remember: you will need to figure all of this out at a time when individuals are opting out of online ads in record numbers and the rules of the game continue to change drastically.
Here are the major privacy takeaways from 2015:
Consumers, Privacy Attitudes & Ad Blockers
In 2015, consumer privacy concerns evolved from a generalized fear of government surveillance and commercial tracking to specific concerns about how personal data is collected, used, and retained.
In a March 2015 Pew Research Center survey, participants revealed that while they were still concerned about government surveillance in the post-Snowden era, they were equally concerned about data collection and use by private, commercial actors. More pointedly, Pew participants showed a marked distrust of online advertising: 76% of surveyed adults said they were not “completely confident” that “records of their activity maintained by the online advertisers who place ads on the websites they visit will remain private and secure.”
Is there a link between the attitudes expressed in the Pew survey and the biggest concern facing digital media in 2015—namely the meteoric rise and adoption of ad blocking technology?
The numbers suggest that it is. Digital Content Next (DCN), the trade association for online publishers, published survey results finding that over one third of U.S. consumers planned to try ad blockers in the next three months, and that about half of that group would eventually opt out of interest-based advertising altogether.
And these numbers are likely to keep climbing now that Apple’s iOS 9 includes a feature that lets users enable mobile ad blocking. The user experience is a bit clunky, because you have to download a separate ad blocker app to take advantage of the feature, which is probably one reason ad blocker apps were downloaded in record numbers after the iOS 9 release.
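Under the hood, those iOS 9 ad blocker apps work by handing Safari a JSON list of content-blocking rules, each pairing a “trigger” (what to match) with an “action” (what to do). Here is a minimal sketch of what such a rule list looks like; the ad domain and CSS selector are hypothetical examples, not a real filter list:

```python
import json

# A minimal Safari content-blocker rule list. "adserver.example" and
# ".banner-ad" are made-up examples for illustration only.
rules = [
    {
        # Block any resource whose URL matches this pattern.
        "trigger": {"url-filter": "adserver\\.example"},
        "action": {"type": "block"},
    },
    {
        # Alternatively, hide matching page elements instead of
        # blocking the network request.
        "trigger": {"url-filter": ".*"},
        "action": {"type": "css-display-none", "selector": ".banner-ad"},
    },
]

print(json.dumps(rules, indent=2))
```

The blocker app bundles a list like this; Safari then applies the rules itself, which is why the browsing data never has to pass through the third-party app.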
TUNE’s CEO, Peter Hamilton, actually sees an upside to ad blocking: consumers who block your ads are essentially telling you what isn’t working for them. Take a look at his tips for rethinking the digital ad experience in this December MediaPost article.
FTC Continues to Signal Strong Enforcement in 2015
2015 was a busy and significant year for the FTC.
The agency had a notable win when the Third Circuit affirmed its data security authority in a high-profile case involving Wyndham hotels (see my September blog post here for more on the Wyndham case and its potential impact). But it also suffered a setback when an administrative law judge dismissed its data security case against LabMD, finding that the agency had failed to prove an essential element of an FTC unfairness claim: that the lack of security practices caused, or was likely to cause, “substantial injury to consumers.”
The FTC continued its record of being the world’s most active privacy enforcer, bringing actions based on statutes like the Children’s Online Privacy Protection Act (COPPA) and the Fair Credit Reporting Act (FCRA), as well as under the agency’s “deception” and “unfairness” authority pursuant to Section 5 of the FTC Act. The FTC also pursued several policy initiatives, including a workshop on cross-device tracking (see the TUNE recap here) and the release of agency guidance on native advertising.
One of the more significant developments for the app ecosystem was the FTC’s December 2015 actions against two mobile app developers for violations of COPPA. These are among the first enforcement actions based on the sharing of technical or “persistent identifiers,” such as an IDFA or an IP address, between an app developer and an ad network. With these actions, the FTC established that persistent identifiers such as an advertising ID or IP address constitute “personal data,” the collection, use, or sharing of which triggers COPPA liability.
Changes in EU Data Protection Rules
2015 saw some major revisions to important privacy requirements in Europe.
In October, the European Court of Justice invalidated the U.S.-EU Safe Harbor Framework, the primary means by which companies transfer personal data for commercial purposes from the EU to the U.S. (more in this update from the Hogan Lovells privacy team). U.S. and EU regulators have until the end of January to agree on a new framework that complies with the requirements set out in the Court of Justice opinion.
In December, after nearly four years of negotiation, the European Commission, Council, and Parliament agreed on a revised EU data protection law, the General Data Protection Regulation (“GDPR”), that will impose significant obligations on companies collecting data from EU end users. Unlike current EU law under the 1995 Data Protection Directive and the ePrivacy Directive, the GDPR is a regulation that will come into force without the need for additional implementing legislation by EU member states. It will become enforceable as EU law sometime in 2018, which means you still have time to assess the potential impact of GDPR requirements on your business.
Here are four of the major developments that you should be thinking about:
1. Pseudonymous data is now personal data, forever.
Under the GDPR, the definition of personal data has been broadened to include technical identifiers like ad IDs and IP addresses, bringing EU law in line with current FTC thinking (as discussed earlier in this update). The new law also has specific requirements for companies that store data in “profiles.” Furthermore, the GDPR treats personal data as always retaining its personal nature, even after it has been hashed or “pseudonymised.” As a result, the consumer data protection rights that traditionally applied to personal data under EU law (e.g., notice, opt-out, deletion, retention) will now apply to hashed data as well.
Requiring that data be hashed is already an emerging best practice for storing or transferring data used for marketing purposes. So, the industry should work together to articulate and develop a common standard for securing pseudonymous data, particularly as the GDPR provides for industry “codes of conduct” to address these types of issues. We already see some great work in this regard from organizations like IAB UK and the Future of Privacy Forum and look forward to even more discussion in 2016 (note: TUNE works with both organizations; we are also a member of FPF and sit on their advisory board).
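To make the pseudonymisation point concrete, here is a minimal sketch of the kind of salted hashing commonly used before storing or transferring an advertising ID. The IDFA-style value and salt below are made up for illustration; the key takeaway is that the output is pseudonymous, not anonymous, so under the GDPR it remains personal data:

```python
import hashlib

def pseudonymise(ad_id: str, salt: str) -> str:
    """Hash an advertising ID with a salt using SHA-256.

    The digest is pseudonymous rather than anonymous: anyone holding
    the salt can re-compute the hash for a known ID, which is why the
    GDPR treats the output as still being personal data.
    """
    return hashlib.sha256((salt + ad_id).encode("utf-8")).hexdigest()

# Hypothetical identifier and salt, for illustration only.
hashed = pseudonymise("6D92078A-8246-4BA4-AE5B-76104861E7DC", salt="per-app-salt")
print(hashed)  # a 64-character hex digest
```

Note that the same input always yields the same digest, which is what makes hashed IDs useful for matching across systems, and also why they stay linkable to an individual.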
2. Increased liability for data processors
Under current EU law, data controllers who determine how data will be processed retain primary liability for data protection violations. Data processors like TUNE, who process data at the request of data controllers, do not have primary liability assuming they follow certain requirements (usually memorialized in a “data processing addendum” between the controller and the processor).
However, the GDPR will increase liability for data processors, including potential joint and several liability for violations such as unauthorized access or a data breach. In some cases, a data processor could find itself primarily liable, especially if deficiencies in its processing led to the violation in question or if the processor acted more like a data controller in a particular instance. In addition, data processors must demonstrate “accountability.” All of this becomes even more significant when you consider that, under the GDPR, regulators can assess penalties of up to 4% of a company’s global annual turnover.
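To put that 4% figure in perspective, here is a quick back-of-the-envelope calculation; the turnover amount is a hypothetical example, not any particular company’s figure:

```python
# Maximum GDPR penalty: up to 4% of global annual turnover.
# A hypothetical company with EUR 500M in annual turnover:
annual_turnover_eur = 500_000_000
max_penalty_eur = 0.04 * annual_turnover_eur

print(f"Maximum exposure: EUR {max_penalty_eur:,.0f}")  # EUR 20,000,000
```

Even for a mid-sized business, the ceiling runs to tens of millions of euros, which is why processors now have a direct financial stake in getting compliance right.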
3. Consent is required for most types of “profiling”
The new EU law requires that you obtain an individual’s consent before building any sort of profile on that user (or their devices); the only exceptions are for crime prevention and detection. Although earlier drafts discussed an unworkable standard of “explicit” consent, the final version of the law settles on a standard of “unambiguous” consent. It remains to be seen what that level of consent will actually require for analytics or other big data activities, but this is another question that industry can hopefully answer through a code of conduct or similar stakeholder process.
The GDPR will also include a children’s privacy provision similar to the U.S. COPPA law, although the specific requirements remain unclear. The GDPR sets the age of consent at 16, but individual EU member states can lower it to 13 (currently the age of consent under U.S. COPPA).
The GDPR will greatly impact how companies do business with EU citizens. Many of the requirements—around consent, the treatment of pseudonymous data and children’s data—have yet to be determined. However, the new law also anticipates that industry will play a role through codes of conduct, certifications, and other mechanisms. This means that the privacy think tank and industry association communities have a strong role to play as they help companies navigate the new requirements and propose workable standards to comply with the new law. TUNE looks forward to working with our existing association partners—the Future of Privacy Forum, the eDAA, and the Privacy Law Salon—on some of these efforts in the new year.
What’s Your Game Plan in 2016?
With all of this in mind, it’s a good time to start thinking about how you plan to up your game in 2016. A new year brings opportunity to re-engage users and re-engineer compliance, especially with new GDPR requirements looming on the horizon.
At TUNE, we’ll be continuing our work to educate and inform in 2016 through our newsletters, blog posts, and other data- and privacy-focused events. We’ll also be reaching out to see if like-minded companies are interested in partnering with us on an education campaign about the ad-funded Internet, how it works, and the benefits it provides (most notably, access to services like Gmail or Facebook). We believe that many of the privacy concerns about online data collection can be addressed through targeted efforts focused on this type of end user and stakeholder education.
Are you interested in working with us on these efforts? Please drop us an email; we’d appreciate hearing from you.
All the best for a very happy and prosperous 2016!
Note: This piece is intended to showcase my views on certain privacy happenings in 2015; it is not intended nor should it be construed in any way to be legal advice. Thanks for taking a read :-).
This post was provided by a guest contributor.