The Privacy Exchange


Original Source: Evidon Blog - http://blog.evidon.com/2012/09/04/guest-post-by-turns-joshua-koran-the-privacy-exchange/

The World Wide Web Consortium (W3C) was founded to promote the free flow of information among internet-connected devices. Historically, this organization has standardized the protocols and languages necessary to enable this seamless communication, and it has continued to evolve those standards as the web’s sophistication and ubiquity have grown. Currently, the W3C’s tracking protection working group has begun debating significant new mechanisms for users to express a global privacy preference (e.g., Do-Not-Track (DNT)). While normalizing the web’s approach to privacy could yield significant benefits to consumers and businesses alike, there is an increasing risk that the standards being discussed could dramatically restrict the free flow of information that the W3C was originally founded to foster.

There is considerable and healthy debate about what privacy policies and user rights should prevail worldwide. Which nations’ laws should the W3C codify into technical standards? Should the standards even attempt to codify national laws? These are difficult questions, and unfortunately there is also a lack of clarity about how to measure the success of the W3C working group’s output. While some advocates argue that the DNT flag should mean “do not collect” for any purpose, others argue that it should mean “do not target.” Still others question how a single flag could serve as a useful indicator of a user preference that applies universally across all websites and all countries. And if it should apply only to a limited set of websites, what alternative mechanism should users employ to express different preferences for specific websites?
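To ground the debate in the underlying mechanism: the flag in question is transmitted by the browser as a simple HTTP request header (DNT: 1) on every request, and browsers also expose the preference to page scripts. The snippet below is a minimal, purely illustrative TypeScript sketch of how a site might read that client-side signal; nothing in it is part of the proposed standard itself.

```typescript
// Read the user's Do-Not-Track preference as exposed by the browser.
// Typical values: "1" (do not track), "0" (tracking allowed), or
// null/undefined when the user has expressed no preference.
function readDntPreference(): string | null {
  // navigator.doNotTrack is the usual property; a few older browsers
  // exposed the value on window instead.
  const nav = navigator as any;
  const win = window as any;
  return nav.doNotTrack ?? win.doNotTrack ?? null;
}

if (readDntPreference() === "1") {
  // A compliant site would scale back its data collection here; how far
  // back is exactly what the working group is debating.
  console.log("Visitor has expressed a Do-Not-Track preference");
}
```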

A vocal subset of the W3C working group suggests that the interpretation of this single flag should be “do not collect,” even though such an interpretation would have profound implications for website publishers and for consumers’ ability to receive free, advertising-subsidized content. If data cannot be collected at all, publishers would be prevented from measuring the performance and popularity of their websites. Their ability to sell advertising (which subsidizes their services) would be dramatically inhibited, as advertisers would not be able to measure the reach and frequency of their campaigns. Even the risk of fraud would increase if data were restricted to a single session or website. Accordingly, the working group has discussed creating specific exceptions to user-expressed preferences that would allow for website analytics, product/service fulfillment, and fraud detection.

Other exceptions are not as widely agreed upon. One example is the first-party marketing exemption, justified on the grounds that users can choose whether or not to visit a publisher’s website and have presumptive notice of the publisher’s data collection and use practices via a posted privacy policy. For the biggest publishers, which operate large ad sales and technology organizations of their own, this exemption would preserve business as usual even for users who have set the DNT flag. But for publishers that rely on third-party services for advertising fulfillment, it could have a profoundly negative impact. Many publishers outsource sales and infrastructure activities to third-party vendors who can only provide their services by aggregating data and advertising inventory across multiple entities. If third parties cannot provide these services, the loss of that data will impact agency and advertiser reporting systems, as well as the ad exchanges and networks that enable publishers to sell their advertising.

One potential reason behind this approach is that the W3C working group is composed of organizations and individuals who can afford to participate. Unfortunately, this membership does not represent the perspective of all parties in the internet ecosystem. For example, most publishers can neither take the time to attend lengthy weekly meetings nor afford to travel to the international meetings. Consequently, the standards being proposed may not be viable for the groups that are not represented. Yahoo!, for example, recently announced that upon seeing a user-set DNT flag, it would set that user’s Yahoo! opt-out cookie. But once opt-out rates grew beyond a given level, Yahoo! could initiate a pop-up to notify users that their DNT browser preference is not compatible with receiving free Yahoo! services. Many users might then trade away their DNT preference in order to keep receiving Yahoo!’s services, especially since Yahoo! has had over a decade to build its brand. But how many users would be willing to make this same exchange with a website they had never before visited, or with a new startup without brand recognition? This illustrates why the DNT flag would not prevent large publishers from targeting users (they can simply require users to opt in), but could negatively impact smaller or start-up publishers and the third-party services that support them (who have not built the brand reputation needed to require an opt-in).
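To make that exchange concrete, here is a minimal, hypothetical sketch (Node.js/TypeScript) of a publisher translating a browser-level DNT signal into its own site-level opt-out cookie. The cookie name and response logic are invented for illustration and do not describe Yahoo!’s actual implementation.

```typescript
import * as http from "http";

// Hypothetical publisher endch point sketching the behavior described above:
// when a request arrives with "DNT: 1" and no existing opt-out cookie,
// the site sets its own opt-out cookie. The cookie name "ad_opt_out" and
// the response logic are purely illustrative.
const server = http.createServer((req, res) => {
  const dnt = req.headers["dnt"];              // browsers send the header "DNT: 1"
  const cookies = req.headers["cookie"] ?? "";
  const optedOut = cookies.includes("ad_opt_out=1");

  if (dnt === "1" && !optedOut) {
    // Translate the global browser preference into the site's own opt-out.
    res.setHeader("Set-Cookie", "ad_opt_out=1; Path=/; Max-Age=31536000");
  }

  // A publisher in the scenario above might instead respond with a prompt
  // asking the user to grant a site-specific exception before serving
  // advertising-subsidized content.
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end(dnt === "1" || optedOut ? "untargeted content" : "targeted content");
});

server.listen(8080);
```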

This gets at the heart of what enables the free flow of information. Data, and the ability to track consumer engagement with content (whether viewed online or delivered to your mailbox at home), power the advertising ecosystem. That ecosystem creates free access to a diverse set of ideas produced by millions of individual bloggers as well as by the world’s biggest publishers. Everyone enjoys and benefits from free content.

At the end of the day, virtually every person agrees that we each have a privacy right embedded in our own identity. But each of us is also a consumer of advertising-subsidized content, and that advertising relies on the use of anonymous data. The digital advertising industry’s self-regulatory guidelines strictly prohibit joining this anonymous data to a personally identifiable individual without explicit, informed, opt-in consent. Transparency and choice are two fundamental principles that underlie digital privacy guidelines. But how can users make informed choices if they don’t understand the implications of their decisions? The personal and even societal value of having free access to nearly unlimited content is hard to overstate, and the loss of it would have profound implications.

Turn fundamentally believes in the transparency and choice that are the foundation of the current digital privacy guidelines. We believe the W3C can further improve consumer privacy controls by helping to standardize the way in which these principles are codified in consumer internet-enabled applications and devices. But I hope the institution founded to ensure global, long-term access to and growth of the Web recognizes the unintended consequences of the proposed standards and refrains from instituting barriers that would disproportionately impact small publishers and reduce the diversity of opinions now supported by the use of anonymous data. Otherwise, the W3C tracking protection standard might indeed protect less than we thought and prevent more than we bargained for.

Joshua Koran
SVP, Product Management
Turn