Privacy vs. personalization: Can advertisers ward off the looming threat of a Do Not Track list?


By Thomas Claburn

Information Week
November 10, 2007


To hear consumer privacy groups tell it, online marketing is the force behind both the childhood obesity problem and the subprime lending crisis.

The Center for Digital Democracy and the U.S. Public Interest Research Group laid these and other social ills such as racial profiling at the feet of online marketers at a Federal Trade Commission meeting two weeks ago and urged the FTC to investigate and regulate online marketing. The advocacy groups singled out Pepsi, General Mills, and MasterFoods USA, part of Mars, for targeting the youth market with online ads in a complaint filed with the FTC. They also documented the extent to which some of the lenders caught up in the mortgage meltdown were among the biggest online advertisers.

Their crime? Behavioral advertising--tracking what a person does online and using that knowledge to present relevant ads.

One person's relevant ad apparently is another's manipulative marketing technique that makes use of personal data in ways that compromise privacy. "The right hand of online marketing continues to hide behind the myth of anonymity, even while the left hand of Web analytics constructs remarkably detailed mosaics out of innumerable shards of purportedly 'non-personally identifiable' information," the groups said in their complaint.

The charges reflect growing concern from privacy advocates about the reams of personal data online marketers of all stripes are compiling, how they use that data to personalize communication with consumers, and what the advocacy groups say is a failure of self-regulation to put controls in place to protect personal data and preserve privacy.

The call for regulation is getting louder. At the FTC meeting, a coalition of privacy groups went so far as to propose a "Do Not Track" list, similar in concept to the FTC's "Do Not Call" list, to let consumers opt out of behavioral ad targeting.

Talk of a Do Not Track list is a clear red flag that it's time for online marketers to address consumer concerns about privacy and manipulation. If they don't, others--regulators and legislators--could step in.


Some companies are already taking the issue seriously. AOL recently launched a program to educate consumers about online behavioral targeting and said it would provide access to improved technology for opting out of personalized ads. "We want to make the opt-out process as simple and transparent as possible," says Jules Polonetsky, AOL's chief privacy officer.

Google, Microsoft, and Ask also are changing the way they handle usage data. Google has pledged to make the data it collects anonymous after 18 months and to expire cookie files after two years. Microsoft has come out with a privacy policy that gives users more control over the data it collects on them. Ask has gone even further, pledging to anonymize Web searches and cookies after 18 months and to give users the ability to delete their search histories using a tool called AskEraser.
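In the spirit of the 18-month pledges described above, a retention-based anonymization pass might look like the following sketch. The field names, the IP-truncation step, and the exact window are illustrative assumptions, not a description of Google's or Ask's actual process.

```python
import datetime

# Entries older than the retention window lose their identifying
# fields; newer entries are kept intact for personalization.
RETENTION = datetime.timedelta(days=18 * 30)  # roughly 18 months

def anonymize_old_entries(log_entries, now):
    """Strip identifying fields from entries older than the window."""
    out = []
    for entry in log_entries:
        entry = dict(entry)  # don't mutate the caller's data
        if now - entry["timestamp"] > RETENTION:
            # Truncate the IP's last octet and drop the cookie id,
            # so the entry can no longer be tied to one user.
            entry["ip"] = ".".join(entry["ip"].split(".")[:3]) + ".0"
            entry["cookie_id"] = None
        out.append(entry)
    return out
```

The key design point is that anonymization is a scheduled, irreversible transformation of stored logs, not a per-request decision.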

But thanks to ad networks like DoubleClick, aQuantive, and AOL's Tacoda, as well as installed browser toolbar software, companies now get unprecedented insight into consumer behavior across the Internet, rather than merely at a single site. Consumers are often complicit in the data collection--willing to exchange personal information for free services, easy credit, discounts, upgrades, points, and rewards of all sorts. For example, many Web users agree, deliberately or accidentally, to let companies like Amazon and Google record their purchases and searches in exchange for personalized product recommendations or improved search result relevance.

Data tracked can include technical information about the person's browser, operating system, other software, and hardware; browsing behavior; past visits to a site; geographic information based on the user's assumed or declared location; content preferences; and just about any other detail that can be associated with a user or derived from his activities.

Next steps include tying in data from mobile devices to track users' physical whereabouts and target ads accordingly. And social networks are offering up a wealth of personal data contained in users' profiles, activities, and communications. Companies like MySpace and Facebook are experimenting with ways to monetize that data and give other companies access to it for ad targeting purposes (see story, "Social Networks Find Ways To Monetize User Data").

The data collected online for behavioral targeting is usually considered anonymous. Cookies and other tracking technology are used to keep tabs on Web site visitors, obviating the need for personally identifiable data such as names, addresses, and Social Security numbers. In isolation, the data seems of little consequence. But in aggregate, it's enormously valuable, letting companies target individuals with tailored ads and create segmented marketing groups with detailed profiles.

It's also this aggregation of the many small pieces of innocuous data that can invade privacy. Today's data mining and analytics technologies make it possible to identify people by cross-referencing information sources that don't permit identification when considered alone. All those little pieces of data can be brought together to create revealing profiles of the people being tracked.
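The cross-referencing described above can be sketched concretely. In this hypothetical example, a browsing log containing no names is joined against a public record on shared quasi-identifiers (ZIP code and birth year); all of the data is invented for illustration.

```python
# An "anonymized" browsing log: no names, only quasi-identifiers.
browsing_log = [
    {"zip": "02138", "birth_year": 1970,
     "searches": ["mortgage rates", "diet plans"]},
    {"zip": "90210", "birth_year": 1985,
     "searches": ["concert tickets"]},
]

# A public record (e.g. a voter roll) that does carry names.
public_records = [
    {"name": "J. Doe", "zip": "02138", "birth_year": 1970},
    {"name": "A. Smith", "zip": "90210", "birth_year": 1985},
]

def reidentify(log, records):
    """Attach a name to each log entry whose quasi-identifiers
    match exactly one public record."""
    results = []
    for entry in log:
        matches = [r for r in records
                   if r["zip"] == entry["zip"]
                   and r["birth_year"] == entry["birth_year"]]
        if len(matches) == 1:  # unique match: identity recovered
            results.append((matches[0]["name"], entry["searches"]))
    return results
```

Neither dataset identifies anyone alone; the join does, which is why "non-personally identifiable" is a weaker guarantee than it sounds.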

In theory, that's where privacy policies kick in to protect consumers. But companies don't always act in accordance with their policies, and consumers often don't bother to read them and assume privacy to mean something more than the pseudo-privacy typically granted. Moreover, the distinction between personally identifiable information, which gets legal protection, and nonidentifiable information, which gets less protection, is becoming less clear. For instance, anonymized search data can be linked with an individual if the searcher enters a query with his or her name and other data corroborates the identity.


Many companies still operate in a more black-and-white world where data is either anonymous or not. "If behavioral targeting is anonymous, by definition it doesn't raise any privacy concerns," says Peter Fleischer, Google's chief privacy counsel. "If, on the other hand, it's based on personally identifiable information, then we would apply, and I think the entire industry should apply, the core privacy principles" of notice and consent, Fleischer says.

Google has always been reasonably straightforward about the data it collects and how it uses it.

"The way we see it, it's really a partnership with the user," Fleischer says. "As long as we're transparent about how it works, what we're collecting, how it's used, and give the user complete control of that information, we think that respects core privacy principles."

Personalization as practiced by Google is fairly benign. With its Web History capability, the company offers personalized search results to users who have Google accounts and elect to have their search history tracked. For users who make that choice, Google saves information about their searches and the sites they visit and uses it to make search results more relevant to their interests.

But personalization and privacy are getting more complicated for Google as it undergoes FTC scrutiny for its planned acquisition of DoubleClick, an ad network that tracks users across a broad range of sites, making it possible to assemble a comprehensive profile of a person's interests.

"What's coming under question, potentially challenged by some, are these third-party cookies," says Scott Eagle, COO of Claria, an online content personalization company, formerly known as Gator, that has gone from being a reviled company to one that privacy experts say has reformed its practices. "And, of course, with Google-DoubleClick, the fear is that all the cookie information collected by DoubleClick is going to Google," giving it control over vast quantities of information about what people do online.


They may not be doing it at the level of DoubleClick and Google, but the vast majority of companies are doing or are interested in doing behavioral targeting. U.S. companies spent $350 million on this type of marketing last year and are expected to spend $1 billion on it in 2008, according to eMarketer. A 2005 Forrester Research survey of 253 marketers found that 38% were already conducting behavioral marketing campaigns while another 36% planned to begin such campaigns in 2006.

Personalization creates "huge value," says Eagle. "The amount of engagement goes up as much as three-fold."

The value of behavioral targeting depends upon the context in which targeted ads are used, according to Suranga Chandratillake, CEO of video search company Blinkx. There's not a lot to be gained by running targeted ads when someone is watching entertainment like a music video, he says, but a home improvement ad aimed at a viewer watching informational content about home repair might perform as much as 100% better.

The privacy concerns aren't new. There's long been "a push-pull relationship between the people who want to target ads and the people who are looking at the ads and want to protect their privacy," says Chris Sherman, executive editor of Search Engine Land. "We seem to have these waves of concern," Sherman says. When DoubleClick bought offline direct marketing firm Abacus Direct, there was a huge outcry about the dangers of linking online and offline information, he notes, but when Google announced late last month a deal with Nielsen to collect demographic information on TV viewers, little was said.

Beyond an emerging sense of inevitability that more data will be collected and combined, Sherman sees a tentative trust developing between consumers and custodians of data. "We've learned over time that these companies aren't going to abuse the information they get," he says. "When it's done properly, the targeted advertising can really be useful because we end up seeing the stuff that we really want to see and get less of the stuff that's irrelevant."


While consumers wrestle with the trade-offs between privacy and personalization, between tin-foil hat paranoia and data nudism, companies are faced with the challenge of maintaining consumer trust while amassing valuable but potentially damning data. This is not merely a matter of managing behavioral advertising data but of data governance in general.

"If you're going to do personalization, if you're going to use real data about people, and maybe private or sensitive information, you have an obligation to do basic blocking and tackling in privacy or more than that," says Larry Ponemon, chairman and founder of the Ponemon Institute, a research group that focuses on privacy and management practices. "It could be a big reputation problem if you're in the behavioral marketing industry and you don't take stock of privacy and information security concerns."

Gaining and keeping consumer trust at the intersection of online personalization and privacy isn't easy.

"We're entering a place where user-driven content and individuals have tremendous economic power to participate online, to create an economy and to walk away from services that they think are violating their human rights or economic rights," says Michelle Finneran Dennedy, Sun Microsystems' chief privacy officer. "That's where this is all coming to a head."

Dennedy professes faith in technical solutions. She hopes that the growing debate over privacy and data leads to a coherent vision of what consumers and companies can agree on, and that companies like Sun can design products to meet those needs.

A Do Not Track list is one way to give consumers control over their data. But it could be overkill. Certainly, an all-or-nothing approach like the Do Not Call list would put a damper on online advertising spending, which totaled $16.4 billion in the United States last year and is expected to reach $36.5 billion by 2011, according to eMarketer. The list being proposed would be administered by the FTC and would affect only targeted ads. It would require companies that set persistent identifiers--such as cookies--to register their servers with the FTC.

There are already ways to opt out of online ads. The Network Advertising Initiative, a marketing trade group, offers a list to opt out of targeted ads from some of the major ad networks. AdBlock Plus and Customize Google, two Firefox extensions, provide users with a "do not show me ads" option. Internet Explorer's IE7Pro includes an ad blocking feature. And people really into not being tracked can set their browsers not to accept cookies or to delete them, and they can use a proxy to surf the Net. Not using cookies, however, limits functionality on many sites, and many consumers aren't aware of these options or proactive enough to use them.


A Do Not Track list isn't inevitable. With the right security technology and a forthright approach to customers, the hue and cry might just die down. Businesses must demonstrate to customers that they aren't out to do evil--to manipulate or surreptitiously grab data that customers don't want them to have.

One approach is to give customers control over the data being collected. Ask is doing that, as is Microsoft, which is letting customers opt out of behavioral ad targeting conducted by Microsoft's network advertising partners.

"You can't group customers all together. There are varying perspectives," says Brendon Lynch, senior privacy strategist in Microsoft's Trustworthy Computing Group. "Some people are willing to give up all sorts of details for a personalized service, and other people are very wary. So being able to give control to them ... is key."

Another approach is for companies to engage customers in a discussion of their privacy policies. But too often, that discussion revolves around privacy statements and is along the lines of what Verizon did last month when it informed customers that they would have to opt out if they wanted to keep the company from sharing data--other than names, addresses, and telephone numbers--about their network usage (call duration, destination, etc.).

Sharing that information, Verizon says, will let it "better offer and provide" a full range of communications-related products and services. But it's widely believed the company made this change to facilitate plans to deliver ads to mobile phones. A more respectful approach would have been to let customers opt in.

Privacy policies aren't the only way to talk to customers about the security of their personal data. Microsoft also builds privacy choices into the software user interfaces, Lynch says. Windows Media Player, for instance, has what the company calls "just-in-time notice and consent." The first time users run it, they get information on the various privacy controls. "It achieves a customer education goal, but it does so in a more prominent way," he says.

Google, too, is experimenting with alternatives to privacy statements, using video to explain privacy choices to users, Fleischer says.

Beyond providing more active privacy discussions with customers, companies would do well to be more forthright about what data they collect and whether they're aggregating it in ways that could be traced back to individuals. Any aggregated data that potentially still could be traced back to an individual should be subjected to data security policies and practices as if it were personal data. That could go a long way toward reassuring consumers that their personal information is safe.

When it comes to personally identifiable data, the use of thin-client technologies, where the information is stored on the back end, protected by IT professionals, is effective, Sun's Dennedy says. Sensitive information like credit card data shouldn't be stored on retailers' servers, but rather on MasterCard's servers, she says.

Even more effective when it comes to the type of data collected for behavioral targeting is data segmentation, which lets companies manage risk as well as make consumers more comfortable, Dennedy says. Information about consumers can be parceled out on a need-to-know basis using federated identity technology, which dynamically assembles a customer identity from multiple IT systems, and role-based access, which limits the data provided based on the role of the querying entity. A hotel, for example, doesn't need to know a traveler's airline seat preferences and an airline doesn't need to know where a traveler will be spending the night, though the traveler himself can and should see that information in one place.
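The role-based access Dennedy describes can be illustrated with a minimal sketch. The profile fields and role-to-field mapping below are assumptions for the hotel/airline example, not any vendor's actual schema.

```python
FULL_PROFILE = {
    "name": "Traveler X",
    "airline_seat_pref": "aisle",
    "hotel_room_pref": "non-smoking",
    "credit_card": "****-1234",
}

# Which profile fields each role is entitled to see.
ROLE_VIEWS = {
    "airline": {"name", "airline_seat_pref"},
    "hotel": {"name", "hotel_room_pref"},
    "traveler": set(FULL_PROFILE),  # the user sees everything
}

def view_for(role, profile):
    """Return only the fields the querying role needs to know."""
    allowed = ROLE_VIEWS.get(role, set())
    return {k: v for k, v in profile.items() if k in allowed}
```

The hotel's query never returns the seat preference or card number, while the traveler still sees the whole profile in one place, which is the need-to-know parceling the article describes.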

That's how Microsoft is offering personalization and privacy at the same time, Lynch says. "When we personalize our online services--search and advertising in particular--we segregate personal information from any data that we use to target advertising," he says. Microsoft uses cryptographic techniques to separate that data so that it can't be connected to the specific customer, he adds.

Another company that manages data-driven marketing without privacy complaints is NebuAd. It started a year ago in an environment where all of AOL's search terms had become public and the government was subpoenaing clickstream data from Verizon and AT&T. "We resolved to come up with an architecture where we wouldn't have any risk of any of those unfortunate things happening to us," says CEO Bob Dykes.

NebuAd uses data segmentation to provide behavioral ad targeting without maintaining data about Internet users. It provides proprietary hardware to ISPs under an ad-revenue-sharing arrangement, monitoring users' interests and serving relevant ads without tracking users in a personally identifiable way. Instead, it generates a code called a one-time hash with which to associate user interests. "We can see that same user coming back onto the Internet later, but we have no idea who it is," explains Dykes. "There's really no benefit to us to have more detailed information about the user."
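The article doesn't detail NebuAd's hashing scheme, but the general idea of pseudonymizing an identifier with a keyed hash can be sketched as follows. The key name and the choice of HMAC-SHA256 are assumptions for illustration.

```python
import hashlib
import hmac

# Held only by the ad system; rotating it severs old pseudonyms
# from new ones, limiting how long a profile can accumulate.
SECRET_KEY = b"rotate-me-periodically"

def pseudonym(user_identifier: str) -> str:
    """Map an identifier (e.g. an IP address) to an opaque token."""
    return hmac.new(SECRET_KEY, user_identifier.encode(),
                    hashlib.sha256).hexdigest()
```

The same user always maps to the same token, so interest data accumulates against it, but without the key the token cannot be reversed to the identity, matching Dykes's "we can see that same user coming back ... but we have no idea who it is."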

That has real appeal to Tom Soevyn, CEO of direct response agency Focalex, which is running ad campaigns through NebuAd. "Frankly, I don't want my company's name, or any of the companies I work with, getting tied up in someone saying 'Big Brother' or 'Someone's tracking my behavior,'" he says. "They're just looking at data streams. They don't know who that consumer is or anything about that consumer, so that gives me a sort of comfort level."

It's that comfort level that every company engaged in behavioral targeting needs to shoot for. Engaging in ongoing dialogues with customers about the data being collected and how it's used, letting customers control the data that's being collected and the level of privacy they want it subjected to, and showing them that it's securely locked down will go a long way toward avoiding movements that lead to a blanket Do Not Track list.
