Author: Ben Kneen

Data Management Platform: Centralize Your User Data

A critical component of any data management platform (DMP) is the ability to centralize your audience data from multiple systems into a single interface.  DMPs do this through a NoSQL database management system that imports your data from multiple systems, joined on a match key between each system that they form via, what else, a cookie sync.  It sounds complicated but it isn't.  Let's take an example from the marketer side to explain the concept.

Identity Syncing and the Data Management Platform

Say you run a large eCommerce store and want to create audience-based marketing campaigns around different customer groups.  You send a weekly newsletter with a few hundred thousand subscribers, you have a site analytics tool, you have an order management database or other CRM system, and you buy media through a Demand Side Platform (DSP). Each system fulfills a specific business need, but generally speaking they operate in parallel and do not talk to each other. So there's no way for you to specifically target users on your DSP who are also signed up for your newsletter, or who are signed up for your newsletter and have also visited three or more pages in the mystery novels section of your site in the past 30 days.  You have a site analytics cookie on the user's machine, but no newsletter cookie, and even if you did, how would you identify the same user in both systems?  To get your newsletter system to talk to your site analytics system and push that information to your DSP for future media campaigns, you need a way to identify the same user across systems.  This is where the data management platform comes in.
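To make the match-key idea concrete, here is a minimal sketch, with entirely invented IDs and data, of how a DMP might join newsletter and analytics identities once a cookie sync has produced a match table. Every name and structure here is hypothetical, not any vendor's actual schema:

```python
# Hypothetical sketch: each system knows the user only by its own ID.
newsletter_subscribers = {"nl_001", "nl_002"}          # newsletter-system IDs
analytics_pageviews = {                                 # analytics-tool IDs
    "an_9001": {"mystery_novel_pages_30d": 4},
    "an_9002": {"mystery_novel_pages_30d": 1},
}

# The cookie sync yields a match table: one DMP ID per user,
# mapped to that user's ID in each connected system.
match_table = {
    "dmp_A": {"newsletter": "nl_001", "analytics": "an_9001"},
    "dmp_B": {"newsletter": "nl_002", "analytics": "an_9002"},
    "dmp_C": {"newsletter": None,     "analytics": "an_9001"},
}

def build_segment(match_table, subscribers, pageviews, min_pages=3):
    """Return DMP IDs for newsletter subscribers with >= min_pages
    mystery-novel pageviews in the window, ready to push to a DSP."""
    segment = set()
    for dmp_id, ids in match_table.items():
        nl_id, an_id = ids["newsletter"], ids["analytics"]
        pages = pageviews.get(an_id, {}).get("mystery_novel_pages_30d", 0)
        if nl_id in subscribers and pages >= min_pages:
            segment.add(dmp_id)
    return segment

print(build_segment(match_table, newsletter_subscribers, analytics_pageviews))
# Only dmp_A qualifies: subscribed, and 4 mystery-novel pages in 30 days
```

The point of the match table is that neither source system had to change; the DMP resolves identities and the joined segment becomes addressable in the DSP.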

Data Management Platform: What is a DMP?

If you're working in digital advertising today and not losing sleep over your data management strategy (or lack thereof), climb out from under your rock and join the rest of us trying to figure out how to leverage the mountain of consumer intent and behavior data collecting on the doorstep each day. From both the marketer and publisher perspective, data isn't the problem; access is the problem.  Each party has access to vast amounts of data, either directly or through third-party channels, but centralizing, organizing, analyzing, and segmenting it is very difficult for all but the largest companies.  Unless you have a pedigreed team that speaks SAS and Oracle, understands how to use an IBM supercomputer, or has a team of PhDs on the payroll, building your own solution to this problem just isn't realistic.  It just doesn't exist in the DNA of most advertising companies today, at least not yet.

Managing Data Leakage

Like any issue, the first step in managing data leakage is admitting it is a problem and understanding how it happens.  It sounds obvious, but try getting a large organization to commit to implementing a data leakage policy at the potential cost of ad network revenue and upset clients, and you'll find you have your work cut out for you.  Once you have buy-in from internal stakeholders and understand the issue from a technical perspective, you can start to craft a policy around controlling advertiser access to your audience.  Below are my recommendations for setting up a foundational policy on managing data leakage from the publisher side.

First, as a policy, you should prohibit advertisers from dropping cookies on your users – it is a business liability and may even violate your privacy policy. This will address the primary data leakage channel, but it is also the toughest internal hurdle to clear because it will likely anger your clients, so be prepared for a potential fight with your sales organization. They may resist anything that will upset clients (and clients will probably be upset – after all, they most certainly don't want the gravy train of free data to end), but once you explain that this technical issue is selling against them by commoditizing the site audience, they will probably get on board. You'll also need a way to enforce this policy, which can be tough if you are lean on development talent. Larger publishers usually have some programmers handy who can build an internal solution, and a few of the SSPs out there have also devised solutions that monitor ad tags for cookie dropping through random sampling. That is to say, they have a bot call every ad tag on a regular basis and check whether any cookies are dropped via that ad call. Some Ops people may ask, 'why not just check for cookie-dropping during manual QA, before the tag goes live?'  The reason a single look at the tag isn't enough is that advertisers may not get around to adding piggybacked pixel fires to their ad calls – or may wait by design – until after their tag is up and running. That means you might not catch the issue during normal QA, so you need a way to monitor every third-party ad call on your site on a daily basis.
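The bot-sampling approach described above can be sketched roughly as follows. This is an illustrative toy – the allow-list, domains, and captured cookies are all invented, and a real monitor would render each tag in a headless browser rather than work from canned data:

```python
# Hypothetical allow-list: only our own ad server may set cookies.
ALLOWED_COOKIE_DOMAINS = {"adserver.example.com"}

def audit_tag(observed_cookies):
    """observed_cookies: (domain, cookie_name) pairs captured while the
    ad tag executed. Returns any domains that set a cookie without
    being on the allow-list - i.e., potential data leakage."""
    return sorted({domain for domain, _name in observed_cookies
                   if domain not in ALLOWED_COOKIE_DOMAINS})

# One sampled execution of a third-party ad tag:
sampled = [
    ("adserver.example.com", "session"),      # our own ad server: fine
    ("pixels.dataexchange.example", "uid"),   # unknown domain dropping an ID
]
print(audit_tag(sampled))   # ['pixels.dataexchange.example']
```

Running a check like this against every live tag daily – not just once at QA time – is what catches pixels that are piggybacked on after launch.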

Second, look into a tag management platform or data management platform (DMP) such as Demdex, RedAril, or Krux to help blind the pixels you know about and do want to fire on your audience.  The reason: even if you prevent cookie-dropping, plenty of data exchanges and data collection companies can still use JavaScript to scrape page content, record page URLs, and semantically categorize page content. It won't be as useful without a cookie, but it is still valuable.  Even worse, you may be inadvertently sharing user-level information in your URL string, as many of the social media companies were found to have done in a 2010 WSJ article.  This will be much more difficult if the only referring URL they can see is from your ad server or a third party managing those pixel requests. At the very least, consider adding a universal container to your site so you can control on-site pixels through your ad server. The major data management platforms offer this as a standard part of their service, but you can also create one yourself through an additional iFrame-based ad call placed in the header. The benefit of a universal container is that you won't need to rely on your IT department to add and remove pixels from the site, or worry about getting on a site release schedule. In general, DMPs will allow you to offer advertisers an alternative, safe way to access your audience that you can control (and get paid for).
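Here is a toy model of the universal-container idea, with hypothetical URLs throughout: the ad server (not the page) owns the pixel roster, and third-party calls are relayed through a proxy so vendors see the relay, not your content page, as the referring URL:

```python
from urllib.parse import quote

# Hypothetical relay endpoint on the publisher's ad server.
PROXY = "https://pixels.adserver.example.com/relay?to="

# Pixel roster managed in the ad server, not hard-coded in site pages:
active_pixels = {
    "analytics": "https://stats.vendor-a.example/px",
    "retarget":  "https://tag.vendor-b.example/px",
}

def container_calls(enabled):
    """Return the proxied pixel URLs the container should fire,
    given the set of pixel names currently enabled."""
    return [PROXY + quote(url, safe="")
            for name, url in sorted(active_pixels.items())
            if name in enabled]

for url in container_calls({"analytics"}):
    print(url)
# Only the enabled pixel fires, wrapped in the ad-server relay.
```

Because the roster lives in the ad server, turning a vendor off is a config change rather than a site release, which is exactly the operational benefit described above.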

Third, find out what your advertisers want and how you can add value. If the advertiser wants to retarget your audience, why not offer an audience extension product of your own and retarget the audience yourself through the ad exchanges? Both SSPs and DSPs have ways for publishers to productize their audience off-site, keep themselves in the value chain, grow share of budget, and offer advertisers the expanded reach and frequency they want to achieve. Auto and career sites in particular pioneered the publisher-powered audience extension model, so look to those companies as a model for your own business.

Data can be a business liability or a major opportunity for publishers who choose to manage their destiny. Tools and partners exist to help you and in most cases, advertisers will be happy to work through the channels you enable.

Read the other articles in this series –

Part I: A Primer on Data Leakage for Digital Publishers

Part II: Audience Analytics Lights the Data Leakage Fuse

Part III: The Cost of Data Leakage


The Cost of Data Leakage

If you are a publisher that depends on advertising dollars to fund your operations, data leakage is a critical threat to your bottom line.  If you remember nothing else from this post, remember this – data means audience, and audiences are what advertisers pay to reach.  If they can reach them without buying expensive content adjacency, they will.

Reaching a specific audience used to be hard.  Really hard.  That's not to say you couldn't buy it – any number of vertical publishers were happy to sell you millions of impressions if you wanted – but you needed deep pockets, and what advertisers want most after reaching a target audience is to scale it to the hilt for the lowest possible cost. Anyone who doubts that can look to the meteoric rise of ad networks and programmatic exchange buying, which has rocketed to a double-digit chunk of display industry spend in just over a year.  Cost is a major factor driving that.

That's not to say expensive sponsorships and content adjacency are stupid or a waste of money – far from it – but content adjacency is usually a proxy for an audience, reached at scale in an operationally efficient manner, in the right frame of mind to drive brand awareness and recall.  Splashy sponsorships and content adjacency are what we call top-of-the-funnel strategies, and they are expensive because it is incredibly difficult to attract a large audience looking to research a certain brand of car, or an HDTV, or their 401K allocation.  Vertical sites can charge a premium because it is not easy to build a deep, engaged, and reliably large audience. Advertisers are very aware of this.

By allowing advertisers to cookie users via pixel fires out of an ad tag, publishers are enabling their clients and ad network partners to remove them from the value chain.  If an advertiser can build a cookie pool on a publisher’s audience, it can readily retarget that audience for a much lower cost on the ad exchanges by using either a DSP or an Ad Network.  From the advertiser perspective this is a great way to extend reach, lower costs, and drive ROI.  The benefits are so great that it would seem absurd not to try, as if the publisher had simply left an unattended briefcase full of money outside the agency’s door.  Publishers without a way to secure their data are pretty much asking to have their audience filtered away from them.

A cookie pool on the loose has a number of negative impacts – first, it erodes the value of the publisher’s audience by allowing advertisers to access it through cheaper channels.  Publishers make enormous investments in technology and quality editorial to attract their audiences, which eventually becomes a competitive edge.  There is a long list of vertical publishers that have cornered the market in their chosen topic over years of hard work, and a marketer willing to pay premium CPMs to reach that audience is the reward.  If the advertiser doesn’t need the publisher to reach that audience any longer, that audience is suddenly worth less.  The audience is everywhere, on thousands of sites.  It is no longer Publisher X’s Unique Audience, it is CookiePool123, it is a commodity.

Finally, from a technical perspective, data leakage can exact a huge cost on your site's user experience through page latency.  All those third-party ad pixels take time to execute, and in many cases won't work through an iFrame tag, meaning they must finish before the page content can continue to load.  At 20+ ms for each call, on top of the time it takes for your ads to load, it doesn't take much to make for a sluggish site from the user's perspective. Anyone will tell you that slow pages degrade almost every major site metric, not to mention have a significant impact on SEO rankings.  Chew on that for a bit!
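The arithmetic behind that latency claim is simple enough to sketch; the per-call cost and ad-load time below are illustrative assumptions, not measurements:

```python
def blocking_delay_ms(num_pixels, ms_per_call=20, ad_load_ms=300):
    """Total render-blocking time when third-party pixel calls execute
    sequentially on top of the ad load (all numbers illustrative)."""
    return ad_load_ms + num_pixels * ms_per_call

for n in (5, 15, 30):
    print(f"{n} pixels -> {blocking_delay_ms(n)} ms before content continues")
# 30 piggybacked pixels alone add 600 ms on top of the ad load.
```

Even under these conservative assumptions, a tag that quietly accumulates a few dozen piggybacked pixels turns into over half a second of extra blocking time.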

So what can a publisher do?  Read Next – Managing Data Leakage and find out.

Audience Analytics Lights the Data Leakage Fuse

As data collection started to take off on the advertiser side, companies like comScore and Nielsen were simultaneously trying to do more and more to build a story around demographic behaviors online, which is a huge challenge because of how inherently fragmented the internet is vs. traditional media.  The standard model for those traditional media measurement companies is to set up a small panel, reach statistical significance for a few major publishers, and extrapolate the results into official ratings.  The smaller the panel, the wider the margin of error, but typically they can get within a few percentage points with just a few hundred people.  Sounds great, but on the internet there aren't a handful of major networks with distribution to every home in America; there are a hundred million destinations with a wide range of viewerships, some quite minuscule, so sampling with a small group is impossible.  The result is that a panel approach doesn't work all that well, as evidenced by the typically huge discrepancy between the number of unique visitors per month a publisher reports and what comScore might tell you.

Then, in 2006, everything changed when a few really smart guys founded a visionary company called Quantcast.  Quantcast's history is critical to understand as part of the data leakage story because they were the first company to really get people thinking about the value of audience data and then make that audience data actionable.  While they didn't invent the mechanism to build an online audience, they were the first to figure out how to build a system that could algorithmically tie demographic information to a specific cookie, keep the data current, and operate that service at scale across hundreds of millions of cookies. They accomplished all this by directly measuring audiences instead of using a small panel of users.  This is a standard methodology today – every data exchange currently in market (BlueKai, AudienceScience, eXelate, and others) relies on having a redirect to their cookie sitting on thousands of sites to help them build a cookie pool they can profile – but no one was doing it in 2006.

Quantcast was able to pull off direct measurement of audiences on sites they didn’t own through a unique data-sharing proposition to publishers – put a pixel on your site, allow Quantcast to cookie your users and they would give you demographic analytics on your site audience – free.  Or rather, in exchange for surrendering your rights to any data Quantcast could collect.  Publishers big and small signed up in droves and in a short while, Quantcast was measuring tens of millions of people on tens of thousands of sites.  This mountain of data allowed them to do really sophisticated audience modeling and infer demographic and psychographic characteristics at a cookie level.  After building a unique audience profile on each cookie, they could aggregate that data for the unique cookies on any given publisher and report accurate demographic profiles for any publisher.  It was the reverse of traditional media measurement – publishers contributed thousands of data points on each user rather than users contributing data points on the publisher, but this solved the fragmentation problem.  By building confidence at the cookie level, Quantcast could simply re-purpose the data they had on a cookie for whatever group they saw on a tiny publisher.  You could have a site with thousands of visitors a month instead of tens of millions and still have the same extremely accurate demographic reporting.   It was pretty slick stuff and so effective that eventually it forced comScore and Nielsen to start doing direct measurement as well.

Suddenly, people saw the power of the cookie.

So now, you had a company with a truckload of audience data on a huge majority of the US internet population, down to the cookie level, and you had a ton of advertisers looking to get those exact same audience metrics on their pile of cookies.  What a great coincidence!  Now advertisers could use the same technology and instead of (or in addition to) dropping their own cookie on a user, they could drop Quantcast’s cookie on that user and then access the same sophisticated audience metrics that Quantcast had collected from publishers.

Impressive as this new frontier in media analytics was, it all came together when Quantcast figured out how to enable ad targeting on their cookie pools.  Through a simple tag integration with an advertiser and publisher account, Quantcast could pass a key-value into an ad tag and target an advertiser's ads against their cookie pool.  True, explicit audience targeting was born!  The best part was that advertisers didn't have to build that cookie pool on any specific publisher in order to target against it.  Advertisers could cookie the audience on a premium site and then target that audience on another, cheaper site.  In fact, you didn't have to build a cookie pool at all – Quantcast would simply sell you a cookie pool with your choice of demographic data if you wanted.  You could even get really fancy and take a small, directly cookied audience and scale it up exponentially with a statistical correlation against the Quantcast database, a technique Quantcast called look-alike modeling.  Basically, Quantcast was able to look at a small group of cookies, figure out what was similar about them, and then find other cookies not explicitly cookied by the advertiser to scale that audience into a much, much larger cookie pool.  Again, this is a standard offering with most DSPs and data exchanges today, but it was unheard of a few years ago.  Audiences were now portable, and true audience targeting was born.
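Look-alike modeling can be caricatured in a few lines: profile a small seed pool of cookies, then keep candidate cookies whose behavior is close to the seed average. The features, threshold, and similarity measure below are invented for illustration; production systems are far more sophisticated than a single cosine similarity:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def lookalikes(seed_profiles, candidates, threshold=0.9):
    """Average the seed cookies' feature vectors into a centroid,
    then keep candidates whose behavior is close enough to it."""
    n = len(seed_profiles)
    centroid = [sum(col) / n for col in zip(*seed_profiles)]
    return [cid for cid, vec in candidates.items()
            if cosine(vec, centroid) >= threshold]

# Features per cookie: [auto pages, finance pages, sports pages] (made up)
seed = [[8, 1, 0], [9, 2, 1]]            # small directly-cookied audience
pool = {"c1": [7, 1, 0],                 # behaves like the seed audience
        "c2": [0, 0, 9]}                 # does not
print(lookalikes(seed, pool))            # ['c1']
```

The key property is the one the post describes: the output pool can be orders of magnitude larger than the seed, because every cookie the modeler has ever profiled is a candidate.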

To be fair, it's not like Quantcast held a gun to publishers' heads; the publishers readily volunteered access to their data – gave it away even!  They got access to new analytics and pushed media measurement from a panel-based system to a data-driven model.  Small publishers in particular, which comScore and Nielsen wouldn't have bothered with, got a big helping hand from Quantcast's trusted, third-party metrics when trying to sell their sites.  And, if publishers wanted, Quantcast was more than happy to help them productize their inventory by demographic characteristics to sell directly to advertisers.  I'm not sure the digital publishing community ever really got on board with that concept, but the ad networks certainly ran with the idea and used it to differentiate and add value.  The point is that Quantcast isn't a bad actor in the industry – rather, they are an innovator, trusted by plenty of major media companies on the buy and sell side.  But they had a foundational role in the mechanism that potentially puts a great many publishers at risk of commoditizing their audience.  Publishers have to start paying attention to the potential risks.

Quantcast's innovations in the media measurement and data management space forever changed the value of data by making it actionable, and would soon spawn a number of competitors that sought to do the same thing.  Eventually, this data management space would collide with the ad network and ad exchange space and throw a bucket of gasoline on the whole issue of data leakage.

Read Next – The Cost of Data Leakage