To get the full value out of a relationship with a data management platform (DMP), you want to provide the platform with as much data as possible. That said, the low-hanging fruit in any organization is to integrate 1st party data for which you already have a cookie into the DMP. The mechanism for accomplishing this is your standard cookie sync, which passes a user ID from one system to another via a query string appended to a pixel call, ideally followed by a server-to-server integration after that.
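As a minimal sketch of what that pixel call looks like, the sync is just an image request with the partner's user ID carried in the query string. The host, path, and parameter names below are hypothetical; every platform documents its own sync URL format:

```python
from urllib.parse import urlencode

def build_sync_pixel_url(dmp_host: str, partner: str, user_id: str) -> str:
    """Build a 1x1 pixel URL that passes a partner's user ID to the DMP.

    The endpoint and parameter names are illustrative, not a real DMP API.
    """
    params = urlencode({"partner": partner, "uid": user_id})
    return f"https://{dmp_host}/pixel/sync?{params}"

url = build_sync_pixel_url("dmp.example.com", "site_analytics", "abc123")
# -> https://dmp.example.com/pixel/sync?partner=site_analytics&uid=abc123
```

The key point is that only the ID travels on the pixel; the heavy data follows later over a server-to-server connection.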
Practically speaking, this means that when a user hits your site and calls your site analytics tag, either independently or through a container tag, that analytics tag redirects the user to the DMP and simultaneously passes along the site analytics user ID. When the DMP receives that call, it cookies the same user and records the site analytics user ID. Now the DMP knows how to associate data from the site analytics tool with its own cookie ID. The beauty of this system is that only the user IDs need to be synced at this point; the actual data the site analytics tool records can be passed to the DMP later, without slowing down the user experience on site. Now imagine replicating this process with all your 3rd party tools, syncing every system into the DMP.
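On the DMP side, the result of each sync is simply a mapping from the DMP's own cookie ID to each partner's user ID. A toy, in-memory version of that mapping table (a real DMP persists this server-side; the structure here is purely illustrative) might look like this:

```python
from collections import defaultdict

class DmpIdMap:
    """Toy stand-in for the DMP's ID-mapping table."""

    def __init__(self):
        # dmp_cookie_id -> {partner_name: partner_user_id}
        self._map = defaultdict(dict)

    def record_sync(self, dmp_cookie_id, partner, partner_uid):
        """Called when the sync pixel fires: associate the partner's
        user ID with the DMP's own cookie ID."""
        self._map[dmp_cookie_id][partner] = partner_uid

    def partner_id(self, dmp_cookie_id, partner):
        """Later, bulk data pulled from that partner can be joined
        back to the DMP cookie via this lookup."""
        return self._map[dmp_cookie_id].get(partner)

ids = DmpIdMap()
ids.record_sync("dmp-42", "site_analytics", "ga-abc123")
```

Once this table exists, the analytics tool's data can arrive hours later in bulk and still land on the right DMP cookie.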
In the example below, a client is integrating their analytics system, order management system, and email system with a data management platform in order to sell some of their data on a data exchange and remarket on the exchange through a DSP. For the sake of clarity, this particular setup shows how the process works when the DMP client uses a container tag solution to control which tags serve on the page. Using a container tag or other tag management solution is considered a best practice: among other benefits, it eases implementation of new tags and controls tag frequency.
The End-to-End Server-to-Server Integration Process
When the user (the smiley face) lands on the client's site, the browser requests the page HTML from the content server (1), which responds with the page content as well as the data management platform’s container tag (2). While the rest of the page is loading, the container tag forces the browser to call the DMP (3), which responds with its own cookie for the user plus a batch of redirects to each of the integration points (4). Those redirects are essentially pixel requests that let each system drop its own cookie on the user. The user makes those requests in parallel (5) and receives each system's cookie, along with a callback to the DMP (6). Each callback carries that system's unique ID for the user in the URL (ideally, encrypted), which the DMP receives and stores (7). The DMP now has its own cookie ID (from step 4) synced with each 3rd party system.
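Steps 3 through 7 can be sketched as two small pieces of DMP-side logic: emitting the batch of redirects, and parsing the callbacks that come back. All hostnames and parameter names below are hypothetical, and a real implementation would encrypt the IDs in transit:

```python
from urllib.parse import urlencode, urlparse, parse_qs

# The three integration points from the example (steps 3-4).
PARTNERS = ["analytics", "order_mgmt", "email"]

def sync_redirects(dmp_cookie_id):
    """Step 4: the DMP answers the container-tag call with one pixel
    redirect per partner, carrying its own cookie ID."""
    return [
        f"https://{p}.example.com/sync?" + urlencode({"dmp_id": dmp_cookie_id})
        for p in PARTNERS
    ]

def parse_callback(url):
    """Steps 6-7: each partner calls back with its own user ID for this
    user in the query string; the DMP parses and stores the pair."""
    qs = parse_qs(urlparse(url).query)
    return qs["dmp_id"][0], qs["partner"][0], qs["uid"][0]

redirects = sync_redirects("dmp-42")
triple = parse_callback(
    "https://dmp.example.com/cb?dmp_id=dmp-42&partner=analytics&uid=ga-1"
)
```

Because the callbacks carry only IDs, the whole exchange stays lightweight from the browser's point of view.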
Now the data management platform can pull data from each system (8) and populate that data against its own cookie ID. The client can do audience profiling and segmentation in the DMP, build the right cookie pool, and move that pool to other systems on demand, using server-to-server API connections.
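The segmentation step amounts to a join: partner data keyed by the partner's user ID is mapped onto DMP cookie IDs through the sync table, and a segment is just the set of cookie IDs that match some criteria. A sketch with made-up data and field names:

```python
# Hypothetical bulk export pulled server-to-server (step 8),
# keyed by the analytics system's own user IDs.
analytics_data = {
    "ga-1": {"pages_viewed": 12},
    "ga-2": {"pages_viewed": 1},
}

# The sync table built during the cookie sync: DMP cookie -> partner IDs.
id_map = {
    "dmp-A": {"analytics": "ga-1"},
    "dmp-B": {"analytics": "ga-2"},
}

def build_segment(min_pages):
    """Join the analytics data onto DMP cookie IDs, then segment:
    here, 'engaged visitors' with at least min_pages page views."""
    segment = []
    for dmp_id, partners in id_map.items():
        row = analytics_data.get(partners.get("analytics"), {})
        if row.get("pages_viewed", 0) >= min_pages:
            segment.append(dmp_id)
    return segment

engaged = build_segment(10)  # -> ["dmp-A"]
```

The resulting cookie pool (a list of DMP cookie IDs) is what gets pushed server-to-server to a DSP or exchange.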
Server-to-server integrations are important because they address the issue of data loss between systems. The alternative is a pixel-to-pixel sync, which requires one system to pass all relevant data in the redirect to the other, usually as a key-value string appended to the redirect URL. That added data makes the call heavier, slowing down both the user experience and the syncing process, and it permanently caps the amount of data you can ever pass between systems. Slower always means discrepancies, and discrepancies mean data loss. In fact, many people in the industry quote up to 30% loss between systems with a pixel-to-pixel integration, which is just enormous. Imagine reducing your data’s scale by nearly a third just to move it from one system to another! In short, if a data management platform can’t facilitate server-to-server integrations, it probably isn’t worthy of the label.
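The contrast is easy to see side by side. With a hypothetical user profile, the pixel-to-pixel redirect has to carry every attribute in the URL, while the server-to-server style carries only the ID; and the often-quoted 30% loss figure means a pixel-synced audience shrinks by nearly a third (all field names and numbers below are illustrative):

```python
from urllib.parse import urlencode

profile = {
    "uid": "abc123",
    "age_band": "25-34",
    "recency_days": "3",
    "categories": "shoes|outerwear|accessories",
    "ltv_bucket": "high",
}

# Pixel-to-pixel: every attribute rides on the redirect URL.
pixel_url = "https://partner.example.com/px?" + urlencode(profile)

# Server-to-server: the redirect carries only the ID;
# the data itself moves later in bulk, off the user's browser.
sync_url = "https://partner.example.com/px?" + urlencode({"uid": profile["uid"]})

# The often-quoted figure: up to 30% of records lost in pixel-to-pixel syncs.
users = 1_000_000
surviving = users * (100 - 30) // 100  # 700,000 usable records
```

Every extra key-value pair makes the pixel URL longer and the call slower, which is exactly where the discrepancies creep in.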
Read More: Syncing Offline Data to Your DMP