
Overcoming the Challenges of Attribution Model Progress


The industry largely agrees on the weaknesses of last-click attribution, yet it remains in common use. Below, Rafael Garcia-Navarro, head of analytics at Experian Marketing Services, discusses overcoming the challenges of finding out which marketing activities do and don't work.

While the rest of the digital arena has advanced at a rapid pace over recent years, attribution is one area that has stayed relatively static. Despite all the industry hype around attribution, and the inadequacy of the models currently in use, progress has been slow: we are no nearer to a scientific approach that uses advanced statistical techniques to identify the actual contribution of each digital activity.

With the complexity of the customer journey increasing, a simplistic view of attribution no longer serves. The evolution lies in using science to measure more accurately not only the influence of each channel, but also how channels are interconnected.

The most commonly used attribution models are either click- or rules-based: the former include first click and last click, the latter even distribution, time decay, and positional. All these methods are subjective and have significant flaws; their main attraction is ease of implementation.
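
These click- and rules-based models are simple enough to sketch in a few lines of Python. The channel names and the time-decay half-life below are illustrative assumptions, not drawn from any particular platform:

```python
def last_click(path):
    # All credit to the final touchpoint before conversion.
    return {path[-1]: 1.0}

def first_click(path):
    # All credit to the touchpoint that started the journey.
    return {path[0]: 1.0}

def even_distribution(path):
    # Equal credit to every touchpoint.
    credit = {}
    for channel in path:
        credit[channel] = credit.get(channel, 0.0) + 1.0 / len(path)
    return credit

def time_decay(path, half_life=2.0):
    # Credit doubles every `half_life` positions closer to conversion.
    weights = [2 ** (i / half_life) for i in range(len(path))]
    total = sum(weights)
    credit = {}
    for channel, weight in zip(path, weights):
        credit[channel] = credit.get(channel, 0.0) + weight / total
    return credit

journey = ["display", "social", "paid search"]
```

Each rule simply reallocates the same single conversion by position; none of them measures whether a touchpoint actually influenced the outcome, which is the subjectivity referred to above.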

Existing model limitations

If we look into the evolution of digital marketing since its inception in the late 90s – with the development of real-time bidding (RTB), mobile, and others – and compare it with the stagnation of the platforms and methodologies used to measure them, we can see a clear disconnect emerging.

Clicks are not fully representative of a channel's value, and refusing to acknowledge this perpetuates a deception in the form of inadequate measurement.

Buying ads on a fraudulent site, for example, usually comes very cheap (impressions at 1-5p CPM) and delivers good click-through rates (CTRs above 1%).

So, if you have fraud in your advertising mix, what you see as an advertiser is a good number of clicks for a small amount of money. Remove those fraudulent clicks and the marketing key performance indicators (KPIs) will often worsen: the remaining, legitimate traffic looks as though it is being bought on more expensive sites with lower click-through rates.
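
A worked example makes the effect concrete. The figures below are hypothetical: one fraudulent placement at 3p CPM with a 1.5% CTR, alongside one legitimate placement at 150p CPM with a 0.2% CTR:

```python
# Hypothetical media buy: one fraudulent and one legitimate placement.
fraud = {"impressions": 1_000_000, "cpm_pence": 3, "ctr": 0.015}
legit = {"impressions": 1_000_000, "cpm_pence": 150, "ctr": 0.002}

def clicks(placement):
    return placement["impressions"] * placement["ctr"]

def cost_pence(placement):
    # CPM is the cost per thousand impressions.
    return placement["impressions"] / 1000 * placement["cpm_pence"]

# Cost per click with fraud included vs. fraud removed.
blended_cpc = (cost_pence(fraud) + cost_pence(legit)) / (clicks(fraud) + clicks(legit))
clean_cpc = cost_pence(legit) / clicks(legit)
```

On these numbers the blended cost per click is 9p; strip out the fraud and it jumps to 75p. The "clean" campaign looks far worse on click KPIs despite being the only part of the buy reaching real users.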

This is why in display advertising it is very common to see 75-80% of conversions driven by view-through as opposed to click-through. A click-centric measurement model distorts actual performance, and so prevents businesses from capitalising on the full potential of their marketing investment.

Overcoming challenges

A variety of factors contribute to the current crudeness of digital marketing attribution measurement. The first is a shortage of the technical skills needed to employ big data technologies: processing and transforming extreme volumes of hyper-structured data (web log files) into a form that enables advanced statistical analysis.

It’s worth clarifying the term hyper-structured – weblogs are sometimes, wrongly in my opinion, referred to as unstructured data. The reality is that once you look into how the data is generated by digital platforms, it becomes apparent that the structure is there within the data itself, and it can be derived through string parsing. It’s not structure in the traditional BI sense of columns and rows, but structure it is nonetheless.
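
As a minimal sketch of what deriving that structure through string parsing looks like, the Python below pulls positional fields out of a combined-log-style line. The log line itself is invented for illustration:

```python
import re

# Fields in a combined-style web log are positional, so the structure
# can be recovered with a single pattern.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('203.0.113.7 - - [10/Oct/2014:13:55:36 +0000] '
        '"GET /landing?utm_source=display HTTP/1.1" 200 2326 '
        '"http://example.com/ad" "Mozilla/5.0"')

record = LOG_PATTERN.match(line).groupdict()
```

The result is a dictionary of named fields – timestamp, requested path (with its campaign parameters), status, referrer – exactly the row-like structure the "unstructured" label denies.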

As a result, the technical skills required to manage this complexity are not as widely available as the traditional analytical skills that serve the widely adopted, but inapt, current measurement methodologies (e.g. last click, first click).

The second factor is the difficulty of striking the right balance between business acumen and a technical understanding of how digital channels and tracking platforms form part of the same ecosystem. The ability to connect the business dimension with the actual implementation of marketing objectives in digital channels and their tracking platforms is a crucial requirement for advanced algorithmic attribution.

The third, and final, key factor is a paucity of data science talent able to establish the link between the business objective – understanding the true contribution of marketing activity, taking into account the interrelationships across channels – and the statistical methodologies required to get to the desired outcome.

This is a challenge that will persist for the time being, and organisations whose core competence is data and analytics will be better placed to attract and retain such elusive talent. Experian Marketing Services has invested heavily over the last eight months to address the areas above.

Moving to multi-touch attribution

Statistical modelling is the place to begin the further development of attribution. The modelling process produces two key outputs: coefficients that quantify, at an aggregate level, the contribution of each channel; and user-level probabilities (user scoring) for a desired outcome – e.g. the propensity to convert of users who have not yet done so.

Scoring those non-converters is a very effective way to bring predictive modelling into the retargeting world, while the modelling process itself statistically identifies the actual contribution of the different channels. The ultimate benefit of multi-touch attribution models is the ability to work out the actual ROI of digital investments, and to optimise the channel mix accordingly.
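
One common way to produce both of those outputs from a single model is a logistic regression over channel exposures. The hand-rolled sketch below uses toy, made-up data – three channels, six user journeys – purely to show how one fit yields both aggregate coefficients and per-user conversion probabilities; it is not the method described above, just one plausible instance of it:

```python
import math

# Toy data: each row flags exposure to [display, search, email];
# labels mark whether the user converted. Entirely illustrative.
X = [[1, 0, 0], [1, 1, 0], [0, 1, 0], [0, 1, 1], [1, 0, 1], [0, 0, 1]]
y = [0, 1, 1, 1, 0, 0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit(X, y, lr=0.5, epochs=2000):
    # Plain stochastic gradient descent on the logistic loss.
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

w, b = fit(X, y)

# Output 1: w[j] is channel j's aggregate contribution (in log-odds).
# Output 2: the same model scores an individual user's propensity.
search_only_score = sigmoid(sum(wj * xj for wj, xj in zip(w, [0, 1, 0])) + b)
```

In this toy data search exposure perfectly separates converters from non-converters, so the fitted coefficient for search dominates and a search-exposed non-converter scores a high conversion propensity – the kind of score a retargeting campaign would act on.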

Organisations should always aim to have robust measurement frameworks in place to make the most effective business decisions, and attribution is no different in this respect. Any paradigm that allows brands to reduce uncertainty in decision-making should be pursued, and statistical modelling for algorithmic attribution certainly falls into this ‘uncertainty reduction’ category.