
Google's Causal Impact: Part 1 - Intro

3 MINUTE READ | April 4, 2016



Preston Smith


Incrementality. It’s one of the toughest things to tackle in digital analytics. “How many incremental conversions did that branding campaign drive?” “How many additional clicks did we drive by implementing that new AdWords feature?”

The most common solution is a simple lift analysis: calculate the difference between average conversions before and after the intervention. This method falls short because it assumes every difference in performance between the two periods is due strictly to the intervention. Seasonality alone can throw a wrench into that assumption. Even analysts who account for seasonality by comparing a forecast built from the pre-period data against the actual post-period data can fail to account for other business factors, such as unforeseen press coverage or promotions.
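To make the naive approach concrete, here is a minimal sketch in R with made-up daily conversion numbers (the variable names and values are illustrative, not from the article):

```r
# Naive lift analysis: compare average daily conversions
# before and after the intervention. Numbers are dummy data.
pre  <- c(100, 104, 98, 101, 97)    # daily conversions, pre-period
post <- c(115, 120, 118, 117, 119)  # daily conversions, post-period

lift <- mean(post) - mean(pre)  # attributes ALL of the change to the intervention
lift
```

Everything that shifted between the two periods, including seasonality and promotions, lands in that single `lift` number, which is exactly the weakness described above.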

Google set out to help digital marketers with this problem. The result was the Causal Impact R package. While the inner workings are extremely complex, the overall idea is simple. You submit your response variable (conversions, clicks, etc.) over time for a test group and a control group. (Note that this requires a test group that was exposed to the intervention and a control group that wasn’t.) Causal Impact uses the control group to forecast where the test group would have ended up had the intervention not taken place. (This forecast is more reliable because the control group was exposed to the same unforeseen business factors mentioned earlier.) It then calculates the cumulative difference between the forecasted and actual test-group values to determine the incremental lift of the intervention.

To demonstrate, I’ll run through the simple example that Google put together (and that every blog post on the package uses): generate random dummy data, then introduce a substantial intervention at some point.
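Assuming the CausalImpact package is installed, that demo looks roughly like the example from the package documentation: simulate a control series, build a test series that tracks it, and inject a lift into the post-period.

```r
library(CausalImpact)

set.seed(1)
x1 <- 100 + arima.sim(model = list(ar = 0.999), n = 100)  # control series
y  <- 1.2 * x1 + rnorm(100)                               # test series tracks the control
y[71:100] <- y[71:100] + 10                               # inject a lift after the intervention

data <- cbind(y, x1)
pre.period  <- c(1, 70)    # before the intervention
post.period <- c(71, 100)  # after the intervention

impact <- CausalImpact(data, pre.period, post.period)
plot(impact)
```

The pre-period teaches the model how `y` relates to `x1`; the post-period forecast is what `y` would have looked like without the injected lift.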

Default Causal Impact example. Shows actuals against forecast, pointwise difference, and cumulative difference.

The default plot Causal Impact produces shows the actual test values against the forecast, the pointwise difference between the two, and finally the cumulative difference, or lift. It does a great job of visualizing what is happening in the analysis.
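Beyond the plot, the package can also summarize the estimated effect numerically; `summary()` prints the absolute and relative lift with credible intervals, and the `"report"` option generates a plain-English writeup of the result.

```r
# Numeric summary of the estimated causal effect (continuing the
# `impact` object fitted above)
summary(impact)

# Plain-language description of the analysis and its conclusion
summary(impact, "report")
```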


Causal Impact is extremely useful when trying to demonstrate incrementality. The logic behind the methodology is sound, and the intuitive visualization gives analysts an easy way to explain and demonstrate the results. We have already used it for a couple of analyses and have built a Shiny app around the package so our marketers can run this analysis whenever they want. Unfortunately, nothing is perfect. In part 2 we will look at one of the issues we have seen when using this tool in the real world.

