By Gareth Jones - Chief Operations Officer, Headstart AI
Gareth will be speaking at People Analytics World 2018 this April in London - meet him there.
They say that with great power comes great responsibility, and nowhere does this seem to be panning out more than in the world of personal data. Unless you have been in prison or sunning yourself on a remote island recently, you can’t have missed the Cambridge Analytica ‘scandal’ that went nuclear in the last couple of weeks. Rarely do stories like this go from the broadsheets to the global news networks so quickly, and rarely do they have the power to wipe $75bn - yes, that’s billion - from a company’s market cap. Stories about personal data privacy seldom reach these levels of awareness, except perhaps those involving large data breaches or hacks, and even then, there are so many that they largely go unnoticed. But this one did. When it comes to politics and some dodgy goings-on at one of the biggest data companies in the world - Facebook - people tend to sit up and take notice.
Central to all the fuss is a process called ‘political microtargeting’, which essentially allows political campaigners to laser-target voters whose allegiance could be shifted from one party to another, and then use super-personalised marketing messages - through social media and other online channels - to influence them to change their voting behaviour. Cambridge Analytica have emerged as the front runners in exploiting this technique in recent years, and have been involved in many campaigns, most recently Trump and Brexit. The technique itself is not new, however: it was used in several Senate campaigns and in the Obama campaigns too. This strategy has been alive and well for a number of years.
Our ‘digital footprint’, including our social media presence, is the core source on which Cambridge Analytica builds its data models. Again, this is not new. Many of us have knowingly, or perhaps unwittingly, traded our social media data in exchange for access to apps or other online services. If you have ever completed a personality profile via Facebook, tried the app that shows you what you would look like at the age of 70, or the one that tells you which political figure you are most like, then you have exchanged some or all of your Facebook profile data in return for that hilarious picture of you as a pensioner that you then shared with all your friends. Who then go and do exactly the same. Rinse and repeat and hey presto, whoever created the ‘app’ now has a rich source of data. Whilst some might be uncomfortable with this approach, there is a value exchange there, whether we fully understand the implications or not, and of course, you agreed to provide access to your data when you signed up to the app.
The controversy comes from the fact that Cambridge Analytica appears to have been ‘harvesting’ people’s Facebook data without permission, effectively ‘scraping’ our personal data and using it to build their models. Whilst that controversy rolls on, what are the implications for the People Analytics industry? How does the technique work, and why does it matter?
Well, often it’s not the result itself that’s the problem, but the way in which it’s achieved. And in my view, that’s exactly the problem here. This subject is very close to my heart, as I have been researching this methodology for the last 7 years and using similar techniques in my work to assess potential in human beings. I first came across the potential of social ‘big data’ to provide insight into people when I discovered that a hedge fund in the UK had been using sentiment derived from Twitter data to drive investment decisions. This led me to the World Wellbeing Institute, who, with the help of Cambridge University, ran the world’s largest language and personality study. In short, your digital footprint says an awful lot about you and can be very accurate in predicting a wide range of things about you as an individual, including your personality. This, combined with other indicators, provides a very powerful data set which can be built into a model and used to influence others, be that in terms of voting behaviour or, more commonly, what brand of shampoo to buy.
Here’s a simplified version of how it works:
- Identify a group of people on social media and get them to take a standardised assessment that measures some or all of personality, behaviours and motivations.
- As part of that exercise, get access (with permission of course!) to their social media profile. In this case, Facebook.
- Data scientists and psychologists then put the two data sets together and look for patterns that correlate certain traits with other indicators. A simplified example: extraverts like Game of Thrones (derived from extraverts’ Facebook likes).
- Using this model, you can then scan other individuals and, assuming the correlations are strong enough, ‘predict’ that if someone else likes Game of Thrones, they are highly likely to be an extravert.
As I said, a very simplified version. In practice, companies like Cambridge Analytica build far more sophisticated models and smash together many other data sets, including shopping and credit data. Once you have the basic model, you can then extrapolate from it across wider populations. Cambridge Analytica claimed that their ‘psychographic profiles’, as they called their models, could help them predict the political persuasions of every single US citizen. And, ultimately, work out how to influence them.
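To make those steps concrete, here is a minimal sketch in Python using scikit-learn. Everything in it is assumed for illustration: the handful of pages, the like patterns and the extraversion labels are invented, and a plain logistic regression stands in for whatever models Cambridge Analytica actually built.

```python
# Toy illustration only: invented data, invented page names, and a simple
# logistic regression standing in for the real, far richer models.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Steps 1-2: a small consented sample. Each row is a person, each column a
# page they may have liked: [Game of Thrones, Knitting Weekly, Salsa Nights].
likes = np.array([
    [1, 0, 1],
    [1, 0, 1],
    [0, 1, 0],
    [1, 1, 1],
    [0, 1, 0],
    [0, 0, 0],
])
# Assessment results for the same six people: 1 = extravert, 0 = not.
extravert = np.array([1, 1, 0, 1, 0, 0])

# Step 3: learn which likes correlate with extraversion.
model = LogisticRegression().fit(likes, extravert)

# Step 4: score people who never took the assessment, from their likes alone.
new_people = np.array([
    [1, 0, 0],  # likes Game of Thrones only
    [0, 1, 0],  # likes Knitting Weekly only
])
print(model.predict_proba(new_people)[:, 1])  # estimated probability of extraversion
```

In reality the likes matrix would have thousands of columns and the model would be carefully validated, but the shape of the pipeline is the same: fit on a consented, assessed sample, then score everyone else from their digital footprint alone.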
As you can imagine, lots of scope there for misuse, and when you add in illegal sourcing of data, it’s no wonder that the world has suddenly sat up and taken notice.
Fundamentally, though, there is nothing wrong with this model. Using your digital footprint to identify certain things about you can be very powerful - for you. Used appropriately, it can put the power of data in your hands and shift the relationship we as individuals have with our data. It can give you insight into yourself that you can use in many powerful ways: understanding yourself better, getting a better or more appropriate job, identifying activities that may be harming your health, and ultimately, allowing others to understand you.
We are on the brink of a personal data revolution, and we are only really seeing the power in the exhaust from digital and social technology some 10 years after it became part of our everyday lives. Nowhere will this play a more important role than in People Analytics, as the HR function comes of age and starts to drive real value into the business by fully understanding people, just as our organisations have been doing with customers for decades. Unfortunately, where there’s a Yin, there’s a Yang. As we take one step forward exploring this potential with the best intentions, there are always others like Cambridge Analytica who abuse the privilege and take us two steps back.
I sincerely hope that the fall-out from the recent scandal drives more positive change in the world of data privacy and pushes more control into the hands of individuals. But I also hope that the backlash will not destroy the potential of these data models to do good, and to transform the landscape of People Analytics. Without access to data, there is no insight. Without insight, the HR function can add no real value.
If you are interested in exploring this topic more fully and want to engage in the conversation, then get along to PAWorld18, where I will be chairing the Disrupt stream during the two-day conference and leading a panel conversation on how People Analytics and HR can leverage more innovation through start-up technologies.