At 10:00 GMT on 7 November, Dr. Andy Charlwood (University of Leeds) and Nigel Dias (3n Strategy) will host a Tucana People Analytics webinar. They will discuss the background and some preliminary findings of the HR Analytics ThinkTank report, due for release in January, which aims to identify, explain and share how organisations are successfully (or less successfully) adopting People Analytics as a sustained business practice.
The report is due to be published in January, and we are looking for People Analytics practitioners (of all levels of experience) to take part in the research. Please click here if you are interested.
If the aim of People Analytics is to improve the way organisations make decisions about the workforce, then the aim of our research is to improve the way organisations make decisions about People Analytics. As the field continues to grow - as more organisations start their journey and others lead the way - why are some successful and others not? What are the common obstacles blocking functions from making an impact? Are there different types of People Analytics function - those rooted in reporting versus those rooted in data science - and what do their journeys to People Analytics success look like? Can we analyse which decisions, made at which points, increase the likelihood of having a meaningful and sustained impact on the way people decisions are made?
Using a combination of quantitative and qualitative analysis, these are just a few of the higher-level questions and hypotheses we are investigating - some will be discussed in the webinar, and some will be explored during later phases of the research in 2018. We are interviewing as many People Analytics leaders as possible (at the time of writing, participation is still open - please do take part), and we will also analyse the data points associated with over 100 case studies capturing the journeys of People Analytics functions since 2015.
What are some of the topics we could explore in the ThinkTank - and would you like to help us answer them?
Hypothesis H1. Few leaders understand what predictive/prescriptive analytics means for HR (Source: People Analytics World 2017)
At People Analytics World 2017 in April (the sister event of People Analytics Forum) we asked attendees to classify their functions' capability as either Descriptive ('What happened?'), Diagnostic ('Why did it happen?'), Predictive ('What will happen?') or Prescriptive ('What should happen?')*. Whilst the exercise was limited to conference attendees, it revealed just how few people were comfortable with what predictive and prescriptive analytics mean for HR, and confirmed that much of the industry believes it is still descriptive or diagnostic. Through the research, we hope to explore and explain what these terms should mean, with action plans for how to achieve them.
Hypothesis H2. It is possible to explain the differences between different types of analytics function
We next asked these respondents to assess their functions' strengths and weaknesses using a tool we created specifically for our research, resulting in the diagram below (more details on the methodology can be found here). Whilst the sample sizes were small (self-categorised predictive and prescriptive functions are rarer to find), the differences across all six strategic development areas for people analytics were consistent with where attendees said they were. The research will look at what these differences are, and what is required - in terms of your stakeholders' understanding, objectives, capabilities, integration with the business, technology and more - to achieve advanced growth.
Hypothesis H3. It is possible to plan how long a function takes to grow (Source: 3n Strategy Research, 2016)
There are very few people analytics leaders who are not building a business case for resources to start or grow their function. Producing a business case requires the ability to articulate the value the function will create, the types of investment required (across the six strategic development areas), the costs, and the time frames. In 2016 we were able to indicate how long it might take to develop a function (see below), enabling analytics leaders to articulate how and when they need to make different types of investment. How will this have changed in 2017?
Hypothesis H4. There are certain (replicable) actions that all effective people analytics functions take
Both our quantitative and qualitative approaches are quite detailed, so that we do not overlook the relevant People Analytics experiences of our research participants, can make inferences for the upcoming report, and have a direction for the next steps. Last year we were able to identify key actions around stakeholder education and relationships, technology, capabilities and more - this year we expect to build on this.
Hypothesis H5. Some areas of HR are easier to analyse than others
Are some areas of HR, such as skills and competencies, harder to work with than others? When we looked at this last year, it rang truer than we anticipated - it appeared that only functions with higher overall maturity (represented by the grey line) were able to work with competency, productivity and succession data. Given that some areas of HR appear easier to analyse than others, does this mean some should be avoided when beginning the People Analytics journey? And for more advanced organisations, are there better ways to prepare for these types of investigation?
If you are a People Analytics leader and you are interested in taking part in our research, please let us know. Participants will each receive a free copy of the report, with a few extra bonuses. More information is available at www.hranalyticsthinktank.com, or you can email me at ndias@3nstrategy.
Dr. Charlwood and I will both be at the People Analytics Forum on 29/30 November.
*We did not come up with these four stages or the associated questions, and I am unable to find the original article suggesting them. The reference will be added if located.