TD Magazine Article
Predictive talent analytics tells us whom to hire and how to manage them.
Tue Jul 08 2014
You may not keep up with trade publications as often as you should, but if you caught the film Moneyball in 2011, you learned all about predictive talent analytics, the latest hot topic in the human capital industry.
Moneyball is the true story of how the general manager of the Oakland A's baseball team used predictive models to assess players' potential performance. Basing his acquisition decisions on this data, General Manager Billy Beane put together a team that then pulled off the longest winning streak in American League history.
HR didn't take long to make the connection: If baseball teams can use player statistics to predict performance, thereby gaining a huge competitive advantage, why can't companies do the same with their employees?
Organizations have long been accustomed to collecting certain human capital-related data. Trainers track the number of participants in their courses, the percentage of learners who passed the exam, and the level of learner satisfaction with the course. HR professionals measure retention, time to hire, and cost per hire. Senior leaders review the results of employee engagement surveys. These are all known as "descriptive" or "historic" metrics—they describe what has already happened or what is currently happening within an organization.
But predictive metrics do descriptive metrics one better: They describe future outcomes, serving as powerful decision-making tools. For example, a company tracks its annual retention rates and sees that retention has been decreasing during the past few years. Now, here's the point of transformation: Further analysis reveals that the turnover is mostly occurring in a department that requires long working hours. The company decides to reduce workload for these positions. It then observes an increase in retention for the department, which positively affects overall turnover.
Historic and predictive analyses are two sides of the same coin, says Alec Levenson, senior research scientist at the Center for Effective Organizations at the University of Southern California. "Conducting multivariate analysis on existing data contains both historical and predictive parts. It uses historical data to predict the future."
In other words, "A metric tells you that you have a problem," says John Sullivan, a leading HR strategist based in Silicon Valley. "A predictive metric tells you what to do about it."
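To make the distinction concrete, here is a minimal, hypothetical sketch (in Python, using scikit-learn) of how a descriptive retention metric can be extended into a predictive one. The dataset and column choices are illustrative assumptions, not drawn from any company discussed in this article.

```python
# A minimal, hypothetical sketch of turning a descriptive retention metric
# into a predictive one. Historical records hold average weekly hours,
# tenure, and whether the employee eventually left (1) or stayed (0).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical data (illustrative values only): [avg_weekly_hours, tenure_years]
X_history = np.array([
    [60, 1.0], [58, 0.5], [62, 2.0], [55, 1.5],   # long-hours department
    [40, 3.0], [42, 4.5], [38, 2.5], [41, 6.0],   # standard-hours departments
])
left_company = np.array([1, 1, 1, 0, 0, 0, 0, 0])

model = LogisticRegression().fit(X_history, left_company)

# Descriptive metric: what has already happened
print("Historical turnover rate:", left_company.mean())

# Predictive metric: estimated attrition risk for current employees
X_current = np.array([[59, 1.0], [40, 5.0]])
print("Estimated attrition risk:", model.predict_proba(X_current)[:, 1].round(2))
```

The first number only reports what already happened; the model's estimates point to which employees, and which working conditions, deserve attention now.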
As predictive talent analytics grows, consultancies have been cropping up to assist organizations that are struggling with their data. One of these is the Center for Talent Reporting, a nonprofit organization created in 2012 by KnowledgeAdvisors, a for-profit provider of analytics tools and consulting services.
The Center for Talent Reporting has developed standards, called the Talent Development Reporting Principles (TDRp), for defining and reporting on a variety of human capital processes. To its paid members, the organization offers access to a library of more than 500 measures relating to talent acquisition, learning and development, capability management, leadership development, performance management, and total rewards.
PwC Saratoga is a similar initiative from PricewaterhouseCoopers that has established a metrics and benchmarking database, which contains "hundreds of global standards for metric formulas and data elements."
The Society for Human Resource Management, not about to be left behind in its own field, embarked on a similar attempt to create a set of standard HR metrics. However, SHRM's proposal, which would make data from leading companies publicly available for benchmarking purposes, was shot down in late 2012 by the HR Policy Association, a lobbying organization that represents 335 of the nation's largest corporations. According to an article on Workforce.com, the HR Policy Association claimed that the metrics would overburden employers and expose proprietary information to competitors.
Proprietary data is one reason that the movement to standardize human capital metrics has had a lukewarm reception among some industry professionals. Although most organizations are eager to benchmark their results against their competitors', many are hesitant to reveal their unique formulas for success—especially when it comes to their talent.
Google, for example, is constantly lauded for its use of predictive analytics to inform hiring and management decisions. The company has developed algorithms to predict performance at all stages of the employee life cycle, including what makes a great manager and what makes an employee likely to leave, but Google keeps these algorithms close to its chest.
"It's like proprietary metrics in baseball," explains Sullivan. "Each team knows how their players' statistics impact their success, so they don't share that data with their competitors."
Other industry experts defend the value of standard human capital metrics and reporting processes. Kevin Oakes, CEO of the Institute for Corporate Productivity (i4cp) and a board member of the Center for Talent Reporting, concedes that while an organization will always have metrics that are specific to it, "TDRp does a great job of providing a foundation for human capital analytics and helps answer commonly asked questions" about what data should be collected and how these measures should be defined.
"TDRp merely provides a common vocabulary," says Oakes, "along with common statements, reports, and processes, which allow human capital professionals to speak the same language and improve their ability to have an impact."
As organizations prepare to dive into talent analytics, a jumping-off point may not be such a bad thing. A comprehensive library of measures and guidance on how to report on data findings is probably very welcome to many organizations. But experts like Levenson and Sullivan warn that organizations must learn how to interpret and act upon the data they are collecting—so instead of merely generating "endless reports that no one reads," they are identifying the most important metrics and turning them into powerful decision-making tools.
"The data needed for improving human capital decision making often does not lend itself well to standardized reporting," Levenson adds. "Less attention should be paid to uniform metrics across all employees or organizations, and much greater attention should be paid to the metrics that are most important for a given organization, business unit, function, process, or role."
In an article for The Atlantic titled "They're Watching You at Work," deputy editor Don Peck examined some of the most future-forward uses of predictive talent analytics. For example, Knack, a Silicon Valley start-up, develops video games that can predict an employee's potential as a leader or innovator. The games, designed by a team of neuroscientists, psychologists, and data scientists, generate several megabytes of data from a player's performance that give hiring managers insights into that individual's level of creativity, persistence, capacity to learn from mistakes, ability to prioritize, and social intelligence.
Gild is a company that uses analytics to identify promising software engineers. It scours the web for breadcrumbs of data that potential employees leave in their trail, including software engineers' coding activity on open-source platforms, participation on major social networks and other Internet forums, and data from past projects.
At the Massachusetts Institute of Technology, researchers have developed electronic "badges" that employees wear throughout the day that track information about their interactions with co-workers—the frequency and length of their conversations, their tones and gestures, how much they talk and listen, and how often they interrupt. The data are then analyzed to identify factors common to successful teams and effective leaders.
Peck viewed these technologies "not only as a boon to a business's productivity and overall health, but also as an important new tool that individual employees can use for self-improvement."
Other organizations have taken their data into their own hands. Farmers Insurance Group brings together multiple internal and external data sources in one proprietary online tool, where the data is used to build personalized development plans for each employee. PepsiCo redesigned its sales jobs and significantly increased employee pay after one analysis, which led to increased retention and greater capacity utilization. PricewaterhouseCoopers reduced its turnover through a variety of initiatives after an analysis helped prioritize which levers would have the biggest impact.
If organizations know that predictive talent analytics is a promising endeavor, then why, according to research from i4cp, have fewer than 7 percent of organizations systematically undertaken it?
Many experts agree that there are simply too many metrics from which to choose. "Too often the cart is put before the horse," says Levenson. "Human capital metrics are selected to populate scorecards without validating that they are the measures that matter most for decision making and improving organizational effectiveness.
"Rather than choose metrics simply for the sake of having them, more analysis is needed at the front end to set up and test causal models of organizational performance," Levenson adds. "After you test a causal model and find the factors that matter most for improving performance, the metrics that you need to monitor and manage toward will be easy to identify."
These customized metrics, however, require analytical skills to build—skills that many organizations don't have. Although best-in-class organizations have dedicated analytics teams that mine data and design algorithms as part of their daily work, many HR and training functions struggle with basic data collection and interpretation.
"Our most recent research shows that HR is the department with the lowest analytical ability relative to all other departments within an organization," Oakes says. However, i4cp noted a trend in many HR functions of partnering internally with departments such as finance or marketing, which have more statistical expertise.
Data silos are a common problem within organizations, severely limiting the data's impact. An organization that breaks down those silos to share data between functions will establish, if not stunningly accurate predictive models, at least more effective cross-functional communication.
In a December 2013 T+D article, Elliott Masie, head of the organizational effectiveness think tank The Masie Center, wrote: "Imagine correlating performance reviews with learning activities and hiring data, either for thousands of employees or drilled down to a single worker." He continued, "The relationship between selection, training, and competency is very interesting. For example, we often evaluate the impact of a leadership program with the assumption that we did great things in the program. In reality ... much of it has to do with how well we select the program participants from our pool, and how well we select people to join the organization."
Finally, another challenge inherent in predictive talent analytics is that many professionals—HR practitioners, senior leaders, and employees alike—consider it an ethically loaded practice, rife with privacy, security, and transparency issues. Organizations must tread carefully in this new field because collecting and using certain kinds of data could lead to discrimination lawsuits. There is also the potential emotional impact on employees themselves of constantly being under the microscope.
In The Atlantic, Peck concluded that, ultimately, these new tools to help get people into the right jobs and, once there, to help them succeed, "strike me as developments that are likely to make people happier."
Undoubtedly, predictive talent analytics is a step forward for our society, which always has relied on human intuition to make decisions about people's livelihoods. This has allowed hidden biases to infiltrate hiring and management decisions—even today, when organizations are legally bound to inclusive policies.
But most experts refuse to toss aside human judgment in the workplace entirely. "Nothing in the science of prediction and selection," wrote Peter Cappelli, a professor of management at the Wharton School, "beats observing actual performance in an equivalent role."
"No matter how much data we collect," adds Levenson, "there will always be unexplained gaps that can only be resolved through direct observation and interaction—by talking to the people, their managers, and their work groups. The best diagnoses always use a combination of 'hard' data and qualitative information, gathered through direct observation and stakeholder interviews."