
TD Magazine Article

New Learning Analytics for a New Workplace


February 2, 2012

We need to rethink learning analytics with a focus on value as opposed to learning as a key benchmark.

Learning and development organizations historically have borrowed their models for measuring learning from an increasingly archaic education system. The learning profession today is no different--it continues to focus on metrics that provide a binary assessment of learning in the form of pass/fail, complete/incomplete, and started/in progress.


Today our analytics are becoming irrelevant and misleading as learning becomes more fluid. The traditional "push" model derived from a regulatory, compliance-driven industry is giving way to a learner-centric "pull" world in which mere training completion has little meaning. We need to rethink learning analytics with a focus on value as opposed to learning as a key benchmark. Our analytics must be aligned with the business's metrics, and we must demonstrate value through the synthesis of a variety of business systems.

What do our analytics aspire to measure?

We typically gather three types of measurements in today's learning environment: learning, satisfaction, and impact. The tools used to measure learning are limited to some form of binary assessment of student performance on tests designed against a set of learning outcomes. The resulting analytics show how individuals and groups of learners have scored. Other metrics include the amount of time spent on a course, the number of attempts taken on a test, the kinds of modules accessed, and a host of other peripheral data that inform the one all-important statistic: having learned versus not having learned.

The second type of data learning professionals tend to collect is a measure of learner satisfaction, typically obtained via a smile sheet. Often this tool measures the quality of the course design more than whether the course produced the desired outcomes. It also is the basis on which we judge whether a course has been valuable for learners.

Once we have our base metrics of student performance, we then aspire to measure whether a specific initiative had the desired result on the business for which the program was designed (as suggested by the upper levels of the Kirkpatrick model). It is fair to say that most organizations do not even try to create the tools necessary for this final type of measurement, arguing correctly that far too many factors influence a business, many of them too difficult to isolate for objective measurement. Yet "impact on the business" may well be the best indicator that what we want from the training has taken place.


So what's broken?

In a blog post written last year titled "Fundamental Design of Learning Activities," Aaron Silvers provides a vision for learning activities based on the notion of experiential design. At the heart of his post is the idea that a learning activity doesn't create a universal experience for everybody, nor can a designer predict how people will experience the design. Two employees' experiences of the same learning activity may be different, and so the resulting learning is based on the individual.

Consider the notion that learning never happens in the moment of the experience itself; it happens only after the experience, in an "aha" moment. We never experience (see, hear, touch, taste, or feel) the learning itself, nor do we feel the change as it occurs. We simply find that at some point after our experience, we have changed.

Learning 2.0 practitioners have been arguing for some time that the metrics previously used for formal learning are insufficient for capturing any data from informal initiatives such as online chats. Some would argue (and I used to be one of them) that there is no relevant data to suggest that informal learning has any effect on a business, and they would be right. But that conclusion doesn't mean there is no value, only that the instruments we use to measure how training affects an organization are insufficient. A learning culture that thrives on fluid content, where learning is stimulated by shared experiences and "learning 2.0" flourishes, requires different criteria for success.

Additionally, in today's corporate learning landscape, Google and its competitors have forever changed our expectations of how we acquire knowledge and skills. The immediacy and accessibility of content through on-demand platforms seemingly trump design. At a recent eLearning Guild conference, there was a significant amount of chatter about the "granularization of content" and the need to make content accessible on demand. There also is a continuous stream of discussion on blogs and boards about the idea of content curation.


Advancing analytics

In May 2010, Dan Pontefract wrote the following in his blog post titled "The Holy Trinity: Leadership Framework, Learning 2.0 & Enterprise 2.0": "As I've written about previously, I believe that an organization needs not only an internal 2.0 Adoption Council [but also] a cross-functional team (the Enterprise 2.0 Org Structure) to help ensure all the various pieces of a 2.0 world seamlessly come together, mitigating any confusion for the employee, partner, or customer."

Here is where the questions about analytics become interesting. The nature of web 2.0 and social networking is to bridge the chasms between customers and sales, between marketing and the customer's voice, and between training organizations and trained employees. The role of analytics within these spaces is to build tools for measuring how decisions and actions in one facet of the environment affect the others. Imagine the following simple example.

Company ABC releases a new product into the market. ABC has tapped into social networking for its marketing and training. ABC's marketing team captures some metrics around its social networking, such as the number of mentions and retweets, to gauge whether its customer base is showing interest in its activities. ABC's training organization also has metrics of its own from its learning management system, but it captures no metrics about social networking in the training environment and never looks at marketing's data to inform its own measurement strategy. At the time of the product release, ABC ensured that all employees completed the training and passed the final test. Marketing's metrics around its social networking strategies show a lot of chatter about the company. However, product sales are not happening.

How can analytics serve the organization holistically by capturing relevant metrics around both training and marketing efforts? How can analytics help the organization fix its sales problem? What if, among the training organization's social networks, data were being captured around keywords, and the organization could detect chatter trends among salespeople discussing how clients think the product is overpriced?
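To make that last question concrete, here is a minimal sketch of the kind of keyword-trend detection it imagines. Everything in it (the Message shape, the KEYWORDS list, the doubling threshold) is an assumption for illustration, not a feature of any particular LMS or social platform.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import date

@dataclass
class Message:
    author_role: str   # e.g., "sales" or "customer-service" (hypothetical)
    posted: date
    text: str

# Hypothetical price-objection keywords to watch for.
KEYWORDS = {"overpriced", "expensive", "discount"}

def weekly_keyword_counts(messages, role="sales"):
    """Count keyword mentions per ISO week for one employee role."""
    counts = Counter()
    for m in messages:
        if m.author_role != role:
            continue
        week = m.posted.isocalendar()[:2]  # (year, week number)
        for kw in KEYWORDS:
            if kw in m.text.lower():
                counts[(week, kw)] += 1
    return counts

def rising_keywords(counts, this_week, last_week, factor=2):
    """Flag keywords whose weekly mentions at least doubled."""
    return [kw for kw in sorted(KEYWORDS)
            if counts[(this_week, kw)] >= factor * max(1, counts[(last_week, kw)])]
```

A rising count of "overpriced" among salespeople would be exactly the signal the scenario above describes, and it lives outside anything a completion-based LMS report would ever surface.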

Measuring value

In that example, there is still a place for the analytics we gather via LMSs today. However, as stand-alone metrics, they provide little value beyond giving auditors something to check off.

The problem with conventional learning analytics is twofold: First, we have set for ourselves an objective to measure learning when what we're really interested in is performance. Second, in our attempt to measure learning, we've created binary models that do little except record a student's performance on a test at the moment it is taken. We assume, based on our instructional design models, that performance on a test reflects learning, but given today's models, it is easy to see why that may not be the case.

To capture the essence of a new analytics model, consider the notion of value. The value of online content is measured today by its viral nature and potential. Business and web analytics address this by building ways to measure how content spreads and by identifying where its access and exit points exist. It also is critical to understand that some content is best kept from going viral, so setting benchmarks for your content is essential to getting the most out of analytics.
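Commercial web analytics tools report access and exit points out of the box; as a rough illustration of what is being computed, here is a sketch over a bare event stream. The (session_id, timestamp, page) schema is assumed for the example.

```python
from collections import defaultdict

def entry_exit_points(events):
    """events: iterable of (session_id, timestamp, page) tuples."""
    sessions = defaultdict(list)
    for session_id, ts, page in events:
        sessions[session_id].append((ts, page))
    entries, exits = defaultdict(int), defaultdict(int)
    for views in sessions.values():
        views.sort()                # order each session's views by time
        entries[views[0][1]] += 1   # first page in the session = access point
        exits[views[-1][1]] += 1    # last page in the session = exit point
    return entries, exits
```

The same counts, run over learning content instead of marketing pages, would show where learners arrive at material and where they abandon it.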

It is vital to adopt a holistic view of what we want to achieve with our analytics. Our goal ultimately should be to measure the contribution that deploying content through a training channel (formal or informal) makes to the business, against specific goals identified up front. We must understand what value the deployment of training content has had in achieving the performance goals of the targeted users.

Let's revisit company ABC. The company determines from its web analytics that consumers keep accessing its frequently asked questions (FAQ) webpage. ABC also recognizes from social media analytics that there are constant questions in discussion forums, blogs, and newsgroups about upgrading its software. ABC decides to ask its customer service staff whether there are any trends in the questions people are asking and whether average call-handling times are within corporate standards.

As a result, ABC uncovers a trend in the questions customers are asking and notices that its average call-handling time recently has increased. ABC prepares to deploy training with the goal of decreasing average call-handling time in its customer service department and reducing the number of hits to the website's FAQ page. The company selects its sales and customer service staff to receive the new training and publishes the FAQ information on the website's product pages.
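The call-handling-time check in that scenario can be as simple as comparing a recent window against the prior one. The function below is a hedged sketch; the window size, threshold, and the daily numbers are illustrative assumptions, not ABC data.

```python
def handle_time_increased(daily_seconds, window=7, threshold=1.10):
    """True if the last `window` days average >10% above the prior window."""
    recent = daily_seconds[-window:]
    prior = daily_seconds[-2 * window:-window]
    return (sum(recent) / window) > threshold * (sum(prior) / window)

history = [400, 410, 395, 405, 398, 402, 407,   # prior week (seconds per call)
           430, 445, 440, 450, 455, 448, 460]   # recent week
print(handle_time_increased(history))           # True: roughly an 11% increase
```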

How will company ABC measure the success of its training? Consider analytics that include the following (a rough scorecard sketch appears after the list):

  • an increase in the number of hits to the product pages

  • a decrease in the number of hits to FAQ pages

  • an increase in internal social media chatter about the product

  • an increase in the number of hits to performance support material provided by customer service personnel

  • a decrease in the average call-handling time from internal analytics

  • an increase in employees' use of different media to access content (for example, the website, the LMS, and performance support staff).
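One rough way to roll those indicators into a single before-and-after scorecard is sketched below. The metric names, baseline numbers, and pass/miss rule are all invented for illustration; the point is only that each indicator has a desired direction and a measured change.

```python
# Baseline (pre-training) and follow-up numbers are invented for illustration.
BASELINE = {"product_page_hits": 1200, "faq_hits": 5400,
            "avg_call_handling_sec": 420, "internal_chatter_mentions": 80}
AFTER    = {"product_page_hits": 2100, "faq_hits": 3100,
            "avg_call_handling_sec": 350, "internal_chatter_mentions": 190}

# Desired direction per metric: +1 should rise, -1 should fall.
DIRECTION = {"product_page_hits": +1, "faq_hits": -1,
             "avg_call_handling_sec": -1, "internal_chatter_mentions": +1}

def scorecard(baseline, after, direction):
    for metric, want in direction.items():
        change = after[metric] - baseline[metric]
        verdict = "PASS" if change * want > 0 else "MISS"
        print(f"{metric:26s} {baseline[metric]:>6} -> {after[metric]:>6}  {verdict}")

scorecard(BASELINE, AFTER, DIRECTION)
```

Notice that not one of these inputs comes from a test score; every line is a business or web metric synthesized across systems.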

Evolve and grow with it

If we as learning professionals can embrace the notion that measuring learning alone may be difficult, but that measuring whether content is valuable to our audience is attainable, as the success of business and web analytics shows, then we have great potential for growth. If the purpose of training and development is to partner with the business, then what we are most concerned about is providing valuable learning content that increases the company's operational efficiency. If we can measure that value and show how it drives the bottom line, we will see our learning metrics take a drastic shift.

If we can begin with the idea that some of learning content's value is in its viral nature, then consider metrics such as the following (a sketch of computing them from an access log appears after the list):

  • Did employees access business-critical content? If so, when and how was it consumed?

  • Did employees share business-critical content with one another? How quickly did the content spread?

  • Did employees access business-critical content multiple times? Which employees are accessing the same content frequently?

  • Which business units accessed what content, and when?

  • As new content emerges, is it being consumed? When?
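As a closing illustration, here is a hedged sketch of answering those questions from a raw access log. The event schema (employee_id, business_unit, content_id, timestamp, shared_with) is assumed for the example; any real LMS or social platform would have its own.

```python
from collections import defaultdict
from datetime import datetime

def spread_report(events):
    """events: iterable of (employee_id, business_unit, content_id,
    timestamp, shared_with) records."""
    accesses, shares = defaultdict(set), defaultdict(int)
    units, first_seen = defaultdict(set), {}
    for emp, unit, content, ts, shared_with in events:
        accesses[content].add(emp)          # who accessed the content
        units[content].add(unit)            # which business units
        shares[content] += len(shared_with) # how widely it was passed on
        if content not in first_seen or ts < first_seen[content]:
            first_seen[content] = ts        # when consumption began
    for content in sorted(accesses):
        print(f"{content}: {len(accesses[content])} employees, "
              f"{shares[content]} shares, units={sorted(units[content])}, "
              f"first access {first_seen[content]:%Y-%m-%d}")

# Hypothetical usage with two log records:
events = [
    ("e1", "sales", "pricing-update", datetime(2012, 2, 1, 9), ["e2", "e3"]),
    ("e2", "support", "pricing-update", datetime(2012, 2, 1, 11), []),
]
spread_report(events)
# pricing-update: 2 employees, 2 shares, units=['sales', 'support'], first access 2012-02-01
```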

Our world is changing--evolve and grow with it.
