Metrics & ROI

Four years ago I attended a how-to-ROI presentation at a major eLearning event and found it so misleading that I began writing about how companies really evaluate project potential and after-the-fact results.

Recently I've noticed ROI Workshops popping up. Spend a couple of days and the better part of a thousand dollars. Get a certificate. Such a deal. Unfortunately, neither the workshops nor the conference presentations cover the things I deem important:

  • Metrics are in the eye of the beholder. They are not simply the application of a rote formula or accounting rule. They are subject to interpretation. This is what makes them worthy of discussion.
  • The internal customer for metrics is your sponsor, also known as the person who pays the bills. When you talk with an executive, you need to talk about execution, not training.
  • The only valid metrics for corporate learning are business metrics. To converse in business terms, it helps to be fluent with the concepts of trade-offs, risk assessment, expected value, focusing on core, changing perspective, the 80/20 rule, and the bottom line.
  • Business goals. Strategic initiatives. Quarterly objectives. New product introductions. Figure out what matters in your organization. Then show the connection between what you do and what matters. It will make you an insider instead of an outcast.
  • Kirkpatrick's four levels are a history lesson, not a guide to action. Imagine telling your sales manager that the sales force was well prepared ("Levels 1 & 2") but simply hadn't sold anything ("Levels 3 & 4"). Good luck in your next job.
  • Most of a company's value resides in the know-how and relationships of its people. Traditional accounting assigns these intangibles a value of zero. Hence, traditional ROI has little credibility with enlightened executives.
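One of the business concepts above, expected value, fits in a few lines of arithmetic. A minimal sketch (all figures are hypothetical, purely for illustration):

```python
# Expected value of a project: each outcome's value weighted by its probability.
# Probabilities and dollar figures below are made up for illustration.
outcomes = [
    (0.5, 2_000_000),   # 50% chance the rollout hits plan
    (0.3,   500_000),   # 30% chance of partial adoption
    (0.2,  -750_000),   # 20% chance it fails and costs money
]

expected_value = sum(p * v for p, v in outcomes)
print(f"Expected value: ${expected_value:,.0f}")  # Expected value: $1,000,000
```

Talking about a project this way, as a weighted bet rather than a sure thing, is the kind of trade-off language executives already speak.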

Rather than update my various white papers and articles, I have consolidated my thoughts into a single one hundred-page eBook called Metrics. Check it out.

Power Shift

eLearning infrastructure decisions are climbing up the corporate ladder. A few years ago, eLearning was pigeonholed as a cheaper, faster way to train employees. By default, eLearning decisions fell to the director of training or HR.

Now, functional managers are using eLearning to meet business objectives. Managers look beyond employees to customers, suppliers, and distribution channels -- everyone benefits from seeding eLearning throughout the value chain. This is where we are now, with eLearning decisions seesawing back and forth between can-do functional managers anxious to get on with it, and CIOs/CLOs who want to go the next step to enterprise solutions. Still rare but perhaps the next step in this evolution is the CEO who looks at eLearning as a competitive weapon, the way to create a nimble organization, improve customer service, move quickly, and stay ahead of the pack.


January 10, 2003. Those of you who've read my thoughts on ROI know that I believe cost/benefit analysis is mandatory and most ROI calculations are utterly worthless. Thus, I was delighted to come upon Enough Already! Getting Off the ROI Bandwagon by Kevin Kruse (mistakenly identified as Kevin Kenexa) in the current issue of Chief Learning Officer magazine.

Kevin writes that:

    First came the articles, then the books, and now I see that an entire conference is devoted to the ROI of training. Obviously we're seeing a backlash against the orgy of IT spending of the late 1990s, and against e-learning initiatives that fell short of expectations. Personally, I think it's all hype, and I've had enough.

    First, many senior executives don't care about ROI. In Jack Welch's book, "Straight From the Gut," he tells of his decision to invest millions in GE's new Crotonville training facility, even while undertaking massive layoffs. He didn't have an ROI spreadsheet to tell him training was a good investment; he just knew that investing in talent was critical to GE's future.

    Second, ROI is an imperfect science that often involves making educated guesses at potential savings and gains. Senior executives know this, and they also know that there are many variables that can't be captured by a formula.

    Third, ROI guesstimates are often a cop-out for tougher measurements of results. How about measuring employee engagement scores before and after management training, or doing pilot studies of sales training programs that measure closing ratios and time-to-close?

Systems Changes

Traditional ROI has suckered corporations into evaluating learning initiatives on a project-by-project basis, and this has led to supporting each new approach as if it existed in isolation. The Meta-Learning Lab is developing ways to improve the overall learning process.

Take the old cliché of "Give a man a fish and he won't be hungry today. Teach a man to fish and he will never be hungry again." (Excuse the sexism; this dates back several thousand years.) The Meta-Learning Lab's goal is to teach fishermen how to improve their catch.

Scientific rigor: The Baloney Detection Kit

How to draw boundaries between science and pseudoscience, or between useful metrics and pure hype. From Scientific American:

1. How reliable is the source of the claim?
2. Does this source often make similar claims?
3. Have the claims been verified by another source?
4. How does the claim fit with what we know about how the world works?
5. Has anyone gone out of the way to disprove the claim, or has only supportive evidence been sought?
6. Does the preponderance of evidence point to the claimant's conclusion or to a different one?
7. Is the claimant employing the accepted rules of reason and tools of research, or have these been abandoned in favor of others that lead to the desired conclusion?
8. Is the claimant providing an explanation for the observed phenomena or merely denying the existing explanation?
9. If the claimant proffers a new explanation, does it account for as many phenomena as the old explanation did?
10. Do the claimant's personal beliefs and biases drive the conclusions, or vice versa?

"Clearly, there are no foolproof methods of detecting baloney or drawing the boundary between science and pseudoscience. Yet there is a solution: science deals in fuzzy fractions of certainties and uncertainties, where evolution and big bang cosmology may be assigned a 0.9 probability of being true, and creationism and UFOs a 0.1 probability of being true. In between are borderland claims: we might assign superstring theory a 0.7 and cryonics a 0.2. In all cases, we remain open-minded and flexible, willing to reconsider our assessments as new evidence arises. This is, undeniably, what makes science so fleeting and frustrating to many people; it is, at the same time, what makes science the most glorious product of the human mind."

Corporate Learning Strategies by Dan Tobin. "If you start and end all of your learning efforts by focusing on your organization's goals, you will never be asked to do an ROI analysis to justify your budget."


There's no cookbook approach to measuring the ROI of training. Fred Nichols is so right about this.
Because the definition and perception of value varies from person to person, so do the purposes of evaluation. Moreover, the various audiences for evaluation frequently act as their own evaluators. If you look carefully about you, or if you reflect upon your own experiences as a "trainee," you will quickly discover that training is being evaluated every day, but by trainees, managers, and executives -- and in accordance with their criteria and purposes.


Technology-enabled learning creates value by speeding things up. Business-school professors compare making big corporate changes to turning around the Queen Mary. Turn the rudder and in a few miles, the ship changes course. These days, organizations that lack the agility to turn on a dime can only go about as far as the Queen Mary (which is moored in cement alongside a pier in Long Beach, California).

A Fortune 50 company used eLearning, knowledge management, and collaboration to bring new-hire sales people up to speed in six months instead of fifteen. Nine months x 1400 new hires/year x $5 million quota = $5 billion incremental revenue. To be sure, better products, sales campaigns, and a host of factors contributed to the gain but a tiny fraction of $5 billion still yields a significant ROI. (Here are the details: New-hire training at Sun Microsystems.)
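The back-of-the-envelope arithmetic in that paragraph can be checked in a few lines (using the figures quoted above):

```python
# Revenue recovered by shortening new-hire sales ramp-up from 15 to 6 months.
months_saved = 9             # 15-month ramp cut to 6 months
new_hires_per_year = 1400
annual_quota = 5_000_000     # $5 million quota per salesperson

incremental_revenue = (months_saved / 12) * new_hires_per_year * annual_quota
print(f"${incremental_revenue:,.0f}")  # $5,250,000,000 -- roughly $5 billion
```

Even if only a sliver of that figure is truly attributable to the learning initiative, the ROI dwarfs the program's cost.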

Ten thousand consultants at a Fortune 100 technical services company earned professional certifications via eLearning. The result? Less attrition, better esprit de corps, and $100 million revenue/year attributable to higher billing rates.

A software firm launches a new system into a $250 million global market with eLearning and virtual meetings. This accelerates time-to-market by two months, gives them first-mover advantage over a major competitor, builds a more confident and enthusiastic sales force, and gets the channel up to speed at the same time as the direct sales force. Gain? $80 to $100 million incremental revenue.

A very large retailer of personal computers realizes that customers are frustrated with their products because they don't understand the software that accompanies them. The company offers customers free admission to an online learning community created by SmartForce. More than 100,000 customers sign up to learn Windows, Word, and Office apps online. Value of increased customer loyalty? Conservatively, $20 million in repeat business over three years.

Often an eLearning initiative pays for itself right off the bat by eliminating travel and facility costs, but that misses the point: upside gains dwarf cost savings.
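The contrast is easy to see in a simple ROI calculation. A minimal sketch, with hypothetical figures (not drawn from the cases above):

```python
# Simple ROI: net benefit over cost, as a percentage.
# All dollar figures are hypothetical, chosen to show how upside
# gains dwarf travel savings.
cost = 400_000             # eLearning development and delivery
travel_savings = 500_000   # avoided travel and facility costs
upside_gain = 4_000_000    # faster ramp-up, earlier revenue

roi_savings_only = (travel_savings - cost) / cost * 100
roi_with_upside = (travel_savings + upside_gain - cost) / cost * 100
print(f"Savings only: {roi_savings_only:.0f}%")   # Savings only: 25%
print(f"With upside: {roi_with_upside:.0f}%")     # With upside: 1025%
```

Counting only the cost savings makes the project look merely respectable; counting the business upside is what makes the case compelling.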

The eLearning Emperor Has No Clothes

Go to any major conference for trainers and you'll find many sessions on evaluating results and measuring performance. If you're a line manager with no training background, you will at first be confused when participants make statements like, "We evaluate 100% at Level 1, 80% at Level 2, and 40% at Level 3. We're going to shoot for some Level 4 next year."

The "levels" come from a taxonomy developed by a budding academic, Donald Kirkpatrick, as his Ph.D. thesis more than forty years ago. Level 1 evaluates trainee reaction (generally via evaluation forms derisively known as "smile sheets"). Level 2 checks retention (can they pass the test?). Level 3 looks at whether they do what they were trained for. Level 4 is whether the learning creates meaningful results for the organization.

Picture this. A national sales manager is reviewing quarterly sales performance with his boss. He tells her the new sales trainees scored 95% on Level 1, 82% on Level 2, and 9% on Level 3. Unfortunately, Level 4 improvement was infinitesimal. So the sales force loved the sales training, the majority passed the test, and nearly four out of five could demonstrate great sales behavior in a role-play. The only trouble is Levels 3 & 4: they aren't selling. How long would the sales manager keep his job? In business, Level 4 is, in fact, the only thing that matters. No wonder senior managers question the value of training.

The only valid measure of training is business metrics, not training metrics.

As the Godfather said, "This is business." If you can't see a benefit, don't do it.

Jack Zigon's list of performance measurement sites

Excerpt from Ed Trolley's Running Training Like a Business

Evaluating e-Learning by Dorman Woodall

Jay's notes on making the business case, new ROI challenges

The trouble with the "four levels" is that they falter outside the limited context of training. What happens outside the box is what counts inside the box. See Measuring Training ROI & Impact (1999). You can guess how I see this.

The Evolution of Management Accounting by Robert S. Kaplan

BNH on ROI. Their software models simplify complex ROI calculations.

Discussion group: ROInet

The Business Case website

The Fallacy of ROI Calculations

Measuring the Success of Training

Baruch Lev

Measuring the ROI of Training, CIO

Economic Value Added (EVA)


Learning at home

Training magazine, the March 2000 issue: Train on your own time, not "during work."

Sure, moving training from the classroom to the Web can mean reduced travel costs, less learning time away from the job, and certainly lower delivery costs. But most corporate training doesn't require travel, says Paul Reali, president of CyberSkills Computer Training Centers in Winston-Salem, NC. And, he points out, no valid study has yet shown that online delivery significantly reduces learning time (actual time spent mastering a skill or acquiring knowledge) compared with instructor-led training of similar quality.

"No one wants to tell you that the 'anytime' of online learning is supposed to be after work and that the 'anyplace' is at home," he says.

Another reason: Despite yellow crime-scene tape barriers and "do not disturb" signs, the cubicle is a tough place to have a quality learning experience. And it's almost impossible to reserve the necessary time and concentration without broad organizational support--and the backing of trainees' immediate managers for regular learning time-outs.

This is so true and so short-sighted.
Posted by Jay Cross at June 21, 2001 01:05 AM




© 2004 Internet Time Group
