ISPI and Metrics

Measurement Counts! Metrics, ROI, and Accomplishments (the missing element)

A recent publication, Metrics, by Jay Cross of the Internet Time Group, presents an opportunity to comment on some current issues in measurement and evaluation. The author, who happens to be an old friend, is an entertaining and wide-ranging thinker (some might say Renaissance Man), and his book is noteworthy in part because of its unconventional form: a constantly updated eBook available for purchase online. Jay’s history in financial services, training, marketing, and a whole host of cerebral pursuits has left him most recently in the world of e-Learning, where he has become something of a pundit.

While I don’t agree with everything in Metrics, I recommend it because it’s a quick and enjoyable read, because it contains valuable references and links, and mostly because it challenges us to think outside many of the current ruts in measurement and evaluation.

Things I Like about Metrics
Here are some of Jay’s key points along with my comments:

  • “Metrics are measurements that matter.” With this sentence, he challenges us to measure results that our clients agree are important and to look for large valuable improvements. He adds, “Don’t fritter away time on the small stuff.”

  • “Start with business problems and work backwards.” He later adds that we should “focus on process not on behavior.” These comments point in the direction of our best strategy for measuring the right things, following Thomas F. Gilbert’s dictum to identify accomplishments, the outputs of processes or of individual jobs that contribute value toward business results. Behavior costs money while accomplishments have value. Following the path from business results back through measured accomplishments will lead to the behavior and improvement strategies that produce worthwhile organizational outcomes.

  • “Forget measurement of value based on cost savings!” As an e-Learning strategy consultant, Jay has probably tired of cost justifications based on saving travel time and expenses. It is critically important that we find ways to use our technologies and interventions to improve outcomes, not simply reduce costs for the same (often mediocre) outcomes.

  • “Time matters.” Whether we’re speaking of time to perform (fluency, productivity), time to achieve benchmark performance (ramp-up), or results over time (revenues, profits), we cannot ignore the time dimension in either our measurement of learning and performance during training or our measurement of desired business results.

  • “Gather baseline data.” It is easy to read this as meaning simply that we need a “before” measure to evaluate the worth of our “after” results, but the “line” in baseline is very important. To clearly understand the effects of our interventions, we must view current performance in the context of measured levels, trends, and bounce (variability) over time. We need a series of counts (per minute, per day, per week, or per month) to establish a true baseline, so that we can tell whether our interventions or ongoing efforts are changing the level, trend, and/or bounce of measured outcomes (see the sketch following this list).

  • “You must be able to relate your decisions and choices to the profitability of your organization.” While much of Jay’s discussion focuses on what I call “validation data”—measurement to justify expenditures by showing that programs work—the best measurement systems support ongoing decision-making. This is why I recommend ongoing measurement as feedback to performers and decision-makers, and why I like Timm Esque’s book, Making An Impact, so much.

  • Jay disagrees with much of the current thinking about ROI, suggesting that his book can save you the cost of an ROI workshop. Whether or not that is true, managers would certainly prefer to see how your program improves their specific outcomes, beyond a general payback ratio or cost justification. And since some current-day ROI “methods” rely on subjective estimates of payback rather than direct measures of results, we need to question many ROI claims in detail before we accept them.
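
To make the baseline idea concrete, here is a minimal sketch of how one might summarize a series of counts as a level, a trend, and a bounce. It is my own illustration, not from the book, and the weekly counts are invented for the example (Python):

    # Summarize a series of equally spaced counts as level, trend, and bounce.
    # Illustrative only; the weekly counts below are invented.

    def summarize(counts):
        """Return (level, trend, bounce) for a series of counts.

        level  = mean count per period
        trend  = least-squares slope (change in count per period)
        bounce = mean absolute deviation from the trend line (variability)
        """
        n = len(counts)
        xs = range(n)
        x_mean = sum(xs) / n
        y_mean = sum(counts) / n
        slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, counts))
                 / sum((x - x_mean) ** 2 for x in xs))
        intercept = y_mean - slope * x_mean
        bounce = sum(abs(y - (intercept + slope * x))
                     for x, y in zip(xs, counts)) / n
        return y_mean, slope, bounce

    # Hypothetical weekly counts of completed orders, before and after an intervention.
    before = [42, 45, 41, 44, 43, 46, 44, 45]
    after = [47, 50, 53, 55, 58, 60, 63, 65]

    for label, series in (("before", before), ("after", after)):
        level, trend, bounce = summarize(series)
        print(f"{label}: level={level:.1f}/week, trend={trend:+.1f}/week, bounce={bounce:.1f}")

Looking at the two series this way shows whether an intervention changed the level, the trend, the bounce, or some combination of the three, rather than relying on a single “before” and “after” pair of numbers.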

Things I Don’t Like So Much About Metrics
Lest you think I’m giving my friend a free pass, let me make a few comments about shortcomings.

  • The second half of the book is mostly a justification for e-Learning, something I would have preferred to see condensed to a few pages. I recognize that Jay makes his living in this field, but the book would be more helpful if it addressed the general case with a broader set of examples. Moreover, it is inconceivable that even the best e-Learning program will produce optimal results without efforts to improve other factors in a performance system, including expectations, feedback, tools, resources, consequences, and selection.

  • Jay does not discuss what’s a good measure and what’s not. For example, he mentions the limitations of test results as metrics but does not explain that percentage correct is not a measure of performance: it is a dimensionless quantity from which we can determine neither the count of behaviors or accomplishments nor the time required to complete them. He does not point out that the best metrics count things in absolute units (dollars, widgets, gallons, etc.) rather than rating them on subjective scales. Careful application of all his recommendations can still yield meaningless measurement if we fail to adhere to this basic principle (see the example following this list).

  • Jay speaks of using both “subjective and objective” data. For me, the term “subjective” means open to wide interpretation, or idiosyncratic. What I think he means by “subjective” is measurement of opinion or preference, which can be very objective if we refrain from the “voodoo math” of averaging the numbers on rating scales of subjective impressions to produce “average” ratings. It is not subjective to say that “20 people out of 45 rated the program as very good and 10 said it was poor.” That is a quantitative measure of personal opinion, and one that can be safely manipulated within the rules of arithmetic.
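
To illustrate these last two points, here is another small sketch of my own, with invented numbers, contrasting percentage correct with a count-per-time measure, and contrasting an “average” rating with a simple tally of responses (Python):

    # Illustrative only; the learners and ratings below are invented.

    # Two learners with the same percentage correct but very different performance:
    # the percentage hides both the count and the time dimension.
    learners = {
        "Learner A": {"correct": 18, "attempted": 20, "minutes": 2},
        "Learner B": {"correct": 18, "attempted": 20, "minutes": 10},
    }
    for name, d in learners.items():
        pct = 100 * d["correct"] / d["attempted"]
        rate = d["correct"] / d["minutes"]
        print(f"{name}: {pct:.0f}% correct, {rate:.1f} correct per minute")

    # Reporting opinion data as counts per category keeps it objective;
    # averaging the numbers assigned to a rating scale (the "voodoo math") does not.
    ratings = ["very good"] * 20 + ["good"] * 15 + ["poor"] * 10   # 45 respondents
    for category in ("very good", "good", "poor"):
        print(f"{ratings.count(category)} of {len(ratings)} rated the program {category}")

Both learners score 90 percent correct, yet one produces correct responses at five times the rate of the other; and the tally of ratings reports exactly what people said without manufacturing a fictitious “average” opinion.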

I suggest you read Metrics yourself, and discuss it vigorously with your colleagues and clients. I am sure you will find it both entertaining and illuminating.

Dr. Carl Binder is a Senior Partner at Binder Riha Associates, a consulting firm that helps clients improve processes, performance, and behavior to deliver measurable results. He may be reached at [email protected]. For additional articles, visit http://www.binder-riha.com/publications.htm.


Posted by Jay Cross at March 10, 2004 07:05 PM