Saturday, August 08, 2009

Truth, Honesty, and Evidence

Evidence-based practice is an area of focus within the field of education in general, and in training and development in particular.

(Before going further, let me define evidence-based practice. It’s a decision-making process in which people base their choices on evidence from the research, adapted according to the characteristics of the situation.)

In "The ABCDEs of Learning and Development's Next Paradigm," published in the August 2009 issue of ASTD's T&D Magazine, Benjamin Ruark is an enthusiastic proponent of evidence-based practice. In the article, he describes six reasons why it's Learning and Development's next paradigm.

Among the many reasons he proposes: it provides "a more accurate predictive focus"; it allows the discovery of "connections between . . . 'research crumbs of effectiveness'"; it presents a broader, big-picture view; and it emphasizes the

Just one thing: evidence-based practice is NOT a completely new paradigm. It’s just the latest incarnation of a concept that’s been around for a while—that research needs to be transferred into practice. In fact, ASTD—the publisher of the article—published a whole series of “What Works” books under a contract from the US government to transfer research on training and adult learning into practice.

About five years ago, ASTD launched its Research-to-Practice Conferences, complete with published proceedings, which had the same general goal.

What’s different about evidence-based practice is that it doesn’t make recommendations based on a single study, as might have been inferred from previous efforts to transfer research to practice. As Ruark notes, “Research in the form of individual case studies being generalized into global performance improvement solutions is tantamount to deadheading into a curriculum’s instructional design based solely on some e-learning tool’s capabilities.” (And a body of evidence in management generally, and a growing one in training and development, suggests that managers primarily make decisions on instinct.)

Ruark acknowledges the limitations of the current system for transferring research to practice. As a solution, he suggests that a central authority is needed to vet and transfer research: that “new effectiveness findings from researchers get disseminated to the work world” and that “practitioners conduct what’s known as practice-based research, submitting preliminary actuarial data to a central research agency, university, or similar affiliate, for rigorous experimental replications.”

Like most other recommendations for transferring research to practice, these are not practical. Establishing a central authority sounds easier on paper than it will prove in reality. Who will organize the authority? How will all of the different disciplines that feed research into our work be accommodated—not only the obvious ones, like adult education, educational technology, and human resource development, but also fields like educational psychology, learning sciences, industrial psychology, and human resources management? Many operate with different worldviews and research standards, so reaching agreement becomes a challenge.

As far as disseminating research to practicing professionals goes, the importance of doing so has never been questioned. What no one has figured out is how. Some studies suggest that practicing professionals aren’t reading research publications, and editors of professional publications often have a journalism background and might not be aware of current research in the field, much less historical research. A 2002 study by Rynes, Colbert, and Brown describes an interesting situation: the research indicates that graphology (handwriting analysis) is not a good predictor of future work performance, yet a practitioner magazine published by a professional organization promoted graphology as the next big thing in selection.

If we are to disseminate research, it needs to start with the editors of publications becoming familiar with the research, so they can make sure the evidence supports the suggestions for practice that they publish.

As for practicing professionals conducting research and submitting their data, one challenge is that the submitted data would have to be open to verification. Given how protective many organizations are of their data, and how sloppy others are with it, such a bank of data is likely to have limited applicability.

This is not meant to throw cold water on the suggestions. Rather, it reflects frustration with many such suggestions, most of which come from the researcher’s perspective.

Suggestions need to be more realistic, acknowledging that most practicing professionals lack the resources to conduct the type of research that would produce validated data and have little time to familiarize themselves with the research literature.

Other models exist, but they are not part of the mainstream of training and development. One practical model for disseminating research to practicing professionals is the usability.gov website, which provides guidelines for designing effective websites. It not only summarizes the research for each guideline and provides references, but also identifies the strength of each recommendation (that is, how much research, and what type of research, underlies the recommendation).
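To make the shape of that model concrete, here is a minimal, purely illustrative sketch (in Python, with hypothetical names and made-up content, not drawn from usability.gov itself) of how a guideline entry that bundles a recommendation, a research summary, references, and a strength-of-evidence rating might be represented:

```python
# Hypothetical sketch of a usability.gov-style guideline entry: each
# recommendation carries a research summary, references, and a
# strength-of-evidence rating. All names and content are illustrative.
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class EvidenceStrength(Enum):
    """How much and what kind of research backs a recommendation."""
    EXPERT_OPINION = 1      # consensus or expert judgment only
    WEAK_RESEARCH = 2       # a single study or indirect evidence
    MODERATE_RESEARCH = 3   # several supporting studies
    STRONG_RESEARCH = 4     # multiple rigorous, converging studies


@dataclass
class Guideline:
    """One research-backed recommendation for practitioners."""
    recommendation: str                  # the guideline itself
    research_summary: str                # plain-language summary of the evidence
    references: List[str] = field(default_factory=list)
    strength: EvidenceStrength = EvidenceStrength.EXPERT_OPINION


# Example entry with made-up content, for illustration only.
example = Guideline(
    recommendation="Keep primary navigation in a consistent location on every page.",
    research_summary="Users find targets faster when navigation stays in one place.",
    references=["Hypothetical Author (2005), Journal of Web Usability."],
    strength=EvidenceStrength.MODERATE_RESEARCH,
)

if __name__ == "__main__":
    print(f"{example.recommendation} [evidence: {example.strength.name}]")
```

The point of the sketch is simply that the evidence rating travels with the recommendation, so a practitioner can see at a glance how much weight a given guideline should carry.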

Read Ruark’s article at http://www.astd.org/TD/Archives/2009/August/0809_ABCDEs_Of_Learning.htm (you might need to provide a userid and password to see the article).
