Why am I still reflecting on the pros and cons of learning technologies rather than on my own practices whilst training staff to use them?
I've come to realise that I am not just training, I am also promoting: successful implementation of WebLearn requires all staff to be on board. This is very different to the kind of education offered by a lecturer. I want a clear understanding of the pros and cons of the technology I am selling to staff, and ultimately I would like to have a say in the kinds of technologies I train staff to use in the future.
"At least three basic problems have dogged most attempts to translate technological investments into improvements in educational outcomes.
1. Surrendering to rapture of the technology. For a variety of reasons, institutions and programs tend to focus just on the new technology itself. That’s bad. To put it metaphorically, you must have yeast to bake bread, but if you buy only yeast, you’ll never produce bread. Whether a program’s aim is to use technology to support learning communities, or better skills or inquiry, or an internationalized curriculum, the recipe will require more than hardware. Other expensive ingredients include staff development and/or new staff; new assignments and course designs; more books in the library; altered marketing and advising; changes in roles and rewards; new organizational partnerships; and new internal coalitions. In the past, the technology siphoned money and attention away from the rest of the recipe.
2. Forgetting that the life span of many new technologies is far shorter than the time it takes to implement the recipe and improve educational outcomes. Those complex recipes are not “quick and easy”: assembling the ingredients takes a long time. Meanwhile the technology is aging, and losing value. Long before outcomes have a chance to improve visibly, new technologies usually distract attention from the “old” improvement agenda. Over the years, technology-related interests in improving outcomes such as programming skill, visualization (in the early days of videodisc), and collaborative learning (the computer conferencing systems of the 1980s) have risen, and fallen, and sometimes risen anew. Over the decades, waves of new technology have rippled across the surface of education but large-scale improvements in outcomes almost never had time to develop.
3. Trying to improve outcomes and save money by using tutorials and other forms of self-paced, interactive, branching courseware. This is one educational recipe for improvement of outcomes that hasn’t changed. It has been attempted with almost every new computing technology of the last four decades, from PLATO to the Web. These kinds of tutorials are always enticing: research has demonstrated that such courseware can dramatically improve outcomes, learning speed, and costs. But the problems of large-scale implementation have always proven insuperable: the short lifecycle of the courseware; the expenses of educational debugging of the many pathways; the hidden costs of altering the curriculum to take advantage of the courseware; the rigidity of the courseware in the face of new developments in the discipline and variations in students; the lack of rewards for authors; and the expenses of marketing and support." Stephen C. Ehrmann, Improving the Outcomes of Higher Education: Learning from Past Mistakes