This article first appeared earlier this week on the group blog Education Debate at OnlineSchools.org. I’m one of the guest bloggers over there now and will be contributing articles 1–2 times a month, cross-posting them here a couple of days after they appear. Go check out Education Debate for a lively and diverse group of bloggers covering all kinds of educational issues.
It used to be that in order to educate more than a handful of people at the same time, schools had to herd them into large lecture halls and utilize the skills of lecturers to transmit information to them. Education and school became synonymous in this way. Lectures, syllabi, assessments, and other instruments of education were the tightly-held property of the universities.
But that’s changing. Thanks to advancements in media and internet technology over the past decade, it is simpler than ever to package and publish the raw informational content of a course to the internet, making the Web in effect a lecture hall for the world. We now have projects such as MIT OpenCourseWare, Khan Academy, and countless online education initiatives at US colleges and universities providing high-quality materials online, for free, to whoever wants them. This raises a sometimes-disturbing question among educators: if students can get all this material online for free, what are classrooms and instructors for?
Tech author Randall Stross examines this question in his New York Times article “Online Courses, Still Lacking that Third Dimension”. In the article, Stross mentions “hybrid” courses — courses with both online and in-person components — but focuses mainly on self-contained courses done entirely online with no live human interaction. He correctly points out that learning is an inherently human activity, and technologically enhanced coursework is successful insofar as it retains that “human touch”.
However, Stross casts the relationship between computer-enabled courses and traditional courses as a kind of zero-sum game, wherein an increased computer presence results in a decreased human presence. He refers to universities “adopting the technology that renders human instructors obsolete.” But it’s not the technology itself that makes instructors obsolete; it’s how that technology is put into practice. There are numerous instances of traditional college courses using computing and internet tools to effect positive change in the learning culture of the institution. There are also plenty of cases, as Stross points out, where technology has replaced human instructors. The difference is an administrative one, not a technological one.
Nor is the supposed obsolescence of the instructor all technology’s fault. If universities and individual professors continue to hold on to a conception of “teaching” that equates to “mass communication” — using the classroom only to lecture and transmit information and nothing else — then both university and instructor are obsolete already, no technology necessary. They are obsolete because 21st-century college graduates do not need more information in their heads to solve the problems that will press upon them in the next five or ten years. Instead, they need creativity, problem-solving experience, and higher-order cognitive skills. A college experience based on sitting through lectures and working homework does not deliver on this point. The college classroom cannot, any longer, be about lecturing if it is to remain relevant.
And notice that an entirely self-contained online course can be just as “traditional” as the driest lecture course attended in person, if it amounts to nothing more than a YouTube playlist of lectures. What determines the effectiveness of a course isn’t the technology that is or is not being used. It’s the assumptions about teaching and learning held by the colleges and instructors, and the choices they make in translating those assumptions into an actual class that students pay for.
What we should be doing instead of choosing sides between computers and humans is finding ways to leverage the power of computers and the internet to enhance the human element in learning. There are several places where this is already happening:
- Livemocha is a website that combines quality multimedia content with social networking to help people learn languages. Users can watch and listen to language content that would normally find its place in a classroom lecture and then interact with native speakers from around the world to get feedback on their performance.
- Socrait, a system proposed by Maria Andersen, would provide personalized Socratic questions keyed to specific content areas by way of a “Learn This” button appended to existing web content, much like the “Like This” button for sharing content on Facebook. Clicking the button would bring the user to an interface to help the user learn the content, and the system contains social components such as identifying friends who also chose to learn the topic.
- I would offer my own experiments with the inverted classroom model of instruction as an imperfect but promising example as well. Research suggests this model can produce significant gains in student learning versus the traditional approach simply by switching the contexts of lecture and activity: lectures are delivered via video podcasts accessed outside of class, and class time is spent on problem-based learning activities in teams.
Rather than view college course structure as a pie divided into a computer piece and a human piece, and fret about the human piece becoming too small, let’s examine ways to use computers to enhance human learning. If we keep thinking of computers as a threat rather than an aid to human interaction, computer-assisted instruction will continue to lack the human touch, the human touch will continue to lack the power and resources of computers and the internet, and student learning will suffer. But if we get creative, the college learning experience could be in for a renaissance.