You can’t become an expert in college


Cover of Outliers: The Story of Success

Here’s something of an epiphany I had at the ICTCM while listening to Dave Pritchard’s keynote, which had a lot to do with the differences between novice and expert behaviors in problem-solving.

Malcolm Gladwell, in his book Outliers, puts forth a now-famous theory that it takes at least 10,000 hours to become a true expert in a particular pursuit, at the top of one’s game. That’s 10,000 hours of concentrated work in studying, practicing, and performing in that area. When we talk about “expert behavior”, we mean the kinds of behaviors that people who have put in their 10,000 hours exercise as second nature.

Clearly high school or college students who are in an introductory course — even Dave Pritchard’s physics students at MIT, who are likely several levels above the typical college undergrad — are not there yet, and so there’s not a uniform showing of expert behavior. There are more hours to be put in. But: How many more?

On the one hand, if a person spends 40 hours a week working at this activity, for 50 weeks out of the year, then it will take 5 years to reach this level of expertise:

(10000 hours) x (1 week/40 hours) x (1 year/50 weeks) = 5 years

But on the other hand, a typical college student carries a 16 credit hour load, which means 16 hours of courses per week. Add the standard advice of spending 2 hours outside of class for every hour in class, and that comes to 16 + 32 = 48 hours of academic work per week. If the student does this over 14-week semesters, with two semesters of classes every calendar year, how long does it take to get to 10,000 hours?

(10000 hours) x (1 week/48 hours) x (1 semester/14 weeks) x (1 year/2 semesters) = 7.44 years

That’s fairly close to double the usual time it takes for people to earn a bachelor’s degree. And it assumes that all that coursework is concentrated into one area, which of course it isn’t.
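If you want to check the arithmetic yourself, here’s a quick back-of-the-envelope script in Python. It’s just my own sketch of the two conversions above, using the same assumptions: 40-hour weeks and 50-week years in the first case; 48-hour academic weeks, 14-week semesters, and two semesters per year in the second.

    TARGET_HOURS = 10_000

    # Full-time scenario: 40 hours/week, 50 weeks/year
    full_time_years = TARGET_HOURS / (40 * 50)        # 5.0 years

    # College scenario: 16 hours in class + 2 hours outside per class hour,
    # over 14-week semesters, two semesters per calendar year
    hours_per_week = 16 + 2 * 16                      # 48 hours/week
    hours_per_year = hours_per_week * 14 * 2          # 1,344 hours/year
    college_years = TARGET_HOURS / hours_per_year     # about 7.44 years

    print(round(full_time_years, 2), round(college_years, 2))  # 5.0 7.44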

So there’s an important truth here: Nobody can become an expert on something just by going to college. College might add the finishing touches on expertise that was begun in childhood — for example, with kids who start playing music or programming computers at age 6 — but there’s just not enough time in college to start from zero and become an expert.

This has implications for college coursework. Many of us profs have “expertise” in mind as the primary instructional objective of our courses, but for most students this is quite possibly an unreachable goal. Instead, along with reasonable levels of mastery of core subject content, college courses should focus on preparing students for the remaining hours it will take them to reach 10,000. We should be teaching not only content in the here and now, but also the processing skills and broad intellectual tools that set students up for success in continuing towards expertise after college is over.

We probably can’t make students experts in the time we have with them, but we can put them in position to become experts later. Ironically, the harder we try to make experts out of everyone, the less we stress broad intellectual skills, and the less likely students are to become experts later. How are they supposed to continue to learn, practice, and perform their way to that top level if nobody teaches them how to think and learn on their own?


Filed under Education, High school, Higher ed, Life in academia, Problem Solving, Study hacks, Teaching

11 responses to “You can’t become an expert in college”

  1. JD Stone

    While I agree with you, I’ll admit your post leaves me looking a little sideways. Pretty much in an “uhh, yeah” sort of way. Especially strange since you have quite a bit of experience as a professional educator.

    Do you have examples, from educators or otherwise, who assert that the purpose of an undergraduate curriculum is to create “experts”? Even at the most competitive and respected schools like Harvard or MIT?

    Just my opinion I suppose, and perhaps others will weigh in, but I always figured that a bachelor’s degree recognizes that the bearer has little more than a broad, solid foundational knowledge of their major field of study. That base of core knowledge leaves them ready to *embark* on their journey toward expertise. Not to mention the core “GE” requirements outside the major, which round out the bachelor’s exposure to other fields and further develop the student’s critical thinking, creativity, and (as you pointed out) problem-solving skills.

    Taking this further, what I describe above is actually fairly close to the original source of the term bachelor/baccalaureate. Nothing in the etymology of that word implies that the person described is a master or expert in their field.

    • I have to agree with Robert. While I know of no one who would specifically say, “Undergraduates should become experts,” I frequently see professors (including me, at times) behave this way. Here are two common behaviors that I think show professors sometimes trying to create experts:

      1. I know of many professors who pride themselves on being a “tough” teacher. They may brag about how difficult it is to get a good grade in their classes. It seems like one way some of these professors accomplish this is by teaching a graduate-level class, thinking that the students will end up with a graduate-level, or expert, education as a result.

      2. Professors often complain about the lack of retention from previous classes: “The students had linear algebra just last semester, and they already forgot what an eigenvector is!” This, too, seems to me to be an expectation that students are supposed to become experts (in eigenvectors, at least) once they have seen the material.

      So I do not think this is a straw-man argument.

  2. Pingback: Weekend miscellany — The Endeavour

  3. Brendan Dowling

    It seems to me that becoming an expert is what a doctorate is about. Attaining an undergraduate degree is about becoming a beginner.

    Of course, in some disciplines (medicine, law), acquiring a doctorate means that you are only qualified to be a beginner in that field. Doctors have to do their residency. Lawyers typically put in long hours for relatively low pay. Only after years working in a field and gaining levels of responsibility do you get to be considered an expert.

  4. What if you start early? The typical geek starts programming in middle school (I started at 12-13). By the time he earns a degree he’s 22-23, and he has practiced almost 10,000 hours over 10 years. With flat-rate Internet access available, we are not limited to learning only from schools.

    • JD Stone

      Giorgio: I agree with you also. But I think your comment is outside the scope of Dr Talbert’s point.

      I think most would agree that one can become an expert at something by the time they *happen to be* in college. One could also become an expert before going to college, or without going to college.

      I think Dr Talbert’s point is that one cannot become an expert at something as a direct result of the education from an undergraduate curriculum alone.

  5. IMO true expertise requires practical knowledge which is not usually available in academia. Does the typical geek who starts programming in middle school understand what end users are likely to do with his product? And more, what are the ramifications for the publisher or manufacturer of that end user’s experimentation or just plain bumbling? How often do we see, especially in the world of software/firmware, etc., a corporation having to deal with the OOPS factor?

    I guess expertise can be split into two areas: how to come up with the most elegant result; and how to develop the most practical application. Depends on your perspective.

  6. Jeff Walker

    The typical geek probably does start programming early. The typical undergraduate CS major has not. The majority of the majors in my intro to programming class have never programmed a single line of code before.

    There are a couple of factors at work. First, many majors really want to do hardware or systems administration work, but 4-year colleges don’t offer degrees in that. I try to suggest 2-year community college programs, but by the time I advise them, it is really too late for them to consider a program that would be a better fit. Second, many majors think they want to program because they think computer games are cool. Mind you, they aren’t particularly strong at math, nor do they want to take a lot of math, but they still want to program computer games. Um, ok.

    It is the self-taught geeks that are most successful as their learning curve is much lower. They may or may not be decent at structured programming. They may, or may not, understand the value in meaningful variable names. They probably do have decent debugging skills already. A few have actually read some of the classic programming books (such as Code Complete) and are really ready to excel. In any case, picking up some additional skills is much, much easier for them than someone starting from scratch.

  7. Mike S

    Most of my professors thought that they were giving their students a good grounding in a subject, not that they were making them experts. Several professors more or less explicitly said that in the real world, people were doing much more sophisticated/complex stuff than we were.

  8. Aaron

    While I agree that an undergrad education isn’t meant to make a person an expert in anything, I think “can’t” is overstating it. Many students in technical or competitive fields will put in significantly more than 2 hours of work outside of class per hour in the classroom. And it’s unfair to assume that students only engage with material directly related to their classes, or that they’ll do no relevant work or study between terms. Undergrad involvement in research, related internships/co-ops, and student groups can all be valuable additions. If your students can put in 50 hours a week for 4 years (roughly 50 hours x 50 weeks x 4 years = 10,000 hours), they could plausibly be experts by the time they leave. With some highly focused programs that’s possible. I, for one, have been very impressed by the skills of interns coming out of the Waterloo software engineering program.

  9. @JD: To your first comment, I have never heard anybody explicitly assert that the purpose of the undergraduate curriculum is to create content experts, but there are hints and implications to that effect everywhere. I can certainly point to my own tendency to expect expert behavior from undergraduates and to feel like they’ve failed — like I’ve failed — if they don’t show it. I think there may be some of this in the current push among many small liberal arts colleges to incorporate undergraduate research at a widespread level, so that doing research in a content area is almost expected of most or all students at an institution supposedly committed to broad, liberal education. So no, I don’t have any evidence; I’m just drawing on my experiences and what I think I’m seeing currently.

    As to your second comment, your assessment of my main point is exactly right.

    I also totally agree with Brendan. An undergraduate education should broadly prepare someone for any of a number of future paths; graduate school is an intensive experience designed specifically for getting people to the 10,000 hour mark and then some.

    @Mary Foxworthy: Expertise looks like a lot of things, and I think we identify it more in behavior than in skills. Experts seem to approach all problems more or less the same way, even if it’s outside their field or just in everyday life. I’m thinking more and more that undergraduate education has, or should have, a lot more to do with inculcating expert behaviors than it does with any kind of content knowledge acquisition.

    @Aaron: The word “can’t” is mainly there to catch your attention and make you read the article. 🙂