Average velocity is another one of those basic calculus (really pre-calculus) topics that, like difference quotients, leave me at a loss for why students have such a hard time with them. There’s a very simple and common-sense definition, namely that the average velocity of an object with position s(t) from t = a to t = b is

> average velocity from t = a to t = b = [s(b) - s(a)] / (b - a)

(See? It’s just distance = rate * time solved for “rate”.) There are examples in the book and examples on the internet *ad infinitum* of how to calculate average velocities, and all of these are simple numerical calculations with absolutely no algebra involved. You have to know how to plug numbers into a function and then do basic arithmetic on your calculator. That’s all.
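To make the "plug in numbers and do arithmetic" point concrete, here is a minimal sketch (my own illustration, not from the post) using a made-up position function s(t) = t²:

```python
def average_velocity(s, a, b):
    """Average velocity of position function s over [a, b]:
    (change in position) / (change in time)."""
    return (s(b) - s(a)) / (b - a)

s = lambda t: t**2  # hypothetical position function, chosen for illustration

# From t = 1 to t = 3: (s(3) - s(1)) / (3 - 1) = (9 - 1) / 2 = 4.0
print(average_velocity(s, 1, 3))  # prints 4.0
```

That really is all there is to it: two function evaluations, one subtraction, one division.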

But students get so turned around. They calculate only the position at time t = b. They add up the positions at t = a and t = b and divide by 2 ("average"). They add instead of subtracting in the numerator or the denominator (or both). They get the fraction upside-down. And so on. Not all students, of course, but many of them, and a lot more of them than there should be. And in my calculus classes, it's certainly not for lack of practice; we've done it in lecture, in group activities, in online videos, you name it.
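For concreteness, here is a sketch (again with the made-up position function s(t) = t², not from the post) of what each of those mistakes produces on the interval from a = 1 to b = 3, next to the correct answer:

```python
def s(t):
    # Hypothetical position function, for illustration only
    return t**2

a, b = 1, 3

correct = (s(b) - s(a)) / (b - a)          # (9 - 1) / 2 = 4.0
endpoint_only = s(b)                        # 9: just the position at t = b
averaged_positions = (s(a) + s(b)) / 2      # (1 + 9) / 2 = 5.0: "average" of two positions
added_numerator = (s(b) + s(a)) / (b - a)   # (9 + 1) / 2 = 5.0: added instead of subtracted
upside_down = (b - a) / (s(b) - s(a))       # 2 / 8 = 0.25: fraction inverted
```

Every wrong variant is a plausible-looking arithmetic recipe, which may be part of why the errors persist: nothing about the wrong answers looks obviously broken.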

With difference quotients, I can sort of understand where the difficulties might come from: it's the algebra. But there's no algebra at all in an average velocity calculation, and even if you struggle to get the concept, can't you just memorize the formula for the time being? I always try to see student difficulties from the student's point of view and remember that I was in their shoes once too, but honestly, I am finding it really hard to pinpoint where such a consistent, widespread misunderstanding of this particular idea comes from.

What’s with this topic? Anyone?