# Tag Archives: Critical thinking

## A problem with “problems”

I have a bone to pick with problems like the following, which is taken from a major university-level calculus textbook. Read it, and see if you can figure out what I mean.

This is located in the latter one-fourth of a review set for the chapter on integration. Its position in the set suggests it is less routine, less rote than one of the early problems. But what’s wrong with this problem is that it’s not a problem at all. It’s an exercise. The difference between the two is enormous. To risk oversimplifying, in an exercise, the person doing the exercise knows exactly what to do at the very beginning to obtain the information being requested. In a problem, the person doesn’t. What makes an exercise an exercise is its familiarity and congruity with prior exercises. What makes a problem a problem is the lack of these things.

The above is not a problem; it is an exercise. Use the Midpoint Rule with six subintervals from 0 to 24. That’s the only part of the statement you even have to read! The rest of it has absolutely nothing to do with bees, the rate of their population growth, or the net amount of population growth. Unless the student is turning it in to an instructor who takes off points for incorrect or missing units, there is no reason to think about bees or time at all. This exercise is pure pseudocontext.

Worst of all, this exercise might correctly assess students’ abilities to execute a numerical integration algorithm, but it doesn’t come close to measuring whether a student understands what an integral is in the first place and why we are even bringing them up. Even if the student realizes an integral should be used, there’s no discussion of how to choose which method and which parameters within the method, or why. Instead, the exercise flatly tells students not only to use an integral, but what method to use and even how many subdivisions. A student can get a 100% correct answer and have no earthly idea what integration has to do with the question.

A simple fix to the problem statement will change this into a problem. Keep the graph the same and change the text to:

The graph below shows the rate at which a population of honeybees was growing, in bees per week. By about how many bees did the population grow after 24 weeks?

This still may not be a full-blown problem yet — it’s still pretty pseudocontextual, and the student can guess there should be an integral happening because it’s in the review section for the chapter on integration — but at least now we have to think a lot harder about what to do, and the questions we have to answer are better. How do I get a total change when I’m given a rate? Why can’t I just find the height of the graph at 24? And once we realize that we have to use an integral — and being able to make that realization is one of the main learning objectives of this chapter, or at least it should be — there are more questions. Can I do this with an antiderivative? Can I use geometry in some way? Should I use the Midpoint Rule or some other method? Can I get by with, say, six rectangles? Or four? Or even two? Why not use 24, or 2400? Is it OK just to guesstimate the area by counting boxes?
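For concreteness, the computation the original exercise dictates can be sketched in a few lines of Python. The rate function below is a hypothetical stand-in for reading values off the graph, not the textbook's actual data:

```python
# Midpoint Rule: approximate the integral of a rate function over [a, b]
# using n rectangles whose heights are sampled at the subinterval midpoints.
def midpoint_rule(rate, a, b, n):
    width = (b - a) / n
    midpoints = [a + (i + 0.5) * width for i in range(n)]
    return width * sum(rate(m) for m in midpoints)

# Hypothetical stand-in for the graph: growth rate in bees per week.
rate = lambda t: 50 * t  # illustrative only, NOT the textbook's data

# Total population growth over 24 weeks, six subintervals as the exercise dictates.
growth = midpoint_rule(rate, 0, 24, 6)
```

A sketch like this also makes it easy to play with the questions above: swap in 2, 4, or 2400 subintervals and see how much (or how little) the estimate changes.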

I think we who teach calculus and those who write calculus books must do a better job of giving problems to students and not just increasingly complicated exercises. It’s very easy to do so; we just have to give less information and fewer artificial cues to students, and force students to think hard and critically about their tools and how to select the right combination of tools for the job. No doubt, this makes grading harder, but students aren’t going to learn calculus in any real or lasting sense if they don’t grapple with these kinds of problems.



Filed under Calculus, Critical thinking, Math, Problem Solving, Teaching

## The case of the curious boxplots

I just graded my second hour-long assessment for the Calculus class (yes, I do teach other courses besides MATLAB). I break these assessments up into three sections: Concept Knowledge, where students have to reason from verbal, graphical, or numerical information (24/100 points); Computations, where students do basic context-free symbol-crunching (26/100 points); and Problem Solving, consisting of problems that combine conceptual knowledge and computation (50/100 points). Here’s the Assessment itself. (There was a problem with the very last item — the function doesn’t have an inflection point — but we fixed it and students got extra time because of it.)

Unfortunately the students as a whole did quite poorly. The class average was around a 51%. As has been my practice this semester, I turn to data analysis whenever things go really badly to try and find out what might have happened. I made boxplots for each of the three sections and for the overall scores. The red bars inside the boxplots are the averages for each.

I think there’s some very interesting information in here.

The first thing I noticed was how similar the Computations and Problem Solving distributions were. Typically students will do significantly better on Computations than anything else, and the Problem Solving and Concept Knowledge distributions will mirror each other. But this time Computations and Problem Solving appear to be the same.

But then you ask: where’s the median in the boxplots for these two distributions? The median shows up nicely in the first and fourth plots but doesn’t appear in the middle two. Well, it turns out that for Computations, the median and the 75th percentile are equal, while for Problem Solving, the median and the 25th percentile are equal! The middle half of each distribution lies between 40% and 65% on each section, but the Computations middle half is totally top-heavy while the Problem Solving middle half is totally bottom-heavy. Shocking — I guess.
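A median coinciding with a quartile is easy to reproduce with made-up numbers (these scores are invented, not the actual class data): pile the upper half of the middle fifty percent onto a single value and the 50th and 75th percentiles collapse together.

```python
import statistics

# Invented scores illustrating a "top-heavy" middle half: the median
# and the 75th percentile coincide because the upper half of the
# distribution is piled up at one value.
scores = [40, 45, 55, 65, 65, 65, 65]

# statistics.quantiles with n=4 returns the three quartile cut points.
q1, median, q3 = statistics.quantiles(scores, n=4, method="inclusive")
```

Here the boxplot's median bar would sit right on top of the box's upper edge, which is exactly the visual oddity described above.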

So, clearly conceptual knowledge in general — the ability to reason and draw conclusions from non-computational methods — is a huge concern. That over 75% of the class is scoring less than 60% on a fairly routine conceptual problem is unacceptable. Issues with conceptual knowledge carry over to problem solving. Notice that the average on Conceptual Knowledge is roughly equal to the median on Problem Solving. And problem solving is the main purpose of having students take the course in the first place.

Computation was not as much of an issue for these students because they get tons of repetition with it (although it looks like they could use some more) via WeBWorK problems, which are overwhelmingly oriented towards context-free algebraic calculations. But what kind of repetition and supervised practice do they get with conceptual problems? We do a lot of group work, but it’s not graded. There is still a considerable amount of lecturing going on during the class period as well, and there is not an expectation that when I throw out a conceptual question to the class that it is supposed to be answered by everybody. Students do not spend nearly as much time working on conceptual problems and longer-form contextual problems as they do basic, context-free computation problems.

This has got to change in the class, both for right now — so I don’t end up failing 2/3 of my class — and for the future, so the next class will be better equipped to do calculus taught at a college level. I’m talking with the students tomorrow about the short term. As for the long term, two things come to mind that can help.

• Clickers. Derek Bruff mentioned this in a Twitter conversation, and I think he’s right — clickers can elicit serious work on conceptual questions and alert me to how students are doing with these kinds of questions before the assessment hits and it’s too late to do anything proactive about it. I’ve been meaning to take the plunge and start using clickers and this might be the right, um, stimulus for it.
• Inverted classroom. I’m so enthusiastic about how well the inverted classroom model has worked in the MATLAB course that I find myself projecting that model onto everything. But I do think that this model would provide students with the repetition and accountability they need on conceptual work, as well as give me the information about how they’re doing that I need. Set up some podcasts for course lectures for students to listen/watch outside of class; assign WeBWorK to assess the routine computational problems (which would be no change from what we’re doing now); and spend every class doing a graded in-class activity on a conceptual or problem-solving activity. That would take some work and a considerable amount of sales pitching to get students to buy into it, but I think I like what it might become.


## Turning questions into learning

Image by CarbonNYC via Flickr

The hardest thing about teaching the MATLAB course — or any course — is responding to student questions. Notice I do not say “answering” student questions. Answers are not the issue; I’m no MATLAB genius, but I can answer 95% of student questions on the spot. The real issue is whether I should. If my primary task is to teach students habits of mind that translate into lifelong learning — and I earnestly believe that it is — then answers are not always the best thing for students.

I’ve noticed four types of questions that students tend to ask in the MATLAB course, and these carry over pretty seamlessly to my other courses:

1. Informational questions that have nothing to do with the problem they’re working on or the material. Example: When are your office hours? When is this lab due? When is the final exam?
2. Clarifying questions that seek to make sense of the specifications of a problem. Should we use a script M-file or a function M-file here? A FOR loop or a WHILE loop? Do I have to make this plot from the command line or can I use the Plot Tools window?
3. Functional questions that are generally of the form How do I [insert task here]? How do I plot a function in MATLAB? How do I get the plot to be red instead of blue? How do I get this FOR loop to work?
4. Interpretive questions that seek the meaning of syntax, a command, or an error. What does MATLAB mean when it says I should ‘pre-allocate’ this variable? Why are there all these different ways to call the MAX command? Why do I have to use num2str in some situations but not in others?

I’ve tried to list these question types in increasing order of cognitive complexity, although that ordering doesn’t always hold. (Some clarifying questions can get quite complicated, for instance.) Ranking the question types this way points me in the direction of how to respond to them.

Informational questions are easy to answer. I always give the same, direct answer to these: It’s (on the syllabus || in the calendar || printed on your assignment). Students learn pretty quickly that this is the same answer to this kind of question all the time, so they tend to stop asking and just look it up instead, which is precisely what they should be doing.

I’m happy to give straight answers to clarifying questions, although sometimes it’s better not to. For example, if a student team is working on a program that needs a loop, and they want to know if they should use a FOR or a WHILE loop, then the best way to respond is not to tell students what to do but rather to lay out the pros and cons of each approach and let them choose.

It gets very tricky when we get to the last two types.

I let the following basic philosophy guide me: I don’t answer functional questions on labs during lab sessions or on homework while the homework has not yet been turned in. Once the lab or homework is over, then I’ll usually answer directly. But otherwise, my goal is to guide students into turning their functional question into an interpretive question. I do this through a series of Polya-like questions to the students that flows a little something like this:

This is less complicated than it looks. Basically, if a student asks a functional question, I first see if the student’s done what they’re asking before. If so, they go refresh their memories. If not, they look it up in an appropriate help file until they find something that looks like what they want. Then they play with it for a minute or two to get the basic idea. Then, by that point, they either know what they’re supposed to do, or else they have a deeper, more cognitively complex question to ask, not What do I type in to do this? but Why does this work the way it does? In a freshman-level class like this one, any time I can get students to elevate themselves from functional questions to interpretive/clarifying questions, I consider it a win.

What students get out of this process is the ability to move beyond needing the professor to tell them what to do. They become self-feeders. This is important because the professor is not going to be there when they really need this stuff, two or three courses down the line or when they’re out on the job (for which they were hired because they, and nobody else around them, have these kinds of computer skills). They are getting the ability to learn on their own, which I really consider to be the single, primary life skill.

Unfortunately students tend to resist this process. It is not what they are used to. They are used to teachers telling them the answers to their questions, regardless of what kind of question it was, and to them a failure of a teacher to give a straight answer to their questions is tantamount to either incompetence or indifference. So this process requires a constant P.R. effort and constant clarifying about why we do things this way. And that P.R. effort doesn’t always work. I still have students who complain that I don’t answer their questions; who feel belittled when they identify that they’ve seen a command before and are asked to go back and review it; who feel questions are pointless because I’m just going to ask them more questions in return.

These are freshmen, used to a transactional model of education predominant in American high schools. The fact that this model — the teacher tells the students what to do; students follow teacher’s directions; students get good grades — is the predominant one is a serious problem in our schools, but that’s another issue. Whatever the case may be, I am getting these folks in the final four years of their formal schooling (for the most part) and if I don’t get them thinking on their own, they will crash and burn in the real world.

Even if these students go on never to use MATLAB again after graduation but leave with a well-practiced and fluid ability to learn new and complicated things on their own, I consider that the biggest win of all. And it’s a good reason to take the MATLAB course in the first place.


## MATLAB and critical thinking

My apologies for being a little behind the curve on the MATLAB-course-blogging. It’s been a very interesting last couple of weeks in the class, and there’s a lot to catch up on. The issues being brought up in this course that have to do with general thinking and learning are fascinating, deep, and complicated. It’s almost as if the course is becoming only secondarily a course on MATLAB and primarily a course on critical thinking and lifelong learning in a technological context.

This past week’s lab really brought that to the forefront. The lab was all about working with external data sets, and it involved students going to this web site and looking at this data set (XLS, 33 Kb) about electoral vote counts of the various states in the US (and the District of Columbia). One of the tasks asked students to make a scatterplot of the land area of the states versus their electoral vote counts. Once you make that scatterplot, it looks like this:

The reaction of most students to this plot was really surprising. Almost unanimously and without consulting each other, the reaction was: “That can’t be right.” When I’d ask them why not, they would say something like: It looks strange; or, it’s not like scatter plots I’ve done before; or, it just doesn’t look right.

The first instinct of those who felt like they had made a critical error in their plot was to ask me to verify whether or not they had gotten it right. That’s understandable, but it doesn’t go very far because I have a rule that I don’t answer “Is this right?” questions in the lab. (See the instructions in the lab assignment.) Student teams are responsible in the labs for determining by themselves the rightness or wrongness of their work. So it’s time for critical thinking to take center stage — which in this context would refer to using your brain and all available tools and information to self-verify your work. (I wrote about the idea of self-verification here using Wolfram|Alpha.)

Some of the suggestions I gave these teams were:

• Have you checked your plot against the actual data? For example, look at the outliers. Can you find them in the data set itself? And look at the main cluster of data; given a cursory glance through the data set, does it look like most states have a land area less than $10^6$ square miles and an electoral vote count of between 5 and 15?
• Have you tried to create the same scatterplot using different tools? For example, everybody in the class knows Excel (because we teach it in Calculus I); the data are in Excel already, so it would be virtually no work to make a scatterplot in Excel. Have you tried that? If so, does it look like what MATLAB is creating?
• Have you taken a moment just to think about the possible relationship between the variables, and does the shape of the data match your expectations? Probably we don’t really expect much of a relationship at all between the land area of a state and its electoral vote count, even with the outliers trimmed out, so a diffuse cloud of data markers is exactly what we want. If we got some sort of perfectly lined-up string of data points, we should be suspicious this time.
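The first suggestion, checking the plot against the raw data, can itself be carried out programmatically. Here is a sketch using a hypothetical miniature version of the data set; the numbers are illustrative, not the actual spreadsheet values:

```python
# Hypothetical miniature version of the land-area / electoral-vote data.
# Values are illustrative, not the actual spreadsheet's numbers.
states = {
    "Alaska": (663_000, 3),        # (land area in sq mi, electoral votes)
    "Texas": (262_000, 34),
    "California": (156_000, 55),
    "Kansas": (82_000, 6),
    "Rhode Island": (1_000, 4),
}

# Outlier check: which state should sit farthest right on the scatterplot?
biggest = max(states, key=lambda name: states[name][0])

# Cluster check: which states fall in the "typical" window of the plot?
typical = [name for name, (area, votes) in states.items()
           if area < 1e6 and 3 <= votes <= 15]
```

If the state this check names as the outlier doesn’t match the extreme point on the plot, the problem is in the plotting, not the data — and the students can discover that themselves.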

Once you phrase it like this, students pretty quickly gain confidence in their results. But, importantly, most of them have never been put into situations — at least in the classroom — where this sort of thing has been necessary. If critical thinking means anything, it means training yourself to ask questions like this and pursue their answers in an attempt to be your own judge of your work.

I was particularly surprised by the rejection of any scatter plot that doesn’t look like points on the graph of a function. “Authentic instruction” is a term without an operational definition, a lot like the term “critical thinking”, but here I think we may have a clue to what that term means. Students said their scatterplots didn’t “look right”, meaning they didn’t look like what their textbook examples had looked like, i.e. the points didn’t have an overwhelmingly strong correlation despite the existence of a few token outliers. In other words, students are trained by the use of made-up data that “right” means “strong correlation”. So when they encounter data that are very much not correlated, the scatter plot “looks wrong” rather than “looks like there’s not much correlation”. Students are somehow trained to place value judgements on scatter plots, with strong correlation = good and weak correlation = bad. I’m not sure where that perception comes from, but I bet if we gave students real data to work with, it would never take root.


## Resisting the urge to verify

When I am having students work on something, whether it’s homework or something done in class, I’ll get a stream of questions that are variations on:

• Is this right?
• Am I on the right track?
• Can you tell me if I am doing this correctly?

And so on. They want verification. This is perfectly natural and, to some extent, conducive to learning. But I think that we math teachers acquiesce to these kinds of requests far too often, and we continue to verify when we ought to be teaching students how to self-verify.

In the early stages of learning a concept, students need what the machine learning people call training data. They need inputs paired with correct outputs. When asked to calculate the derivative of $5x^4$, students need to know, having done what they feel is correct work, that the answer is $20x^3$. This heads off any major misconception in the formation of the concept being studied. The more complicated the concept, the more training data is useful in forming it.
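A student can even generate training data on their own: a quick central-difference check, sketched below, confirms that $20x^3$ really is the derivative of $5x^4$ at a sample point without consulting any answer key.

```python
# Central-difference check that d/dx [5x^4] = 20x^3 at a sample point.
def numerical_derivative(f, x, h=1e-6):
    # symmetric difference quotient; error shrinks like h^2
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: 5 * x**4
claimed_derivative = lambda x: 20 * x**3

# Compare the claim against the numerical estimate at x = 2.
approx = numerical_derivative(f, 2.0)
exact = claimed_derivative(2.0)   # 20 * 8 = 160.0
```

The two values agree to several decimal places, which is exactly the kind of input-output confirmation a learner needs at this stage.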

But this is in the early stages of learning. Life, on the other hand, does not consist of training data. In real life, students are going to be presented with ambiguous, ill-posed problems that may not even have a single correct answer. Even if there is one, there is no authoritative voice that says definitively that their answer is right or wrong. At least, you’d have to stop and ask how you know that the authority itself is right or wrong.

So as a college professor, working with young men and women most of whom are one step away from being done with formal education, it serves no purpose — and certainly does not help students — to pretend that training, the early stage, goes on forever. At some point I must resist the urge to answer their verifying questions, despite the fact that students take great comfort in having their work verified for them by an external authority, and the fact that teachers are usually perceived by students as better the more frequently they verify.

I’ve started making the training stage and the self-verification stage explicitly distinct in my classroom teaching. In a 50-minute class, I’ll usually break down the time as follows:

I’ll spend the first 20 minutes of class focusing in on one or two main ideas for the class along with some simple exercises, a few of which I’ll do (to help students get the flow of working the exercises and to provide training data not only on the math but also on the notation and explication) and more of which they will do, providing full answers to the “Is this right?” questions along the way. Then five minutes for further Q&A or to wrap up the work.

But then the training phase is over, and students will get more complicated problems (not just exercises) and are told: I will now answer any question you have that involves clarifying the terms of the problem. But I will not answer any question of the form “Is this right?” or provide any guidance on technology use. What I will do instead, if students persist in asking “Is this right?”, is answer their questions with more questions of my own:

• Are the units working out correctly? Are you getting cubic feet for volume, meters per second for velocity, etc.?
• Did you graph the function to see if the roots are really where you say they are?
• Have you seen a problem like this before in the book, your notes, or your homework?
• Does that answer make sense in the context of the problem? Did you get a negative derivative value for a function that is visibly decreasing?
• What did Wolfram|Alpha (or Maple or MATLAB, etc.) say? *
• What do your group-mates think?

And so on. Many of these are merely ripped from the pages of Polya’s How to Solve It, which ought to be required reading of, well, everybody. In other words, in this post-training phase of the class, students must simulate life in the sense that they are relying only on their wits, their tools, their experiences, and their colleagues, and not the back-of-the-book oracle.
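Several of the questions in that list can literally be executed by the student with a few lines of code. For instance, “Did you graph the function to see if the roots are really where you say they are?” has a numerical analogue, sketched here with an illustrative function and candidate roots:

```python
# Self-verifying a claimed root: a genuine root makes f(x) tiny, or at
# least forces a sign change in a small neighborhood around x.
def looks_like_root(f, x, tol=1e-6):
    return abs(f(x)) < tol or f(x - tol) * f(x + tol) < 0

f = lambda x: x**2 - 2   # illustrative; true roots at +/- sqrt(2)

good_claim = looks_like_root(f, 2 ** 0.5)   # the actual root
bad_claim = looks_like_root(f, 1.5)         # a plausible but wrong guess
```

The point isn’t the code itself but the habit: the student runs the check instead of asking “Is this right?”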

Also, by telling students up-front that this is how the classes are going to be structured, they get the idea that there is a time for getting verification and another time for learning how to self-verify, and hopefully they learn that the act (or at least the urge) to self-verify is something like a goal of the course.

My hope here is to provide training data of a different sort — training on how to be independent of training data. This is the only kind of preparation that makes sense for young adults heading for a world without backs of books.

* You could make a good argument that Wolfram|Alpha used in this way is just a very sophisticated “back of the book” — an oracle that students use as an authority. I think there are at least a couple of reasons why W|A is more than that, and I’ll try to address those later. But you can certainly comment about it.


## What is a classical education approach to mathematics?

Following up on his three posts on classical education yesterday, Gene Veith weighs in on mathematics instruction:

I admit that classical education may be lagging in the math department. The new classical schools are doing little with the Quadrivium, the other four liberal arts (arithmetic, geometry, astronomy, and music). The Trivium, which is being implemented to great effect (grammar, logic, and rhetoric), has to do with mastering language and what you can do with it. The Quadrivium has to do with mathematics (yes, even in the way music was taught).

This, I think, is the new frontier for classical educators. Yes, there is Saxon math, but it seems traditional (which is better than the contemporary), rather than classical, as such.

Prof. Veith ends with a call for ideas about what mathematics instruction would look like in a classical education setting. I left this comment:

I think a “classical” approach to teaching math would, going along with the spirit of the other classical education posts yesterday, teach the hypostatic union of content and process — the facts and the methods, yes (and without cutesy gimmicks), but also the processes of logical deduction, analytic problem-solving heuristics, and argumentation. Geometry is a very good place to start and essential to include in any such approach. But I’d also throw in such esoteric topics as number theory and discrete math (counting and graph theory) — in whatever dosage and level is age-appropriate.

At the university level, and maybe at the high school level for kids with a good basic arithmetic background, I’d love to be able to use the book “Essential College Mathematics” by Zwier and Nyhoff as a standard one-year course in mathematics (and in place of the usual year of calculus most such students take). It’s out of print, but the chapters are on sets; cardinal numbers; the integers; logic; axiomatic systems and the mathematical method; groups; rational numbers, real numbers, and fields; analytic geometry of the line and plane; and finally functions, derivatives, and applications. You have to see how the text is written to see why it does a good job with both content and process.

(I took out the mini-rant against the gosh-awful Saxon method.)

Any thoughts from the audience here?


Filed under Education, Liberal arts, Math, Teaching

## A trifecta on classical education

Gene Veith, one of my favorite religious writers and the proprietor of the terrific Cranach blog (and provost at Patrick Henry College), has three quick posts today on classical education. He touches briefly on teaching content rather than process, and how classical education teaches both; on critical thinking; and on learning styles and the teaching of “meaning”. Some clips:

The key factor in learning is grasping meaning, a concept that evades any of these sensory approaches. (While cultivation of meaning is what classical education is all about.)

and:

More substantive scholars say that being able to think critically requires (again, see below) CONTENT. You have to think ABOUT SOMETHING. Whereas much of the critical thinking curriculum is all process, trying to provoke content-free thinking. (The classical solution: DIALECTIC, featuring questions AND answers, as in that great model of classical education, the catechism, which, properly used, helps the student answer the question, “what does this mean?”)

I am pretty sure that Prof. Veith has this overall definition of “classical education” in mind, but I am not sure exactly how he defines it. And I wonder if all of what he says still works if you replace “classical” with the more generic “liberal arts”.


Filed under Education, Higher ed, Liberal arts, Teaching

## Retrospective: Critical thinking, visualization, and physical intuition (10.11.2006)

Editorial: This is the penultimate article in the retrospective series we’ve been doing all week here at CO9s. This one takes us back to 2006 one more time.

One of the things that fascinates me most about teaching math is seeing how people acquire and use problem-solving skills. And one of the things I like to think and write about the most is how people can approach problems in different ways — especially when those ways are not the standard ways of doing so — and why students make various conceptual mistakes when they try.

This article was written after a calculus homework set involving a pretty standard intro problem about the velocity of an arrow shot straight upward on the moon. (Where the **** do we math people get these problem ideas?) I was reading James Gleick’s biography of Richard Feynman at the time and was very keen on how important visualization is in problem solving. I had also been thinking (and posting) about how the ephemeral idea of “critical thinking” really just boils down to having a good intellectual B.S. detector and having the will to question whether your thinking can possibly be right or not. Put all that together, and this article is what you get.

Side note: One of the biggest sources of traffic for this blog comes from people who enter in “velocity of an arrow shot upward on the moon” into a search engine and end up with this posting. For those of you who have found yourselves here by doing so: DO YOUR OWN HOMEWORK.

Critical thinking, visualization, and physical intuition

Originally posted: October 11, 2006

Permalink

Yesterday I posted my belief that “critical thinking” has at least as much to do with intuition as it does with what we normally call “thinking”. Namely, critical thinking has to do with — is activated by — having a sense of when something can’t possibly be right. Question: Where does that sense come from? And importantly, can it be taught? I’ve been reading through Genius: The Life and Science of Richard Feynman by James Gleick and have been struck by Feynman’s reliance upon visualization to make his dramatic contributions to quantum theory and other areas, and it makes me believe there’s a strong connection between visualization and the ability to solve problems that I’ve not heard mentioned often.


Filed under Calculus, Critical thinking, Education, Math