What is a problem




The problem of so-called problems in mathematics – Grant Wiggins

An exercise is a question that tests the student’s mastery of a narrowly focused technique, usually one that was recently ‘covered’. Exercises may be hard or easy but they are never puzzling...the path toward the solution is always apparent. In contrast, a problem is a question that cannot be answered immediately. Problems are often open-ended, paradoxical, and sometimes unsolvable, and require investigation before one can come close to a solution. Problems and problem solving are at the heart of mathematics. Research mathematicians do nothing but open-ended problem solving.


– Paul Zeitz, The Art and Craft of Problem Solving p. ix
To put it simply, you have a problem when you are required to act but don’t know what to do.

– Ian Robertson, Problem Solving


Introduction
Almost every formal goal statement for mathematics programs and courses says that “problem solving” is a key goal. Here are a few recent examples:
from the recent Common Core State Standards Initiative draft, p. 3:

A primary goal of developing these standards is to enable students to achieve mathematical proficiency. Students are expected to understand the knowledge described in the Core Concepts and in the Coherent Understandings at a depth that enables them to reason with that knowledge—to analyze, interpret and evaluate mathematical problems, make deductions, and justify results. The Core Skills are meant to be used strategically and adaptively to solve problems. (available at http://www.corestandards.org/)


from the NJ State Standards (2006):

Standard 4.1: All Students Will Develop The Ability To Pose And Solve Mathematical Problems In Mathematics, Other Disciplines, And Everyday Experiences

Descriptive Statement: Problem posing and problem solving involve examining situations that arise in mathematics and other disciplines and in common experiences, describing these situations mathematically, formulating appropriate mathematical questions, and using a variety of strategies to find solutions. By developing their problem-solving skills, students will come to realize the potential usefulness of mathematics in their lives.
from NCTM Focal Points (2006) p. 10:

Three curriculum focal points are identified and described for each grade level, pre-K–8, along with connections to guide integration of the focal points at that grade level and across grade levels, to form a comprehensive mathematics curriculum. To build students’ strength in the use of mathematical processes, instruction in these content areas should incorporate—

• the use of the mathematics to solve problems;

• an application of logical reasoning to justify procedures and solutions; and

• an involvement in the design and analysis of multiple representations to learn, make connections among, and communicate about the ideas within and outside of mathematics.
The purpose of identifying these grade-level curriculum focal points and connections is to enable students to learn the content in the context of a focused and cohesive curriculum that implements problem solving, reasoning, and critical thinking.

Oddly enough, though many books have been devoted to mathematical problem solving, in them we often get no clear definition of what a real problem is and isn’t. (See, for example, Teaching Mathematics Through Problem Solving: Grades 6–12, NCTM 2003.) The authors presumably assume that it is obvious what a “problem” is.


Alas, as any thorough inspection of middle and high school mathematics classes and tests shows, that presumption is problematic. Most math assessments and assignments involve relatively simple exercises and always have. (See, for example, Archbald & Grant 1999, “What’s on the Test? An Analytical Framework and Findings from an Examination of Teachers' Math Tests” in Educational Assessment, Vol. 6 #4.) Why is that?
This is not a new lament. It long pre-dates the standards and testing movement. Whitehead lamented it 100 years ago. It is visible in private as well as public schools, and in many college courses: it thus cannot just be the fear of external test results. John Goodlad puzzled over this same question more than 25 years ago in his landmark study A Place Called School:
The impression I get from the topics, materials, and tests of the curriculum is of mathematics as a body of fixed facts and skills to be acquired, not as a tool for developing a particular kind of intellectual power in the student.... Interestingly, mathematics teachers somewhat more than teachers in other academic subjects perceived themselves as seeking in their students processes related more to learning how to learn than to merely acquiring mechanics. Many wanted their students to be logical thinkers, to learn how to attack problems and to think for themselves. Why, then, did so few mathematics teachers in our sample appear to get much beyond a relatively rote kind of teaching and textbook dependency? (A Place Called School, pp. 209-210.)

This critique goes even further back. Kant, Hegel, Whitehead and Dewey all bemoaned this tendency in the teaching of math (see for example, Dewey’s long account of real problems vs. pseudo-problems in Democracy and Education). Mathematician David Hilbert, in his famous lecture of 1900 on problem solving in mathematics, noted:


The deep significance of certain problems for the advance of mathematical science in general and the important role which they play in the work of the individual investigator are not to be denied. As long as a branch of science offers an abundance of problems, so long is it alive; a lack of problems foreshadows extinction or the cessation of independent development. Just as every human undertaking pursues certain objects, so also mathematical research requires its problems. It is by the solution of problems that the investigator tests the temper of his steel; he finds new methods and new outlooks, and gains a wider and freer horizon. Bulletin of the American Mathematical Society 8 (1902), 437-479. A reprint appears in Mathematical Developments Arising from Hilbert Problems, edited by Felix Browder, American Mathematical Society, 1976. Available online at: http://aleph0.clarku.edu/~djoyce/hilbert/

Polya famously wrote: “What is know-how in mathematics? The ability to solve problems.” (Polya, Mathematical Discovery, Vol. 1, p. xi). And, as de Lange reports, concerning the history of math reform in the 1960s:


In 1962, some 75 well-known U.S. mathematicians produced a memorandum, “On the Mathematics Curriculum of the High School,” published in the American Mathematical Monthly: To know mathematics means to be able to do mathematics: to use mathematical language with some fluency, to do problems, to criticize arguments, to find proofs, and, what may be the most important activity, to recognize a mathematical concept in, or to extract it from, a given concrete situation (Ahlfors et al. 1962, cited in PISA 2003).

Those of us who have routinely confronted the poverty of mathematics instruction and local assessment in secondary education over decades of reform work have also wondered about this puzzle of a goal at odds with practice. And while it is clearly true that math teachers rely more on textbooks than most other teachers, that fact merely begs the question: if problem solving really is the goal, why aren’t we doing more to achieve it?


The problem of non-problems

So this is a paradox, and it is thus our problem as educators: why are secondary math courses so devoid of real problems if the goal is problem solving and if teachers agree that this is the goal? Once we frame the question as a real problem, we not so coincidentally move the inquiry forward. Surely the distinction between exercises and problems is key: the paradox begins to resolve if teachers conflate problems with exercises, and/or if they understand the distinction but believe (rightly or wrongly) that they must, in the short term, focus almost exclusively on exercises instead of problems.


My aim here is not to provide an exhaustive account of the answer based on extensive research. Rather, my goal is preliminary and conceptual. I am hoping that by framing the problem of non-problems in mathematics classes in this way, we will all be better able to understand the issue of unsatisfactory mathematics performance and see more clearly how to improve it.
Both conditions apply, I think. Based on over three decades in secondary classrooms, in numerous reform projects, and as a consultant to textbook companies, I see and hear that many teachers appear to conflate problems with exercises, and that they believe they “have to” emphasize discrete skill development in their courses regardless of their other desires. Alas, I also think that many teachers incorrectly believe that their job is to develop mere isolated skill ability via exercises. The most common rationalization – “We have to do this because of the state tests!” – also turns out to be ironically incorrect: the opposite is true. As inspection shows, the most difficult test items on state, national, and international tests (as measured by % correct) all involve problems as opposed to exercises, as I will sketch out below. Thus we have another problem: why do teachers think that tests reward only practice in exercises when released tests show this not to be true?
Whatever the answers to the source of the teachers’ views of their job, a key solution involves what we have long called “Backward Design” (see Understanding by Design 2005/1998). At the school level the job of math teacher should be explicitly clarified as aimed toward “real” problem solving, and all courses in middle school and high school should be designed backward from real problems and their solution, not from discrete topics and simple exercises.
What is a problem? A closer look.

Let’s reconsider the foundational question more concretely, based on Zeitz’ distinction: what are real problems for math courses of study and how do they differ from helpful exercises in support of that goal? If a real problem is often “open-ended, paradoxical, and requires investigation,” as Zeitz argues and common sense suggests, then what would such problems be in algebra, geometry, and other courses? I have put this question to hundreds of mathematics teachers; few have good answers.


That inability to quickly answer the question is, itself, a problem! And it offers a clue, I believe: it begins to lend credence to the hunch that many teachers conflate problems with exercises – a hunch borne out by looking at local assessments. In my recent sampling of HS algebra and geometry exams from 5 schools there was not one real problem on them. All of my colleagues who work in many schools around the country report the same paucity of problems.
What are we talking about specifically, then, when we contrast real problems with exercises? Let’s make some common sense distinctions, using the Zeitz criteria:


Problem: A puzzle: not all needed information is provided explicitly. Some “investigation”, inference, logic, and filling in of what is implicit is required. Lots of prior knowledge is tapped and tested.
Everyday example: a crossword puzzle

Exercise: Straightforward: once the right algorithm is found in memory, we should not feel stopped in our tracks by the demand. Having identified the challenge type and recalled the algorithm, there is no uncertainty; we just need to “plug and chug.”
Everyday example: a Jeopardy question

Problem: Paradoxical: on the surface it seems unsolvable or self-contradictory, i.e. it involves seemingly impossible conclusions, logic, or assumptions. More broadly, it is a “non-routine” problem.
e.g. “Take the number 4, and using it five times, write a mathematical expression that will give you the number 55.”

Exercise: An unambiguous prompted request: once we understand the prompt, we should not be confused or puzzled by the demands; it looks routine.
e.g. What number, when added to itself 4 times, equals 55?

Problem: Solution path not apparent: even when I understand the problem statement and the givens have been explicated, I still may not know exactly how to proceed; there are varied plausible approaches; multiple approaches may work; and different approaches may have different strengths and weaknesses.
e.g. “No basic operations with integers work; how do you get an odd-number result out of only even numbers? Would fractions work? Are there other operations besides the four basic ones that I am overlooking?”

Exercise: Solution path apparent: once recall is tapped properly, the method for finding the answer is known or quickly knowable.
e.g. x + x + x + x + x = 55

Problem: Open-ended: once into it, the work may require us to re-frame the problem statement, consider what counts as an answer, and consider that the answer may end up being “well, it depends...”
e.g. “How much does it cost to take a shower?”

Exercise: Convergent: there is a single and unequivocal right answer.
e.g. “Which uses more water: the average bath or the average shower?”

Problem: Requires some investigation: until we mess around with sample numbers or figures, we don’t have any clear sense of how to proceed, or perhaps even of what the problem is really asking.
e.g. How big a warehouse is needed to store a week’s worth of newsprint for the printing presses of a major daily newspaper in a big city?

Exercise: Little or no investigation required: once the problem is cast in complete mathematical terms, we know how to proceed.
e.g. Which has the bigger area: a square with 3-foot sides or an equilateral triangle in which each side is 4 feet long?

REAL PROBLEMS. Before considering what might be real problems vs. exercises in a typical mathematics course, we should say a bit more about genuine problems in one’s life. What is problem solving in the broadest sense? In other words, if I came up to you and said: “I have a problem. Can I discuss it with you?” what would you expect us to consider and to do?


In this case, you would probably expect me to lay out a frustrating situation – say, a struggle in raising my oldest child – in which no approach seems adequate on the surface. I would also likely express some uncertainty about how to weigh the options. Together we would look at many plausible alternatives – none of which would likely seem immediately and unqualifiedly correct. We would consider the pros and cons of different angles and approaches. You would draw upon the most relevant prior knowledge and skill you possess to help me explicate, recognize, and weigh options. And we would explore any key implicit assumptions that I may have snuck in or failed to consider. In short, in the real world a “problem” is just what Robertson says it is: I know I must act, but I don’t know what to do (or even, at first blush, what counts as a wise approach in this situation).
What wouldn’t we do? We would not consult the textbook on parenting for an algorithm. Nor would we refer back to a single formula that we were once taught – e.g. “spare the rod and spoil the child” – and plug it in. Though we would not solve a real problem this way, this is what counts as problem solving in math class.
Let’s take a real-world example of problem solving closer to mathematics: troubleshooting a problem with our car. Many readers have no doubt listened to Car Talk on NPR. Tom and Ray invite listeners to share their problems with cars (and life!). In every case there is a puzzle: a strange noise that occurs under certain conditions, a loss of power under unclear circumstances for no apparent reason, a failure to start on certain days, etc. In every case, the brothers ask pointed clarifying questions in order to better grasp the facts, the salient terms of the problem, and the conditions under which the problem exists, and they propose a variety of possible causes and solutions. Most importantly, they also propose ways of testing their emerging theories with If...Then... conditional reasoning.
Our own problem – the goal of math education is problem solving; teachers agree with the goal; but we rarely see real problems on local assessments and assignments – is also a good example of a real problem. It is indeed paradoxical to think that we somehow unwittingly work against our own goals as teachers, but that is what we seem to do.
The designers of the PISA international assessment, in articulating their design challenge, provide us with a helpful set of conditions for thinking more generally about the design of such problems on assessments:
A problem is not immediately resolvable through the application of some defined process that the student has studied, and probably practised, at school. The problems should present new types of questions requiring the student to work out what to do. This is what causes the item really to be a problem-solving item. Such problems call on individuals to move among different, but sometimes related, representations and to exhibit a certain degree of flexibility in the ways in which they access, manage, evaluate and reflect on information.

© OECD 2004 Problem Solving for Tomorrow’s World – First Measures of Cross-Curricular Competencies from PISA 2003
The pursuit of the problem of non-problems in mathematics can perhaps be addressed by initially looking at illustrative parallel cases where the problem of non-problems doesn’t exist in schools. One need look no further than English, art, or technical courses. Writing a paper, making a logo, or programming a robot are all problem-solving tasks in our sense. All vocational, sports, and performing arts classes require us to use content to solve performance challenges.
The contrast suggests a likely source of our problem: we have historically defined all those non-math academic subjects as performance-focused (where content is the means, not the end) while defining “math” as a list of discrete content to master. Thus, our problem must be in part curricular: math courses are typically designed as a linear march through discrete inputs in isolation instead of “backward” from the output we seek.
The successful two-decade movement in medical and engineering schools in Problem-Based Learning lends further credence to seeing the problem this way: once we define a course as not the content but the performance demands; once we recall that a subject is a “discipline” i.e. a disciplined way of handling challenges, we see how to make the changes necessary to match means with ends in mathematics.
We are thus looking for a course foundation equivalent of the child-rearing, car-trouble-shooting and math-education problems in secondary mathematics assessment and instruction, by which we can focus and ground the curriculum to make it more likely that students encounter real problems on a regular basis and where they graduate from our courses and programs as better problem solvers – by design, not by good fortune or native ability and interest only.
Real pure problems – not just applied math

A cautionary note, to forestall a possible misconception: I am not suggesting that only “real-world” problems immediately relevant to kids’ experiences count as “real problems” in mathematics. There is a limitless number of purely theoretical problems that mathematics students should encounter as part of a good K-12 education (e.g. the kinds of problems often found in math competitions). I am arguing that students rarely see real problems of any kind, applied or pure. They mostly confront simple math exercises, whether the content is “pure” or “applied.”


It is thus a false dichotomy to set up the issue at hand as a debate between “real math” vs. “real-world” problems. Part of our problem is that most so-called “real math problems” presented by teachers are rarely real problems.
As an example of a good pure math problem, one of my favorites asks a simple but curious question about the Pythagorean theorem: We know that a² + b² = c². But let us remind ourselves what that literally means: “In a right triangle, the area of the figure drawn on the hypotenuse (in this case, a square) is equal to the sum of the areas of the squares on the other two legs.” Here is the problem: do the figures whose areas we compare, drawn on the triangle’s sides, have to be squares? Can there be other shapes – triangles, rhombuses, regular pentagons, etc. – that make the Pythagorean Theorem more generally true?
This is a good example of a real problem in pure math. We want to see how far we can go with a conjecture in which the outcome and solution path are unclear, because more power and elegance lie on the other side. What is the more general relationship? How far might the Pythagorean pattern extend? Here, too, there is apparent paradox: figures other than squares? We have to first mess around, try out some lines of inquiry and examples, and make sure we have correctly grasped the givens and the problem before we determine how to proceed. Here, too, we have to develop potential solution paths and defensible generalizations, not recall and use a prior template unthinkingly. By contrast, in an exercise, all we need to do is note that the Pythagorean relationship applies, and then do the one or two calculations needed to find the unequivocal answer.
Spoiler alert: don’t read the next two sentences if you haven’t done this problem and want to work it out for yourself. The exciting moment comes – if it comes – when the student breaks the self-imposed barrier of investigating only figures with straight lines. The theorem works, for example, if we use semi-circles instead of squares.
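The semicircle result is an instance of a general fact: for any family of mutually similar figures, area is proportional to the square of a chosen side, so the Pythagorean sum carries over unchanged. A minimal numeric sketch of this (the helper names and the 3-4-5 triangle are my choices, not the author’s):

```python
import math

# For a family of similar figures, area = k * (side length)^2 for some
# constant k, so a^2 + b^2 = c^2 implies k*a^2 + k*b^2 = k*c^2.
# Check numerically on a 3-4-5 right triangle.

def semicircle_area(d):
    # semicircle drawn on a side of length d (side = diameter)
    return math.pi * (d / 2) ** 2 / 2

def equilateral_area(s):
    # equilateral triangle drawn on a side of length s
    return math.sqrt(3) / 4 * s ** 2

a, b, c = 3, 4, 5
for area in (semicircle_area, equilateral_area):
    assert math.isclose(area(a) + area(b), area(c))
print("Pythagorean relation holds for semicircles and equilateral triangles")
```

The same check passes for any shape whose area scales with the square of the side, which is the generalization the problem invites students to discover.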
I picked the Car Talk radio show as a resource earlier for another reason. Coincidentally, Car Talk’s most notable feature beyond the interchange with listeners about car woes is the weekly Puzzler. And many of those puzzlers are pure-math-related (as befits alumni of MIT). Here is a recent one that builds off the example I gave above in the chart:
RAY: This was from my number 4 series. There's an old puzzler from long, long ago, and it went like this: Take the number 4, and using it five times, write a mathematical expression that will give you the number 55. Now it's pretty challenging because 4's an even number and 5's an odd number. How do you get 55 out of five 4's? And the answer is 44 + 44/4.
Jim Adams suggested something a little more challenging. He says, 'Take the number 4, use it 3 times, and write a mathematical expression that will give you the number 55.'
Now you can use any of the mathematical terms that are usually used in writing mathematical expressions: plus sign, minus sign, multiplication, division, square root -- you can't square anything because you'd be introducing the number 2 then. You can only use 4s. Your expression could have any of the symbols. You could have 4 to the 4th, you could have 4 factorial, you could have 4.4. You can use anything that you'd see in a mathematical expression as fair game.
What's the equation?

http://www.cartalk.com/content/puzzler/transcripts/200945/index.html
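For readers who want to poke at the quoted five-4s answer, a small brute-force sketch (the `reachable` helper, and the restriction to +, -, *, / plus digit concatenation, are my assumptions for illustration, not Car Talk’s full rules) confirms that 44 + 44/4 works, and also shows why the puzzle feels paradoxical with elementary operations alone:

```python
from fractions import Fraction

# Which exact values are reachable from n fours using + - * /
# and digit concatenation (4, 44, 444, ...)? Exhaustive search
# over all ways to split the n fours into two sub-expressions.
_cache = {}

def reachable(n):
    if n in _cache:
        return _cache[n]
    vals = {Fraction("4" * n)}  # concatenated literal: 4, 44, 444, ...
    for k in range(1, n):
        for a in reachable(k):
            for b in reachable(n - k):
                vals.update({a + b, a - b, a * b})
                if b != 0:
                    vals.add(a / b)
    _cache[n] = vals
    return vals

print(Fraction(55) in reachable(5))  # True: e.g. 44 + 44/4
print(Fraction(55) in reachable(3))  # False: three 4s need richer operations
```

The second line of output hints at why Jim Adams’s three-4s version is harder: with only the four basic operations and concatenation, 55 is out of reach, which is exactly why the puzzle opens the door to factorials, roots, and the like.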

In short, we will get more problem solvers the more secondary math courses are framed around real problems, pure and applied; in which it is clear to students and parents as well as teachers and supervisors, that this is the aim of the courses.

The ironic excuse of standardized tests, if problem solving is our goal.

I know from painful experience that this reasoning often falls on deaf ears. “Grant, this is all well and good, but the state/provincial/national tests demand that we focus on algorithms and facts. There is no time for real problem solving and it won’t pay off on test day.” For a moment, let’s leave aside the question of the truth of this claim and simply note the consequences of the belief for our problem. If teachers believe, rightly or wrongly, that only exercises pay off on tests, then we have a clear and plausible explanation for the absence of problems in math class. Because whether or not teachers conflate problems with exercises, if this claim is universally believed, it explains the absence.


Ironically, this is an example of a “plausible claim” that I think turns out to be incorrect, despite conventional wisdom. Yes, I know: you are sure I am wrong and that state tests are the problem. But hear me out. Perhaps your reaction is like that of the solver whose first reaction to the problem of the four 4s is that it can’t be solved.
As I have elsewhere argued (“Time to Stop Bashing the Tests,” in the March 2010 issue of Educational Leadership), a look at released test items shows that there are numerous higher-order questions, and that disappointingly low performance on inferential and transfer questions is the norm. (We look at some items below.) Can that be? Is the student actually confronted with far more real problems on external tests than on home-grown assessments, contrary to what the lore would have us believe? A look at the released tests from states that release most of their tests and provide item analysis for each question suggests that the paradoxical answer is “yes.”
Before we look at the items, consider the conditions under which standardized tests are given, compared to those under which teachers give quizzes and tests in school. On a state test, students lack all the usual context clues: what we just covered, what the teacher said would be on the test, etc. are all missing. On standardized tests, many of the items have been deliberately designed to look somewhat unfamiliar. In-class tests rarely put students in those conditions. Another critical issue is the art of writing good distractors: on all state and national tests, one or more of the distractors has been carefully chosen to be highly plausible. Overall, then, any standardized test, regardless of content, is itself actually a problem situation!
Consider these questions that confront the students as they look through the test booklet:


  • Where in all our textbook topics and chapters is this question from?

  • Did we study this topic?

  • What does this remind me of? (One of Polya’s great heuristic questions). The way some of these questions are framed is unfamiliar to me.

  • What do I do when I am stuck? I don’t have a teacher or textbook to turn to, and I can’t talk to anyone else.

Now, imagine if students have only confronted exercises that look very similar to the work just covered in class. When an item is both out of context and lacking complete scaffolding or simplifying directions and hints, the test question is far more demanding than a plug-and-chug exercise.


So, how do our students do on the real problems in standardized tests? Poorly. Here are three simple illustrations from different tests. For the sake of clarity I have chosen items involving the Pythagorean theorem. Why? Because every student has encountered the content, multiple times; almost no one fails to answer the question: “What does the Pythagorean Theorem state?” However, when called upon to solve relatively easy problems involving the theorem on state tests, where the illustrations provided are likely to be somewhat unfamiliar (without being unfairly alien) and with the key demands implicit, students fare pretty badly. Students, in other words, are poor at inferring when the Pythagorean Theorem is or might be involved, and then planning and solving accordingly:
Ohio 8th grade:

[Ohio test item screenshot not reproduced]
Florida 9th grade:

[Florida test item screenshot not reproduced]
Massachusetts 10th grade:
On the coordinate plane, what is the distance between the points (3, 4) and (11, 10)? You may want to use the grid below to answer the question.

A. 10* – 33%

B. 7 – 35%

C. 5 – 7%

D. 14 – 24%

(* = correct answer)
The last one should be particularly irksome to teachers. A tenth grader has been taught both algebraic and geometric approaches to such problems. Even if you forget the Distance Formula, that formula derives from the Pythagorean Theorem, which by itself lets you solve the problem. (Note, too, that when this blank grid is altered to look more familiar, i.e. with the x- and y-axes drawn in, student performance greatly improves!)
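The arithmetic behind the starred answer on the Massachusetts item takes only a few lines to confirm (a sketch; the variable names are mine):

```python
import math

# The MCAS item: distance between (3, 4) and (11, 10).
# The legs of the implied right triangle are the coordinate differences.
dx, dy = 11 - 3, 10 - 4           # legs: 8 and 6
d = math.sqrt(dx ** 2 + dy ** 2)  # Pythagorean Theorem = the Distance Formula
print(d)  # 10.0, answer choice A
```

Recognizing the hidden 6-8-10 right triangle is the whole problem; the calculation itself is trivial once the inference is made.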
Worse, on almost all math tests, the test booklet for students contains a reference sheet in the front or back. Here is the one from Florida:
[Florida reference sheet not reproduced]

When we review an entire released test (as Massachusetts allowed up until this year), what do we find? In looking at the Spring 2009 10th-grade mathematics test, I coded the 42 items as 11 problems versus 31 exercises; when in doubt, I erred on the side of “exercise.” The results on the 11 problems were decidedly worse than on the exercises: on average only about 53% of student answers were correct, compared to a mean score of 71% on the exercises. Noteworthy, too, was that the hardest problems were multiple-choice, not the short constructed-response word problems, as lore would have it. Many teachers may thus be incorrectly blaming external tests for their own habit of sticking to low-level exercises, instead of responding to the fact that the most challenging questions on external tests are all problems. (To find this and other specific tests and results, go to: http://www.doe.mass.edu/mcas/search/)


Here were two problems, with % correct:
[screenshots not reproduced: item #24 (percent increase) and item #8 (3 vertices)]
Here were two exercises, with % correct:

[screenshots not reproduced: item #9 (mean) and item #15 (parallelogram perimeter)]
The problems don’t even qualify as challenging problems, given our earlier criteria. Some investigation, though not much, and a multi-step setup based on a few inferences are required. Then the student must judge which algorithm is involved, though by that point it is expected to be fairly obvious. So there aren’t even highly puzzling problems on the MCAS test (compared, say, to the ones on TIMSS and PISA). But the problems were hard enough to stump almost half of Massachusetts 10th graders – the state with the highest performance levels in the US, according to NAEP results.
Our problem of non-problems in math class is thus pressing. Math teachers want to avoid poor performance on tests because of their lofty goal – developing problem solvers – and also for practical extrinsic reasons – doing well on the test. But external tests contain a number of problems for which students are clearly not prepared. Surely, then, there is a set of unhelpful beliefs or a lack of understanding influencing math teachers; clearly, local courses and (especially) local assessments need an overhaul to ensure a steady dose of real problems. At the very least, secondary math teachers might want to explore using something like the ratio of exercises to problems I found: about a quarter of their own test items should involve real problems. And half of those should be highly challenging, if the goal is problem solving and not mere plug-and-chug facility. The first three levels of Webb’s Depth of Knowledge, discussed further below, can serve as an operational approach to such design, self-assessment, and peer review of one’s tests.
(Here, too, is a nice large-scale research study: in Florida, Massachusetts, Ohio and other states where items are released at this level of detail, are we seeing more genuine problem solving-related instruction and assessment locally over time? Or are teachers still feeding kids a steady dose of exercises in spite of what the tests clearly demand? The answer would go a long way towards clarifying and addressing our national problem of poor math performance.)
Regardless of what teachers are doing and with what rationale, a quick look at textbooks in common use reveals another part of the problem. Most textbooks offer few problems in their unit and chapter assessments. So, if teachers draw their course plans and unit assessments exclusively from the book (as most do), local assessment will end up as poor preparation for standards-based state tests with problems.
What are the typical kinds of Pythagorean “problems” (i.e. exercises) found in most HS geometry textbooks? Here are three, all of which appear right after instruction and practice exercises on the Pythagorean Theorem:


  1. How far is the sailboat from the lighthouse, to the nearest kilometer?

screen shot 2010-05-07 at 5.43.34 pm.png

2. Find the perimeter and area of this triangle:



screen shot 2010-05-07 at 5.46.09 pm.png
3. Find the length of a diagonal brace for a rectangular gate that is 5 feet by 4 feet. Round to the nearest tenth.
screen shot 2010-05-07 at 5.49.27 pm.png
No wonder, given such instruction, that students struggle with even the simplest applications of the Pythagorean theorem.
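To see just how thin such tasks are, note that all three reduce to a single application of the same formula. Here is a minimal sketch (the function name is ours, not the textbook’s):

```python
import math

# Each textbook "problem" above is one call to the same one-step procedure:
# c = sqrt(a^2 + b^2). That single, obvious step is what makes it an exercise.
def hypotenuse(a, b):
    return math.sqrt(a * a + b * b)

# Exercise 3: the diagonal brace of a 5 ft by 4 ft gate, to the nearest tenth.
diagonal = round(hypotenuse(5, 4), 1)  # 6.4 feet
```

The sailboat and perimeter items are the same single step with different numbers – there is nothing to investigate, no judgment about which tool applies.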

What is a math problem? Some examples

So, what must we see more of? Let’s look at real problems in secondary mathematics. The first three problems sketched below involve applied mathematics and the remainder involve pure or mixed mathematics:




  1. How much skin does an average sized person have?

  2. How much available landfill volume is needed to handle the waste generated each year by our school?

  3. What’s the fairest way to rank order teams where many don’t directly play one another (e.g. national college basketball during the season)?

  4. Among grandfather’s papers a bill was found: 72 turkeys $_67.9_ The first and last digits of the number that obviously represented the total price of those fowls are replaced here by blanks, for they have faded and are now illegible. What are the two faded digits, and what was the price of one turkey?

  5. The length of the perimeter of a right triangle is 60 inches and the length of the altitude perpendicular to the hypotenuse is 12 inches. Find the lengths of the sides of the triangle.

  6. Find three consecutive odd numbers whose sum is 627.

  7. A train is leaving in 11 minutes and you are one mile from the station. Assuming you can walk at 4 mph and run at 8 mph, how much time can you afford to walk before you must begin to run in order to catch the train?

  8. Pick any number. Add 4 to it and then double your answer. Now subtract 6 from that result and divide your new answer by 2. Write down your answer. Repeat these steps with another number. Continue with a few more numbers, comparing your final answer with your original number. Is there a pattern to your answers? Can you prove it?

You might not love all eight of these, but they surely fit the criteria reasonably well. The solution path is neither stated nor painfully obvious; there is a bit of a puzzle in each one (usually in terms of implicit assumptions and unobvious solution paths); some seem unsolvable at first glance; and the solution depends on some mucking around as well as on developing and carefully testing a strategy.
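To make the contrast with textbook exercises concrete, here is a rough sketch – emphatically not the intended classroom experience, and with helper names of our own – of how investigation cracks three of these problems. Each one requires a modeling decision before any computation:

```python
from fractions import Fraction

# Problem 4: the total, in cents, reads _679_ with the first and last digits
# faded; it must be divisible by 72 (72 turkeys at a whole number of cents each).
candidates = [
    (d1, d2)
    for d1 in range(1, 10)
    for d2 in range(10)
    if int(f"{d1}679{d2}") % 72 == 0
]
# Only (3, 2) survives: the bill was $367.92, or $5.11 per turkey.

# Problem 7: walk w hours at 4 mph, then run the remaining (1 - 4w) miles
# at 8 mph; total time is 11 minutes, i.e. w + (1 - 4w)/8 = 11/60.
w = Fraction(7, 60)  # solving the linear equation gives w = 7/60 hour
assert w + (1 - 4 * w) / Fraction(8) == Fraction(11, 60)
walk_minutes = w * 60  # you can afford 7 minutes of walking

# Problem 8: the trick always lands one above the starting number,
# since ((x + 4) * 2 - 6) / 2 simplifies to x + 1.
def trick(x):
    return ((x + 4) * 2 - 6) // 2

assert all(trick(x) == x + 1 for x in range(-100, 101))
```

In each case the work lies in framing – finding the divisibility constraints, naming a variable for walking time, simplifying the expression – which is exactly the judgment that exercises never demand.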


The middle two examples come from the famous Stanford University Competitive Mathematics Examination for high school students, developed by George Polya, the famed writer on heuristics (The Stanford Mathematics Problem Book, Polya & Kilpatrick, Dover, 1974).
The last three problems are noteworthy for different reasons. They are excerpted from the published problem sets given to all 9th grade math students at Phillips Exeter Academy, where math class is entirely problem based. Students are given these problem sets each week, and homework consists of being prepared to offer one’s approach and solutions (or difficulties) in class the next day. In short, Exeter (arguably one of the best schools in the United States) takes it as a given that the point of math class is to learn to solve problems. Content lessons often follow from the attempts to solve the problems rather than always preceding them.
Their detailed departmental mission statement makes their aim and methods clear:
The goal of the Mathematics Department is that all of our students understand and appreciate the mathematics they are studying; that they can read it, write it, explore it, and communicate it with confidence; and that they will be able to use mathematics as they need to in their lives.

We believe that problem solving (investigating, conjecturing, predicting, analyzing, and verifying), followed by a well-reasoned presentation of results, is central to the process of learning mathematics, and that this learning happens most effectively in a cooperative, student-centered classroom.

Our intention is to have students assume responsibility for the mathematics they explore—to understand theorems that are developed, to be able to use techniques appropriately, to know how to test results for reasonability, to learn to use technology appropriately, and to welcome new challenges whose outcomes are unknown.

To implement this educational philosophy, members of the PEA Mathematics Department have composed problems for nearly every course that we offer. The problems require that students read carefully, as all pertinent information is contained within the text of the problems themselves—there is no external annotation. The resulting curriculum is problem-centered rather than topic-centered. The purpose of this format is to have students continually encounter mathematics set in meaningful contexts, enabling them to draw, and then verify, their own conclusions.

As in most Academy classes, mathematics is studied seminar-style. This pedagogy demands that students be active contributors in class each day; they are expected to ask questions, to share their results with their classmates, and to be prime movers of each day’s investigations. The benefit of such participation in the students’ study of mathematics is an enhanced ability to ask effective questions, to answer fellow students’ inquiries, and to critically assess and present their own work. The goal is that the students, not the teacher or a textbook, be the source of mathematical knowledge. http://www.exeter.edu/academics/84_801.aspx

Is it the teacher’s fault?

Again, our problem: why is an approach like this so rare in typical classrooms? Once you see what Exeter is doing, you cannot help but wonder why it is so atypical – especially since this is what a student can expect in college math and science courses. The question, and the college-prep context, suggest another disturbing possibility: many typical math teachers have never done mathematical problem solving at a high and consistent level in college or in prior work experience. Most math teachers know only what is in typical textbooks. By contrast, half of Exeter’s faculty have advanced degrees in mathematics. The issue of adequate technical background must be faced squarely and unapologetically by mathematics supervisors and writers of curriculum. Merely writing curricula out of our own backgrounds may not be enough to meet the standards and the need.


A workable solution becomes clearer. Regardless of teacher background, we can and must write curriculum differently. Most local curricula (and hence tests) – even in the better public and private schools – are written “backward” from topics, not backward from problems that demand topical knowledge, as in the Exeter case. This is the essential lesson of Understanding by Design (UbD) and “backward design.” Once this understanding of what curriculum needs to be is put forward as a design challenge, our experience in UbD suggests that a team of teachers working under this directive can come up with better assessments and units of study through the shared experience and talent of the group. Hundreds of such problems can be found in the many problem-related and math-competition-related books available (see, for example, Polya and Kilpatrick, Posamentier & Salkind, Kordemsky, Batterson, Steinhaus, and Field). Conversely, allowing teachers to plan, teach, and test in isolation is a recipe for continued poor mathematics performance.
The problem of problem-less math courses can be traced to an important unexamined “given” that sneaks into almost all local curriculum design (as well as into the writing of standards and textbooks): we postulate without thinking that a curriculum should be framed as an analytical list of topics, setting up a march through the content. This is a faulty premise. If our goal is complex student performance, then we have to design backward from complex performance goals – here, problem solving. Designing courses backward from discrete topics and discrete exercises can never develop the ability to choose from and use a repertoire on a problem that cuts across numerous topics, as most real problems demand.
In other words, in a true problem-solving-based curriculum, the student must always be required to judge: “Hmm, what is this problem about? It isn’t at all clear which of my prior learning is involved here; I’ll need to test out some lines of inquiry.” Until and unless mathematics assessments are built to require such higher-order thought, and are constructed before curricular frameworks, we should not be surprised by frustratingly inadequate student achievement on challenging tests. (See Chapters 3-5 in Schooling by Design for more on designing curricula backward from “cornerstone” tasks and related rubrics.)

Problem solving and transfer

The problem of our students not “doing” mathematics often enough – and thus not doing it well enough – can be cast more vividly in athletic terms, which makes clear that this is not about process over content but about using content to solve problems. The typical math experience never lets students play the “game” of math; they are stuck on the sidelines doing simplistic “drills” day after day. They are learning content but not learning how to transfer their prior learning to new, unfamiliar problems. (See Wiggins on Quantitative Literacy at http://www.maa.org/SAUM/articles/wigginsbiotwocol.htm)


A true story from my soccer coaching days illustrates that the underlying issue is a failure of transfer. We had been working hard in practice on drills related to creating space to make ball advancement and scoring threats more likely. But in the next game, none of what we had worked on was being transferred. I grew frustrated, especially at my captain, and yelled: “Liz!! All the things we worked on all week!” She yelled back, in the middle of the game: “I would, Mr. Wiggins, but the other team isn’t lining up the way we did the drills!”
That’s the problem of math education in a nutshell. Many math teachers and most students incorrectly think that ability develops through repetition of skills in simplified and isolated exercises until a set of discrete skills is automatic on the most typical exercises. Though necessary, skill automaticity is not sufficient to develop problem-solving ability – transfer of learning – in novel future contexts. Perhaps if math teachers were to grasp more fully that the goal of “problem solving” is really a subset of the goal of transfer, they would be more likely to see that their exercises are insufficient to cause transfer of learning. Anecdotally, I can report that this is a highly useful construct. Once I frame teaching as the challenge of teaching for transfer, most teachers instantly see that their most frustrating experiences involve students’ failure to transfer their learning. And they more easily see that the most typical instructional approaches will not yield that transfer.
We should note that this unending regimen of mere sideline practice of exercises not only fails to prepare students for real problem solving and higher-level courses; it also greatly reduces the likelihood of engagement. How many soccer or basketball players would do years of drills without ever being allowed to play the game until some arbitrary standard of ability was met? Is it any wonder, then, that so many students dislike mathematics? (See “Lockhart’s Lament” and “Why is Mathematics So Boring?”)

http://www.maa.org/devlin/LockhartsLament.pdf

http://golem.ph.utexas.edu/category/2007/04/why_mathematics_is_boring.html

http://www.telegraph.co.uk/education/3136861/If-maths-is-boring-what-is-the-answer.html


Transfer is notoriously difficult to achieve, as researchers from Thorndike to the present have reminded us (see especially the chapter on transfer in How People Learn, National Academy of Sciences, 2001). What we do know is that set exercises alone never yield success; they are necessary but not sufficient. In fact, learning how to do things only one way undercuts transfer, as the research makes clear. Successful transfer requires practice, with increased responsibility over time, in trying to apply prior learning to varied and novel situations. (Recall that this was Bloom’s definition of application.) The aim of school is not to get good at school.
How is transfer best approached as an educational goal? The following summary comes from How People Learn, from the National Academy of Sciences:
Research has indicated that transfer across contexts is especially difficult when a subject is taught only in a single context rather than in multiple contexts. When a subject is taught in multiple contexts, and includes examples that demonstrate wide application of what is being taught, people are more likely to abstract the relevant features of concepts and to develop a flexible representation of knowledge.
Varying the conditions under which learning takes place makes learning harder for learners but results in better learning. Like practice at retrieval, varied learning conditions pay high dividends for the effort exerted. In the jargon of cognitive psychology, when learning occurs under varied conditions, key ideas have "multiple retrieval cues" and thus are more "available" in memory.
Students develop flexible understanding of when, where, why, and how to use their knowledge to solve new problems if they [are instructed in] how to extract underlying themes and principles from their learning exercises.
Transfer is also enhanced by instruction that helps students represent problems at higher levels of abstraction. Helping students represent their solution strategies at a more general level can help them increase the probability of positive transfer and decrease the degree to which a previous solution strategy is used inappropriately.
That’s why, in games, coaches always require scrimmages – practice games, i.e., practice in soccer problem solving and transfer – and help players deliberately reflect upon and learn from the feedback of trying to use knowledge in the game. That’s also why there is a season of games: validity and reliability alike in soccer (as in any field) require playing the real game over time, with great consistency and competence, if real achievement is to be sought, measured, and achieved. By contrast, naïve coaches expect the drills to transfer smoothly to the game – with predictably disappointing results (as soccer experience and the entire transfer literature in education reveal).
The Exeter approach can thus be understood as doing what scrimmages do in athletics: each of the many problem sets (which notably recycle types of problems in different guises over time) is practice in transfer and on-field problem solving, in which players not only use a repertoire but develop the self-assessment skills needed for later tests of problem-solving ability.
In short, assessment in mathematics as in any field should be forward looking, not a look back at what was just covered. As the PISA program puts it:
PISA seeks to measure how well young adults, at age 15 – and therefore approaching the end of compulsory schooling – are prepared to meet the challenges of today’s knowledge societies. The assessment is forward-looking, focusing on young people’s ability to use their knowledge and skills to meet real-life challenges, rather than just examining the extent to which they have mastered a specific school curriculum. This orientation reflects a change in the goals and objectives of curricula themselves, which are increasingly concerned with how students use what they learn at school, and not merely whether they can reproduce what they have learned.

PISA: p. 12 2003


A preliminary solution: model problems, model rubrics, and annotated examples of solutions

How might we begin to map out what a truly problem-based approach to learning mathematics would require of us – with no sacrifice of core content? The varied examples and analogies suggest a framework. We will need a full set of problems of related content and increasing difficulty that embody national standards, and a set of rubrics for assessing such problem solving. In addition, we will need a set of annotated problem solutions in which a full range of solutions is analyzed explicitly (as happens with the best released items from state and national tests).


The materials developed by PISA and Webb on Depth of Knowledge offer a very helpful start on the rubric front. The PISA material also maps out the broad-brush elements for design of real problems:

[T]he tasks included in the assessment were selected to collect evidence of students’ knowledge and skills associated with the problem-solving process. In particular, students had to demonstrate that they could:

Understand the problem: This included understanding text, diagrams, formulas or tabular information and drawing inferences from them; relating information from various sources; demonstrating understanding of relevant concepts; and using information from students’ background knowledge to understand the information given.

Characterise the problem: This included identifying the variables in the problem and noting their interrelationships; making decisions about which variables are relevant and irrelevant; constructing hypotheses; and retrieving, organising, considering and critically evaluating contextual information.

Represent the problem: This included constructing tabular, graphical, symbolic or verbal representations; applying a given external representation to the solution of the problem; and shifting between representational formats.

Solve the problem: This included making decisions (in the case of decision making); analysing a system or designing a system to meet certain goals (in the case of system analysis and design); and diagnosing and proposing a solution (in the case of trouble shooting).

Reflect on the solution: This included examining solutions and looking for additional information or clarification; evaluating solutions from different perspectives in an attempt to restructure the solutions and making them more socially or technically acceptable; and justifying solutions.

Communicate the problem solution: This included selecting appropriate media and representations to express and to communicate solutions to an outside audience. (pp. 26–27, The PISA 2003 Assessment Framework: Mathematics, Reading, Science and Problem Solving Knowledge and Skills, OECD, 2003b)



© OECD 2004 Problem Solving for Tomorrow’s World – First Measures of Cross-Curricular Competencies from PISA 2003


Earlier, I suggested that we could use the Webb rubrics as a basis for designing and troubleshooting local mathematics assessments. (Note that many state assessments now not only use Webb rubrics but also code their test questions and standards in terms of these levels.) If we follow my rough rule that 25% of any local exam or test should consist of problems in addition to exercises, and that half of those problems should be “challenging,” then the first three levels of the Webb rubrics for mathematics show how this could be operationalized:



Mathematics depth-of-knowledge levels
Level 1 (Recall) includes the recall of information such as a fact, definition, term, or a simple procedure, as well as performing a simple algorithm or applying a formula. That is, in mathematics a one-step, well-defined, and straight algorithmic procedure should be included at this lowest level. Other key words that signify a Level 1 include “identify,” “recall,” “recognize,” “use,” and “measure.” Verbs such as “describe” and “explain” could be classified at different levels depending on what is to be described and explained.
Level 2 (Skill/Concept) includes the engagement of some mental processing beyond a habitual response. A Level 2 assessment item requires students to make some decisions as to how to approach the problem or activity, whereas Level 1 requires students to demonstrate a rote response, perform a well-known algorithm, follow a set procedure (like a recipe), or perform a clearly defined series of steps...
Level 3 (Strategic Thinking) requires reasoning, planning, using evidence, and a higher level of thinking than the previous two levels. In most instances, requiring students to explain their thinking is a Level 3. Activities that require students to make conjectures are also at this level. The cognitive demands at Level 3 are complex and abstract. The complexity does not result from the fact that there are multiple answers, a possibility for both Levels 1 and 2, but because the task requires more demanding reasoning. An activity, however, that has more than one possible answer and requires students to justify the response they give would most likely be a Level 3. Other Level 3 activities include drawing conclusions from observations; citing evidence and developing a logical argument for concepts; explaining phenomena in terms of concepts; and using concepts to solve problems.

from “Depth-of-Knowledge Levels for Four Content Areas”


March 28, 2002

http://facstaff.wcer.wisc.edu/normw/



We can do better; we must do better. We have a problem; let’s solve it.


1 Taken from a “puzzler” on the NPR show Car Talk, and discussed further below.
