Houston, We Have A (Testing) Problem

So I flip on the television last weekend with the intention of watching a football game, and immediately get sidetracked by Apollo 13. What a great movie. My favorite part is the scene in which mission control is beginning to worry about rising carbon dioxide levels. After the oxygen tank explosion, the crew has been forced to abandon the command module and use the lunar module (LM) as a lifeboat. Unfortunately, the LM consumables weren’t intended to sustain three people for four days. Normally, lithium hydroxide (LiOH) filters would absorb the carbon dioxide from the air and prevent it from reaching dangerous levels, but the canisters onboard the LM couldn’t keep up. The command module had more than enough spare LiOH canisters onboard, but those canisters were square and couldn’t fit into the holes intended for the lunar module’s round canisters. This problem led to NASA Flight Director Gene Kranz’s classically sarcastic quote, “Tell me this isn’t a government operation.” Kranz then proceeds to tell his engineers, “I suggest you gentlemen invent a way to put a square peg in a round hole…rapidly.”

As we all know, the NASA engineers did find a way to put a square peg in a round hole. It’s become an iconic example of American ingenuity. Every time I see that scene, I ask myself the same question: are we developing the kind of students who can find a way to put a square peg in a round hole? The truth is…we aren’t. And here’s why.

Accountability systems drive instructional practices. As much as we’d all like to believe that we aren’t teaching to the test, we are. When teacher and principal evaluation is tied to student performance on a specific test, and teachers and principals can lose their jobs because students fail to perform well on that test, you can bet instruction will be customized to improve performance on that test. In theory, that’s a great concept. But when your accountability measures (high-stakes tests) only require the most basic of cognitive skills, instruction will become tailored to those skills. And that’s the kind of student you will produce: students with basic cognitive skills. Let me give you a concrete example.

Below is a sample question from the ACT. The ACT is designed to assess students’ academic readiness for college. 11th grade students across the country take this test each year (remember that I said 11th grade students). Wyoming is in the process of adopting it as its measure of student performance for high schools. It’s widely respected, and scores from this test are used by a large percentage of colleges in the U.S. as an entrance standard. Questions on this test are also typical of the types of questions you will find on accountability measures being used around the U.S. to demonstrate student knowledge and skill.

For i = the square root of -1, if 3i(2 + 5i) = x + 6i, then x = ?
A.  –15
B.  5
C.  5i
D.  15i
E.  27i

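For anyone who wants to check the arithmetic, here’s a quick illustrative sketch in Python, which writes the imaginary unit as j rather than i:

```python
# Distribute: 3i(2 + 5i) = 6i + 15i^2 = -15 + 6i
product = 3j * (2 + 5j)   # Python's built-in complex type
x = product.real          # matching real parts gives x
print(product, x)         # prints: (-15+6j) -15.0, so the answer is A
```
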
To answer the question above correctly, a student must know the relevant mathematics concept, apply it, and then correctly choose from the supplied answers. On Bloom’s Taxonomy of cognitive skills, this problem requires students to Remember, Understand, and Apply: the three lowest levels of thinking skills. Now let’s take a look at a sample question from the PISA.

A result of global warming is that the ice of some glaciers is melting. Twelve years after the ice disappears, tiny plants, called lichen, start to grow on the rocks. Each lichen grows approximately in the shape of a circle. The relationship between the diameter of this circle and the age of the lichen can be approximated with the formula:

d = 7.0 × √(t − 12), for t ≥ 12

where d represents the diameter of the lichen in millimetres, and t represents the number of years after the ice has disappeared. Using the formula, calculate the diameter of the lichen, 16 years after the ice disappeared. Show your calculation.

Notice anything different? This is the kind of question that other countries give to 15-YEAR-OLDS! The PISA is also one of the tests given to U.S. students for achievement comparisons with other countries. Questions like the one above go well beyond Remembering, Understanding, and Applying: they require students to use the math in a real-life situation, engaging the higher order thinking skills of Analyzing and Evaluating. Is it any wonder our test results don’t stack up? It’s the difference between asking a student to tell you what time it is and asking him how to build a watch.
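For completeness, here’s the calculation the item asks students to show, sketched in Python using the formula above (the helper function is just illustrative):

```python
import math

def lichen_diameter(t):
    """Diameter of the lichen in millimetres, t years after the ice disappeared (t >= 12)."""
    return 7.0 * math.sqrt(t - 12)

print(lichen_diameter(16))  # 7.0 * sqrt(16 - 12) = 7.0 * 2 = 14.0 mm
```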

Without intentionally doing so, we have reverse-engineered our accountability tests to produce exactly the kind of results our business and legislative leaders don’t want. These systems have set the bar at basic understanding, and our teachers are hitting it. The problem is that basic skills don’t translate well when compared to what’s needed in the real world, and to what other countries like Finland are doing in their education systems. In order to be effective, our accountability measures need to ask the right kinds of questions. When they do that, teachers will change their instruction to meet that goal, and we’ll produce the kind of student who can compete globally and invent ways to put square pegs in round holes. And until that happens, one thing’s for certain. We’ll continue to sit squarely on the launch pad while the rest of the world reaches for the stars. Houston, we have a problem.


About Jay Harnack

Superintendent of Sublette County School District #1
This entry was posted in Accountability.

4 Responses to Houston, We Have A (Testing) Problem

  1. fletcherturcato says:

    Nicely done…. Again!

  2. Rollie Myers says:

    Good article. But being an old math teacher I still had to solve the problem for x.
    Oh, by the way, x=-15. Have a great day.

  3. Ward Wise says:

    A great point for everyone to realize.
