Dust flux, Vostok ice core

Blog header image: two-dimensional phase space reconstruction of dust flux from the Vostok core over the period 186-4 ka using the time-derivative method. Dust flux on the x-axis, rate of change on the y-axis. From Gipp (2001).

Saturday, June 2, 2012

Fairness and mismatched incentives in education

The kids finished their EQAO exams last week. This is what passes for a provincial exam. Here's the thing: the kids are told that their marks won't count toward their evaluation, but at the same time they are told that it is really important that they do well, without ever being told why.

Oh wait! Here's a recent news release.

TORONTO, May 31, 2012 — Starting today, Ontario’s Grade 9 students begin writing the provincial math assessment. As they do, an Education Quality and Accountability Office (EQAO) study released today shows that teachers can help them do better on the test by delivering a simple non-math message: “This test counts toward your final mark.”


EQAO’s research revealed that when teachers of academic math said they would count the assessment toward final grades, and their students were aware of that fact, 87.5% of them met or exceeded the provincial standard. By comparison, 75.9% of students in that course met the standard when they said they were not aware of their teacher’s intention to count the test—a difference of 11.6 percentage points.

What genius! Why didn't they think of that before? Over the past few years I can't count how many times I have heard frustrated teachers worried about the extra training they have to undergo if the results at their school fall below provincial expectations. They were also desperately afraid that their students would find out that their performance might doom their teachers to extra training. It certainly is possible that students might deliberately do poorly to spite their teachers.

Of course the incentives are all wrong. The incentives encourage cheating, not by students, but by teachers. What kind of example does this set for our young people?

Years ago, there really were province-wide exams. They fell out of favour some time ago for reasons that I have never heard elucidated convincingly. I have always thought the change was driven by teachers who feared being judged against teachers at other schools whose students performed better. The argument was that such comparisons are unfair because poverty, diet, and poor family situations all have a negative impact on students' performance.

At the same time, I'm pretty sure some parents felt the province-wide exams were unfair as well. Some schools are cursed with substandard teachers, and their students would be at a disadvantage compared with those in better schools. However, this sort of unfairness is easy to sort out: each student's raw mark can be converted to a Z-score (the number of standard deviations above or below the mean) relative to all the students at their own school. The province can then stipulate a mean (68%, for instance) and a standard deviation (8-9%), and use the Z-scores to compute final marks, as sketched below. The unaltered marks could still be used to determine which schools need (ahem) extra help.
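
A minimal sketch of that rescaling, in Python. The function name and the sample marks are made up for illustration; the 68% mean and 8.5% standard deviation are simply the figures suggested above:

```python
import statistics

def rescale(raw_marks, target_mean=68.0, target_sd=8.5):
    """Convert one school's raw marks to the stipulated provincial scale.

    Each raw mark becomes a Z-score relative to the school's own mean and
    standard deviation, which is then mapped onto the provincial mean and
    standard deviation.
    """
    mu = statistics.mean(raw_marks)
    sd = statistics.stdev(raw_marks)
    return [target_mean + target_sd * (m - mu) / sd for m in raw_marks]

# A hypothetical class of raw marks:
print([round(m, 1) for m in rescale([55, 62, 70, 78, 85])])
# -> [57.4, 62.3, 68.0, 73.7, 78.6]
```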

The method above levels out all differences in marks between schools, but then a new source of unfairness arises. Some schools are better than others, perhaps so much so that every student in school A, if objectively graded, should qualify for university entrance, while the students in school B are so abysmal that none of them should be admitted. Under the rescaling described above, an equal fraction of students from A and B will qualify for university, as the example below illustrates.
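
To make the point concrete, here is the same hypothetical rescaling applied to two invented cohorts whose raw marks differ by a flat 40 points (the entrance cutoff is likewise made up):

```python
import statistics

def rescale(raw_marks, target_mean=68.0, target_sd=8.5):
    """Within-school Z-score rescaling, as in the sketch above."""
    mu, sd = statistics.mean(raw_marks), statistics.stdev(raw_marks)
    return [target_mean + target_sd * (m - mu) / sd for m in raw_marks]

# Every student in school A outscores every student in school B by 40 points.
school_a = [82, 85, 88, 90, 93, 95]
school_b = [m - 40 for m in school_a]

cutoff = 75.0  # hypothetical university-entrance cutoff on the rescaled marks
for name, marks in (("A", school_a), ("B", school_b)):
    qualified = sum(m >= cutoff for m in rescale(marks))
    print(f"School {name}: {qualified} of {len(marks)} students qualify")
# Both schools report "2 of 6": the rescaling erases the 40-point gap.
```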

I get the impression that that is a kind of unfairness that taxpayers will accept (except for the minority of soreheads whose children attend school A).
