Wednesday, March 28, 2012

Entropy as a measure of class learning

It has just occurred to me that as teachers we work a piece of unphysical magic as we lead or guide a class through a course.
My reasoning goes as follows. At the start of the class, all the students are unknowns. We don't know how well they learn, how hard they are going to work, or what their ability is.
So if we define the entropy of the system in terms of all the possible outcomes, any student could potentially end up with any final mark, and the entropy is high.
In formula form, S = k log(n),
where n is the number of possible states of the system, that is, the number of different potential outcomes (taking logs to base 10 for the numbers below).
If I have 30 students in the class, and I mark out of 100, then each student could finish on any of 100 marks, so n = 100^30 and
S = k log(100^30) = 30 k log(100) = 60 k.
After the first midterm, when 30% of the marks have been decided, each student's final mark can still range over only 70 values, so
S = 30 k log(70) = 55.4 k.
After the second midterm, with only 40% of the marks still to be decided by the final,
S = 30 k log(40) = 48.1 k.
After the final, when all outcomes are clear, only one state remains (n = 1), so
S = 0.
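
If you want to check the arithmetic, here is a minimal Python sketch of the same back-of-the-envelope calculation (the function name and stage labels are just illustrative; it assumes base-10 logs, as above):

```python
import math

def entropy_over_k(students, marks_undecided):
    # S/k = students * log10(marks still undecided): each student can
    # still land on any of the undecided marks, so n = marks_undecided**students.
    if marks_undecided <= 1:
        return 0.0  # only one outcome left, no remaining uncertainty
    return students * math.log10(marks_undecided)

for stage, undecided in [("start of term", 100),
                         ("after midterm 1", 70),
                         ("after midterm 2", 40),
                         ("after the final", 1)]:
    print(f"{stage:>16}: S = {entropy_over_k(30, undecided):.1f} k")
```

Running it prints 60.0 k, 55.4 k, 48.1 k and 0.0 k, matching the figures above.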
As the semester (term) goes on, the probable trajectory of each student becomes more constrained, with the system eventually resolving into one final configuration, the final marks, and the uncertainty about the progress of any given student is reduced to zero. In thermodynamic terms, we have cooled the system by removing the entropy and condensed the educational gas which the students represented into a set of distinct phases, one for each possible student outcome (mark).

We have reduced the entropy of the system. Can I say we have cooled it?
