Sunday, July 23, 2006

I'm going

I'll be away from the internet all week so I thought I would add some links.

Molly's back.
Some stories are just plain fucked up
yes, there are some stupid physicists

I'm out.

Sunday, July 09, 2006

Physics topics for beginners part 1: Entropy

I've decided that lots of people don't really understand some commonly used concepts in physics, and I should try to spread information about their true meanings. The two I'll do first are entropy and energy, because those are the two most misused.

I'm somewhat sympathetic to those who don't understand entropy because it is a pretty strange concept. It's usually described as the disorder in a system, and although that's true, it can be misleading unless you know what entropy really is. Most fundamentally, entropy is a measure of the homogeneity of a system. In everyday language that means how mixed together things are: the entropy is at a maximum when the system is totally mixed together and is identical everywhere. How do I come to this conclusion? For this we need to use a little simplified quantum mechanics.

Let's say you have a switch: it can be either on or off. If you have two switches, they can be both on, both off, or one on and one off. Suppose we can't tell the switches apart; then any combination of one on and one off looks identical. So there are two ways of getting one on and one off, but only one way of getting both on or both off. As you keep adding switches, there are more and more ways of arranging things so that they all look the same. The entropy of one of these combinations is defined as being proportional to the logarithm of the number of ways of arranging it without changing how it looks.
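Here's a quick sketch of that counting in Python (the function names are mine, not anything standard): the number of indistinguishable arrangements is just the binomial coefficient, and the entropy is its logarithm (I'm dropping Boltzmann's constant, which only sets the units).

```python
from math import comb, log

def multiplicity(n_switches, n_on):
    """Number of ways to have exactly n_on switches on out of n_switches."""
    return comb(n_switches, n_on)

def entropy(n_switches, n_on):
    """Entropy of that combination: the log of its multiplicity."""
    return log(multiplicity(n_switches, n_on))

# Two switches: one way to get both off, two ways to get one on, one way both on.
print(multiplicity(2, 0), multiplicity(2, 1), multiplicity(2, 2))  # 1 2 1
```

Note that both-off has multiplicity 1, so its entropy is log(1) = 0, while the mixed one-on/one-off combination already has positive entropy.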

From that it is easy to see that the highest entropy will be when exactly half of the switches are on and half are off. The second law of thermodynamics (the entropy of a closed system never decreases) is easy to see as a statement about probabilities: the most likely combination is the highest-entropy one, so with large numbers of switches you are extremely unlikely to go from higher to lower entropy.

Now how does this discussion of switches relate to reality? In quantum mechanics, things (for example atoms or molecules) have distinct states: sometimes two or three (or some other whole number), but usually an infinite number, which makes the math much more difficult. The parallels to the switches example, though, are obvious.

If you have any questions leave comments.

ps. The reason I did the entropy one first is that creationists are always misusing the second law of thermodynamics to try to refute evolution. The part they forget, of course, is the "closed system" part (hmm... could there be a massive heat source not too far from the earth?), but their objections are based on a misunderstanding of what entropy is. Life creates entropy, usually by changing chemical energy to heat energy when we metabolize food (this creates entropy because there are more ways of arranging heat energy without changing the total amount than there are for chemical energy). Entropy has nothing to do with complexity of structure (until you get to really high levels of entropy: at the so-called heat death of the universe).

Anyways, I don't expect this to stop most creationists from using the second law of thermodynamics as an argument, but at least it gives reasonable people the knowledge to call bullshit on bullshit.