catbarf on 28/9/2007 at 01:08
I can see what you're saying, RocketMan- basically that it's entropy that causes probability to follow a norm. And you can't test it, since entropy would be modifying the results accordingly.
I'm not certain one way or the other, but I'm really not qualified to continue with the discussion. I do think it is an interesting theory.
RocketMan on 28/9/2007 at 02:15
That's still all it is, sadly... in fact I can't even call it a theory because it's not based on any work or anything... it's more of a musing... but I'm compelled to believe it. If I believed in a god, it would probably be Lord Entropy :P
Yellow on 28/9/2007 at 06:36
I'm not sure I understand how probability is meant to be reliant on entropy. Certainly the converse is true, with pretty much everything in thermodynamics and statistical mechanics being probability based.
Ko0K on 28/9/2007 at 06:53
Quote Posted by D'Juhn Keep
He got laid
That's what happened to my gaming buddies. :mad: Ah, well. What can I say, except I hope they're happy.
RocketMan on 28/9/2007 at 14:55
I think physics deserves priority because that's the part that's real. Math is simply a language to describe it, with no physical embodiment that can be measured with our senses. Therefore I feel math has to rely on physics, not the other way around. If entropy relied on probability, then it would be as if we discovered probability one day and then created entropy to satisfy it.
frozenman on 28/9/2007 at 14:55
I absolutely despised my thermodynamics class- easily the most painful experience I've ever had to endure. Nevertheless, this is what I remember.
The entropy of a system is related to the number of accessible states the system can occupy: S = k ln(Q), where Q is the number of accessible non-degenerate microstates. In a case like a small volume of gas with N particles, each occupying its own energy level, Q is the number of different energy distributions that will yield a unique total energy.
For a system like a coin the degeneracy is absurd- it doesn't matter what configuration the atomic structure of the coin is in, how the coin is tossed or what-have-you; the system can still only occupy one of two states. However, this assumes one flip at a time- if you had N coins flipped simultaneously it might be a bit more interesting to study (just about any thermo class will use a two-level system, spin for example, as an illustration).
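To make the counting concrete, here's a rough Python sketch (my own illustration; k is Boltzmann's constant and the N-coin setup is the toy case above). A macrostate of h heads out of N coins has multiplicity C(N, h), and S = k ln(Q) is zero for the single all-tails arrangement and largest near h = N/2:

import math

k = 1.380649e-23  # Boltzmann's constant in J/K

def entropy(multiplicity):
    # S = k ln(Q), with Q the number of accessible microstates
    return k * math.log(multiplicity)

# N coins flipped simultaneously: the macrostate "h heads out of N"
# has multiplicity C(N, h), the number of distinct orderings
N = 100
for h in (0, 25, 50):
    q = math.comb(N, h)
    print(f"{h:3d} heads: Q = {q:.3e}, S = {entropy(q):.3e} J/K")

The h = 0 line comes out to exactly zero entropy (one microstate), which is the whole point of the formula.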
RocketMan on 28/9/2007 at 21:37
Quote Posted by frozenman
I absolutely despised my thermodynamics class- easily the most painful experience I've ever had to endure. Nevertheless, this is what I remember.
The entropy of a system is related to the number of accessible states the system can occupy: S = k ln(Q), where Q is the number of accessible non-degenerate microstates. In a case like a small volume of gas with N particles, each occupying its own energy level, Q is the number of different energy distributions that will yield a unique total energy.
For a system like a coin the degeneracy is absurd- it doesn't matter what configuration the atomic structure of the coin is in, how the coin is tossed or what-have-you; the system can still only occupy one of two states. However, this assumes one flip at a time- if you had N coins flipped simultaneously it might be a bit more interesting to study (just about any thermo class will use a two-level system, spin for example, as an illustration).
I think you know quite a bit more about thermo than I do, but to be clear about what I meant: I didn't mean that the number of states is affected by the way the coin is flipped. I meant that the outcome is affected by it. If the coin were flipped identically every time, it should land the same way every time as well. The point of my comment was that even a human, apparently capable of free will, cannot tamper with a probabilistic experiment. He is just another variable in the outcome and thus is governed by entropy. And if that is so, then the experiment and its outcomes must also be entropy-driven.
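A loose software analogy (just my illustration, not physics): a pseudo-random coin given the same seed, i.e. identical "initial conditions", lands the same way every single time:

import random

def flips(seed, n=10):
    # a pseudo-random coin: the seed plays the role of initial conditions
    rng = random.Random(seed)
    return "".join(rng.choice("HT") for _ in range(n))

print(flips(42))  # one run of ten flips
print(flips(42))  # same seed, the exact same outcomes
print(flips(43))  # change the conditions and the outcomes change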
dvrabel on 28/9/2007 at 23:38
What do entropy and thermodynamics have to do with coin flipping? Hint: nothing -- the outcome of a set of coin-flipping trials is not a thermodynamic system.
To quote Wikipedia's article on entropy (http://en.wikipedia.org/wiki/Entropy):
"In physics, entropy, symbolized by S, from the Greek μετατροπή (metatropi) meaning "transformation", is a measure of the unavailability of a system’s energy to do work."
Chade on 29/9/2007 at 14:29
RocketMan, Entropy is just an unavoidable consequence of our fundamental laws of nature. It is not this all-encompassing mysterious force you portray it as ...
There is certainly no need to use Entropy to explain the outcome of a number of coin tosses.
RocketMan on 29/9/2007 at 17:07
Quote Posted by dvrabel
What does entropy and thermodynamics have to do with coin flipping? Hint: nothing -- the outcome of a set of coin flipping trials is not a thermodynamic system.
Then you obviously know much less about thermodynamics than the rest of us. Thermodynamics is fundamental to just about everything. Talk to a prof at a university, or someone with an equivalent background in the subject, and they'll tell you the same.
Chade:
You're right, it's not a magic force or anything. I suppose the overtone of my posts gave that impression. It seems mysterious to me, though. I suppose that's because when discussing the nature of entropy one invariably gets into philosophical territory, considering things like fate, etc. Entropy is not a cognitive entity by any means, but even when we're not discussing coins, it appears to take hold of many systems, giving them a homogeneity and a tendency toward chaos (by definition, of course). To me that seems kind of spooky, as if we still don't quite understand why it happens.
Perhaps the best way to express my regard for entropy is to compare it to quantum physics. Quantum physics works very well and shows great promise, but nobody knows why the hell quantum theory behaves the way it does. It too is spooky.
Why is it that as you consecutively flip heads, the probability of such occurrences diminishes exponentially? It's not a question I expect anybody to answer with "because of this law, stupid". It's just a question I feel we haven't fully answered yet.
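Just to put numbers on that drop-off (a plain fair-coin sketch in Python, nothing more): each additional head halves the probability, so P(n heads in a row) = (1/2)^n.

# fair coin: each extra head halves the probability of the whole run
for n in (1, 5, 10, 20):
    print(f"{n:2d} heads in a row: p = {0.5 ** n:.2e} (1 in {2 ** n:,})")

Twenty in a row is already a one-in-a-million event.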
One more thing... entropy is something introduced when studying thermodynamics because it has direct implications for systems involving energy and heat and how they change over time. However, it is not confined to heat problems alone. Both entropy and thermodynamics encompass far more than their names suggest. Coins, air molecules, paths in a maze, subatomic particle energies, errors in computer hard drives, the list goes on... all are quantifiable within the framework of thermodynamics.
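For instance, here's a quick sketch that puts a number on a single coin. I'm using Shannon's information-theoretic form of entropy here rather than the thermodynamic one; the two are close cousins, and the swap is my choice of illustration:

import math

def shannon_entropy(p):
    # entropy in bits of a coin that lands heads with probability p
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no entropy
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

for p in (0.5, 0.9, 0.99):
    print(f"P(heads) = {p}: H = {shannon_entropy(p):.4f} bits")

A fair coin comes out to exactly 1 bit; the more biased the coin, the less entropy it carries.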