The second law of thermodynamics is as fundamental as the uncertainty principle: the former is a result from Markov chains and information theory; the latter is a result from Fourier analysis of conjugate variables.
What would you consider a "fundamental physical law"?
Would you mind sharing a link to a proof of the second law using Markov chains and information theory? I'd like to learn more about this approach.
I'm surprised that Markov chains would be involved when the laws of physics are deterministic.
The Poincaré recurrence theorem has always suggested to me that the second law is not as fundamental as other laws. For a finite system with a finite phase space, the state of the system will traverse closed loops, repeating forever with no steady increase or decrease in entropy. (Edit: to be clear, I'm not claiming that what I just described is the Poincaré recurrence theorem or that it applies to our universe. But it is worth considering systems where the second law doesn't apply and trying to figure out whether and how they differ critically from reality.)
Not that my background is worth anything, but just so you know where I'm coming from, I have a PhD in physics, spent years thinking about the entropy of computation, and wrote parts of the Wikipedia entry on Maxwell's demon. I think much of the disagreement over entropy and the second law comes from how we frame the problem.
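To make the closed-loop picture concrete, here is a toy sketch in Python (my own illustration; the state count and distribution are arbitrary): a deterministic, invertible update on a finite state space is just a permutation of the states, so any probability distribution over states cycles back exactly and its Shannon entropy never drifts.

    import numpy as np

    # Deterministic, reversible dynamics on a finite state space is a
    # permutation of the states; here, a single 6-cycle.
    n = 6
    perm = np.roll(np.eye(n), 1, axis=0)             # permutation matrix

    p = np.array([0.5, 0.3, 0.1, 0.05, 0.03, 0.02])  # arbitrary initial distribution

    def entropy(q):
        q = q[q > 0]
        return -np.sum(q * np.log2(q))

    for t in range(2 * n + 1):
        print(t, entropy(p))   # entropy is exactly constant at every step
        p = perm @ p           # after n steps, p returns to its initial value

No coarse-graining, no randomness: the trajectory recurs, and entropy neither rises nor falls.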
I am on mobile now and can't provide a link, but it is given in Cover & Thomas, "Elements of Information Theory", in the chapter that discusses the entropy of Markov processes. I can find pages in Google Books but they won't zoom big enough to read...
IIRC, the proof requires that the Markov chain be irreducible, and extends to the general case by summing over the irreducible parts; it shows that entropy stays the same or increases as the chain converges to the stationary distribution over states.
(Although it has now been 20 years since I dealt with these things, so I might be misremembering. Time to reread Cover & Thomas, I guess...)
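From memory, here is a quick numerical sanity check of the statement (my own toy example, not from the book; the matrix and starting point are arbitrary): for an irreducible chain, the relative entropy D(p_n || pi) to the stationary distribution pi is non-increasing, and when the chain is doubly stochastic (so pi is uniform), the absolute entropy H(p_n) is non-decreasing.

    import numpy as np

    P = np.array([[0.9, 0.1],
                  [0.1, 0.9]])    # doubly stochastic: rows and columns sum to 1
    pi = np.array([0.5, 0.5])     # its stationary distribution is uniform

    def kl(p, q):
        return np.sum(p * np.log2(p / q))   # relative entropy D(p || q)

    def entropy(p):
        return -np.sum(p * np.log2(p))

    p = np.array([0.99, 0.01])    # start far from stationarity
    for n in range(10):
        print(n, kl(p, pi), entropy(p))     # D(p_n || pi) falls, H(p_n) rises
        p = p @ P                           # p_{n+1} = p_n P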
Cover & Thomas, 2nd Edition, Jul 2006, pg. 81, section 4.4 - entropy rates of Markov processes. I did not remember all the conditions needed for this to hold; please read it if you are interested.
[0] staff.ustc.edu.cn/~cgong821/Wiley.Interscience.Elements.of.Information.Theory.Jul.2006.eBook-DDU.pdf seems to have a copy indexed by Google. I suspect it is not legitimate.
I believe a distinction could be made around the second law of thermodynamics being emergent rather than, well, fundamental.
Put another way: the second law of thermodynamics is not required to fully describe a universe that acts like ours appears to. It will "take care of itself" based on more fundamental descriptions of matter and its interactions.
I'm also comfortable with what appears to be fundamental today no longer being fundamental tomorrow.
Especially as the subject is quantum physics and thermodynamics, this is a very interesting viewpoint. Though I agree with you in principle, many centuries of inquiry into our universe by some of the most intelligent humans to date seem to say that mathematical formulations and ideas are the best (if not the only, as with QM) way to think about our universe. No other way has been nearly as useful for prediction as the mathematical one. I think you do have a point, but I will continue to take the so-called Copenhagen Interpretation of QM: shut up and calculate.
The uncertainty principle is an absolute and inviolable consequence of pure mathematics; the second law of thermodynamics just makes predictions that are "very very overwhelmingly likely". The probability of the application of the uncertainty principle producing an incorrect prediction (internal to the theory) is zero, whereas the probability of the application of the second law producing an incorrect prediction (again internal to the theory) is non-zero. There is an infinity of difference between the internal "absoluteness" of these theories.
The probability of either theory producing an incorrect prediction external to the theory is much higher, and will be about the same for each theory. That is, it's more likely we're wrong about all of physics than that the second law makes a bad prediction in a bulk system.
The statement of the uncertainty principle that I am aware of is that the product of the standard deviations of conjugate variables (e.g., time and frequency; position and momentum) is bounded from below. This is a statement about the sample space. Note that the standard deviation is the square root of the second central moment, i.e., an expectation over the ensemble.
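In symbols (the standard textbook forms, nothing specific to this thread): for a signal and its Fourier transform, and for position and momentum,

    \sigma_t \,\sigma_f \;\ge\; \frac{1}{4\pi},
    \qquad
    \sigma_x \,\sigma_p \;\ge\; \frac{\hbar}{2}.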
I did not remember the exact statement when I posted earlier, but here it is: the statement of the (Markov) 2nd law of thermodynamics is also a statement about the sample space. It says that, in expectation over the ensemble, the relative entropy to the stationary distribution decreases (and in most systems the stationary distribution is the highest-entropy distribution possible in that system, so absolute entropy is non-decreasing). That is, unless the system starts in a state with higher entropy than that of the stationary distribution, entropy will not decrease.
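As best I recall the Cover & Thomas formulation (mu below is the stationary distribution; check the book for the exact conditions):

    D(p_{n+1} \,\|\, \mu) \;\le\; D(p_n \,\|\, \mu) \quad \text{for all } n,

and when mu is uniform over |X| states, D(p_n || mu) = log|X| - H(p_n), so the monotone decrease of relative entropy is exactly a monotone increase of entropy.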
Both are mathematical statements, consequences of pure mathematics, neither of which gives a prediction - they both give ensemble averages, and both are internally perfectly correct and consistent. See Cover & Thomas, 2nd Edition, Jul 2006, pg. 81, section 4.4 - entropy rates of Markov processes. Unless, of course, you are referring to a different version of the uncertainty principle which I am not familiar with.
No, the uncertainty principle is a statement about the behavior of non-commuting operators in a Hilbert space. It is not a probabilistic statement. It doesn't even have anything to do with probability until you apply it to a probabilistic interpretation of quantum mechanics, where vectors in the Hilbert space have something to do with probability. The understanding you are referring to is more or less correct, but it is a specific application to certain interpretations of quantum mechanics. It's also useful to think about it this way experimentally (in terms of "uncertainty"), which is why most people learn it this way in early physics classes and where the name comes from.
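For reference, the general operator form is the Robertson inequality (standard, not my own result): for observables A and B,

    \sigma_A \,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [A, B] \rangle\bigr|,
    \qquad [A, B] = AB - BA,

and taking A = x and B = p with [x, p] = i\hbar recovers \sigma_x \sigma_p \ge \hbar/2. Nothing probabilistic enters until you interpret the angle brackets as expectation values over measurement outcomes.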
Perhaps I have made the same mistake as you, and was thinking about the practical but non-generalized way the second law of thermodynamics is usually taught, which includes concepts like "the system will be in this state". The only part of your statement that is still probabilistic is "entropy will not decrease". That's not really true; it probably won't decrease.
> neither of which gives a prediction
"entropy will not decrease" is a prediction. It is possible (albeit overwhelmingly unlikely over large time scales) that this prediction is sometimes false.
> The only part of your statement that is still probabilistic is "entropy will not decrease"
The Markov/IT version of the 2nd law is a statement about macrostate entropy (taking the entire microstate ensemble for each macrostate), and in that sense it is not probabilistic. See the Cover & Thomas reference I gave earlier for the exact definition. It is indeed different from how it is usually taught in physics, where the microstates are differentiated.
I guess we both need to be more careful about mathematical definitions in the future ...
> The second law of thermodynamics is as fundamental as the uncertainty principle: the former is a result from Markov chains and information theory; the latter is a result from Fourier analysis of conjugate variables.
I don't think enumerating the pedigree of a law gives it a seal of validity. Newtonian physics also has a pretty solid pedigree, and yet it breaks down under relativistic conditions.
Please note I do not disagree with your conclusion, only with the way you present it.
Any law that's not based on a stochastic process, since stochastic processes can be gamed with a little bit of cleverness. If you don't believe me, just look at HFT.
You may want to read 'Thermal Physics' by Kittel and Kroemer. It is the BIBLE of statistical mechanics and should straighten you out on why a statistical law is just as good as any other. And yes, it goes into great detail about when and how the statistical laws 'emerge', so to speak. Physicists have been 'worried' about what you mention in your comment for about a century, and the science is very mature these days. Used copies start at ~$25.