
Entropy is the distribution of potential over negative potential.

This could be said as "the distribution of whatever may be over the surface area of where it may be."

This is erroneously taught in conventional information theory as "the number of configurations in a system," or as the available information that has yet to be retrieved. Entropy includes the unforeseen and the out-of-scope.

Entropy is merely the predisposition to flow from high to low pressure (potential). That is it. Information is a form of potential.

Philosophically what are entropy's guarantees?

- That there will always be a super-scope, which may interfere in ways unanticipated;

- everything decays; the only mystery is when and how.



This answer is as confident as it is wrong, and it's full of gibberish.

Entropy is not a "distribution"; it's a functional that maps a probability distribution to a scalar value, i.e. a single number.

It's the negative mean log-probability of a distribution.

It's an elementary statistical concept, independent of physical concepts like “pressure”, “potential”, and so on.
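
A minimal sketch of that in Python (the function name and example distributions are invented here for illustration): entropy takes a whole probability distribution as input and returns a single number, the expected negative log-probability.

    import math

    def shannon_entropy(p):
        """Map a probability distribution (list of probabilities summing to 1)
        to a scalar: H(p) = -sum_i p_i * log(p_i) = E[-log p], in nats."""
        return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

    print(shannon_entropy([0.5, 0.5]))   # ~0.693 (= ln 2)
    print(shannon_entropy([0.9, 0.1]))   # ~0.325, a less uncertain distribution
    print(shannon_entropy([0.25] * 4))   # ~1.386 (= ln 4), uniform maximizes H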


It sounds like log-probability is the manifold surface area.

Distribution of potential over negative potential. Negative potential is the "surface area", and available potential distributes itself "geometrically". All of this is iterative, obviously, with some periodicity set by the universal speed limit.

It really doesn't sound like you disagree with me.


Baez seems to use the definition you call erroneous: "It’s easy to wax poetic about entropy, but what is it? I claim it’s the amount of information we don’t know about a situation, which in principle we could learn."


> Entropy includes the unforeseen and the out-of-scope.

Mmh, no, it doesn't. You need to define your state space; otherwise it's an undefined quantity.


But it is possible to account for the unforeseen (or out-of-vocabulary) with, for example, a Good-Turing estimate. This satisfies your demand for a fully defined state space while also being consistent with GP's definition.
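
A rough Python sketch of the simplest version of that estimate (the toy sample and function name are made up here): the missing mass reserved for never-seen symbols is N1/N, the fraction of tokens whose type was observed exactly once.

    from collections import Counter

    def unseen_mass_good_turing(tokens):
        """Simple Good-Turing estimate of the total probability mass of
        symbols never observed: P(unseen) ~= N1 / N, where N1 is the number
        of types seen exactly once and N is the total token count."""
        counts = Counter(tokens)
        n1 = sum(1 for c in counts.values() if c == 1)
        return n1 / len(tokens)

    sample = ["a", "b", "a", "c", "a", "d"]   # "b", "c", "d" each seen once
    print(unseen_mass_good_turing(sample))    # 0.5 -> half the mass held for the unseen

The state space stays fully defined (the observed types plus one catch-all "unseen" bucket), yet the estimate still budgets probability for what hasn't shown up.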


You are referring to the conceptual device you believe bongs to you and your equations. Entropy creates attraction and repulsion, even causing working bias. We rely upon it for our system functions.

What is undefined is uncertain, and uncertainty is entropic.


Entropy is a measure; it doesn't create anything. This is highly misleading.


> bongs

indeed


All definitions of entropy stem from one central, universal definition: entropy is the amount of energy unable to be used for useful work. Or, put more carefully: entropy describes the fact that not all of the energy consumed can be used for work.
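
One textbook way to make "energy unable to be used for useful work" concrete (a sketch of the standard bookkeeping, not something derived in this thread): Helmholtz split the internal energy into a "free" part and a "bound" part, and the bound part, the one you cannot turn into work at temperature T, is the entropy term.

    F = U - T S      (Helmholtz free energy: the part available for isothermal work)
    T S              (the "bound" energy: the part not available for work)
    W_max <= -ΔF     (at constant T, extractable work is at most the drop in F)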


There's a good case to be made that the information-theoretic definition of entropy is the most fundamental one, and the version that shows up in physics is just that concept as applied to physics.


My favorite course in my physics degree was statistical mechanics. It leaned way closer to information theory than I would have expected going in, though in retrospect that should have been obvious.

Unrelated: my favorite bit from any physics book is probably still the introduction of the first chapter of "States of Matter" by David Goodstein: "Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study statistical mechanics."


That would mean that information theory is not part of physics, right? So Information Theory and Entropy are part of metaphysics?


Well it's part of math, which physics is already based on.

Whereas metaphysics is, imo, "stuff that's made up and doesn't matter". Probably not the most standard take.


I'm wondering, isn't Information Theory as much part of physics as Thermodynamics is?


Would you say that Geometry is as much a part of physics as Optics is?


Not really. Information theory applies to anything probability applies to, including many situations that aren't "physics" per se. For instance it has a lot to do with algorithms and data as well. I think of it as being at the level of geometry and calculus.
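
A toy illustration of that point (Python, with made-up inputs): the empirical entropy of a byte string lower-bounds, via the source coding theorem, the average bits per symbol of any lossless code for symbols drawn i.i.d. from that empirical distribution, and no physics enters anywhere.

    import math
    from collections import Counter

    def bits_per_symbol(data: bytes) -> float:
        """Empirical Shannon entropy of a byte string, in bits per byte:
        a lower bound on the average code length of any lossless code
        for this symbol distribution."""
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    print(bits_per_symbol(b"aaaaaaaa"))        # 0.0 -> perfectly compressible
    print(bits_per_symbol(b"abababab"))        # 1.0 -> one bit per byte
    print(bits_per_symbol(bytes(range(256))))  # 8.0 -> incompressible under this model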


Yeah, people seem to miss that entropy, as applied to thermodynamics, is simply an aggregate statistic that summarizes the complex state of a thermodynamic system as a single real number.

The fact that entropy always rises, etc., has nothing to do with the statistical concept of entropy itself. It is simply an easier way to express the physical idea that individual atoms spread their kinetic energy out across a large volume.
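
A toy sketch of that reading (Python; the microstate counts are invented, not a real thermodynamic model): the Gibbs entropy is just the Shannon entropy of the microstate distribution scaled by the Boltzmann constant, and "spreading out" over more equally likely microstates makes that single number larger.

    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def gibbs_entropy(p):
        """Thermodynamic entropy of a distribution over microstates:
        S = -k_B * sum_i p_i * ln(p_i), i.e. Shannon entropy times k_B."""
        return -K_B * sum(pi * math.log(pi) for pi in p if pi > 0.0)

    confined = [1.0 / 10] * 10       # energy confined to 10 accessible microstates
    spread   = [1.0 / 1000] * 1000   # spread over 1000 accessible microstates
    print(gibbs_entropy(confined))   # ~3.2e-23 J/K
    print(gibbs_entropy(spread))     # ~9.5e-23 J/K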


I'm not sure that's quite the right perspective. It's not a coincidence that entropy increases over time; the increase in entropy seems to be very fundamental to the way physics goes. I prefer the interpretation "physics doesn't care what direction the arrow of time points, but we perceive it as pointing in the direction of increasing entropy". Although that's not totally satisfying either.


This definition is far from universal.


I think what you describe is the application of entropy in the thermodynamic setting, which doesn't apply to "all definitions".



