not the country or the triangle :)

  • 0 Posts
  • 17 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • In my recent research, published in AIP Advances, I used information theory to propose a new law of physics, which I call the second law of infodynamics. And importantly, it appears to support the simulated universe theory.

    At the heart of the second law of infodynamics is the concept of entropy—a measure of disorder, which always rises over time in an isolated system. When a hot cup of coffee is left on the table, after a while it will reach equilibrium, having the same temperature as its environment. At that point the entropy of the system is at a maximum and its energy at a minimum.

    The second law of infodynamics states that the “information entropy” (the average amount of information conveyed by an event) must remain constant or decrease over time—down to a minimum value at equilibrium.
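    For what it's worth, the “information entropy” the article mentions is Shannon entropy, which is a precisely defined, measurable quantity, not a social-science notion. A minimal Python sketch of the standard definition (the probabilities here are just made-up examples):

    ```python
    import math

    def shannon_entropy(probs):
        """Shannon entropy in bits: the average amount of information
        conveyed per event, given each event's probability."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin has two equally likely outcomes -> 1 bit per flip.
    print(shannon_entropy([0.5, 0.5]))

    # A biased coin is more predictable, so on average each flip
    # conveys less information (entropy is lower).
    print(shannon_entropy([0.9, 0.1]))
    ```

    The more predictable a system is, the lower its information entropy, which is roughly the quantity the author claims tends to decrease.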

    Okay maybe I’m just not smart enough to get this, and I think that’s probably the case, but isn’t this more in the realm of social science? Is information quantifiable in physics? I know you can quantify data, but information?

    I’m not doubting it but I’m just… confused. I don’t feel like the article really explained what “information” means in this instance. They said “genetic information,” but that’s not something measured in units of “information” either. I just need like a really dumbed down guide for this…

    Also this article just feels like an ad for the author’s study :/