Entropy

From My Big TOE Wiki

Entropy is a measure of disorder or, equivalently in thermodynamics, a measure of the amount of energy unavailable to "do work". The information, computer, and communication sciences use the concept of entropy to describe information content versus randomness (signal-to-noise ratio). In MBT, the term describes the level of informational organization within a system of consciousness. [MBT Forum 1]

Since MBT models consciousness as a digital information system, lower entropy equates to more information, more organization, more personal power, more love, less randomness, less disorder, less ego, and less fear. While the definition of entropy is typically divided between its usage in thermodynamics and in information theory, the two are closely related [External 1], and entropy in MBT subsumes both definitions.
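The information-theoretic definition mentioned above can be made concrete with Shannon entropy, H = -Σ p·log₂(p), which measures randomness in bits per symbol: a fully ordered message scores zero, while a maximally random one scores the highest. This is a minimal illustrative sketch (not from the MBT texts themselves), showing why "lower entropy" corresponds to "more organization, less randomness":

```python
from collections import Counter
from math import log2

def shannon_entropy(data: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A perfectly ordered message has zero entropy; as symbols become
# more evenly (randomly) distributed, entropy rises toward its maximum.
print(shannon_entropy("aaaaaaaa"))  # 0.0 bits -- fully ordered
print(shannon_entropy("abababab"))  # 1.0 bits -- two equally likely symbols
print(shannon_entropy("abcdefgh"))  # 3.0 bits -- eight equally likely symbols
```

In this sense "low entropy" and "high information organization" point at the same thing: the more structure a message (or, in MBT's usage, a consciousness system) has, the less of it is mere noise.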

The fundamental process of evolution drives consciousness to lower its entropy. If it did not, randomness and disorder would increase to the point of complete dissolution of consciousness. An analogy for this maximum-entropy state would be a computer disk after it has been formatted.

PMR systems naturally dissipate their energy [MBT Trilogy 1]; this tendency toward increasing entropy is known as the Second Law of Thermodynamics (it is part of the rule-set). Consciousness is based upon reality cells that exist within the Larger Consciousness System, which encompasses, creates, and sustains PMR. For an elucidating fictional treatment of the limits of entropy in PMR, Isaac Asimov's short story "The Last Question" is recommended. [External 2]

MBT Forum References

  1. Definition of Entropy

MBT Trilogy References

  1. Explanation of Entropy

External References

  1. Entropy in Thermodynamics and Information Theory
  2. The Last Question by Isaac Asimov © 1956 ("Can entropy ever be reversed?")