Physical information

Physical information refers generally to the information that is contained in a physical system. Before addressing the physical case, we must first ask: what is information?

Information

Information itself may be loosely defined as "that which distinguishes one thing from another." The information contained in a thing can thus be said to be the identity of that particular thing itself: all of its properties, everything that makes it distinct from other (real or potential) things.

Classical vs. Quantum Information

For physical systems, we must distinguish between classical information and quantum information. Quantum information specifies the complete quantum state vector (equivalently, wavefunction) of a system, whereas classical information, roughly speaking, picks out a quantum state only if one is already given a prespecified set of distinguishable (orthogonal) quantum states to choose from; such a set forms a basis for the vector space of all possible (pure) quantum states. Quantum information can thus be considered to consist of (1) a choice of basis such that the actual quantum state is equal to one of the basis vectors, plus (2) the classical information specifying which of these basis vectors is the actual one.

Note that the amount of classical information in a quantum system gives the maximum amount of information that can actually be measured and extracted from that system for use by external classical (decoherent) systems, since only basis states are operationally distinguishable from each other. The impossibility of distinguishing between non-orthogonal states is a fundamental principle of quantum mechanics, closely related to Heisenberg's uncertainty principle. Because of its more general utility, the remainder of this article deals primarily with classical information, although quantum information theory also has promising applications (quantum computing, quantum cryptography, quantum teleportation) that are currently being actively explored by both theoreticians and experimentalists [1].
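The indistinguishability of non-orthogonal states can be made concrete with a small numerical sketch (Python with NumPy; the particular state and angle below are illustrative choices of ours, not from the article). The squared overlap between two state vectors gives the probability of confusing one for the other in a single measurement, and it vanishes only when the states are orthogonal:

    import numpy as np

    # Two qubit states: |0> and the non-orthogonal state cos(t)|0> + sin(t)|1>.
    ket0 = np.array([1.0, 0.0])
    t = np.pi / 8                                  # illustrative angle
    ket_psi = np.array([np.cos(t), np.sin(t)])

    # Squared overlap |<0|psi>|^2: the probability that a measurement in the
    # {|0>, |1>} basis reports |0> even though the system was prepared in |psi>.
    # It is zero only for orthogonal states, so non-orthogonal states cannot
    # be perfectly distinguished by any single measurement.
    overlap = abs(np.vdot(ket0, ket_psi)) ** 2
    print(f"|<0|psi>|^2 = {overlap:.4f}")          # ~0.8536, far from 0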

Classical Information

An amount of (classical) information may be quantified as follows [2]. For a system S, defined abstractly in such a way that it has N distinguishable states (orthogonal quantum states) consistent with its description, the amount of information I(S) contained in the system's state can be said to be log(N). The logarithm is chosen for this definition because it makes the measure of information content additive when concatenating independent, unrelated subsystems: if subsystem A has N distinguishable states (information content I(A) = log(N)) and an independent subsystem B has M distinguishable states (information content I(B) = log(M)), then the concatenated system has NM distinguishable states and an information content I(AB) = log(NM) = log(N) + log(M) = I(A) + I(B). We expect information to be additive from our everyday associations with the meaning of the word; e.g., two pages of a book can contain twice as much information as one page.
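The additivity property is easy to verify numerically. The following minimal Python sketch (the state counts 8 and 32 are arbitrary illustrative choices) computes I(S) = log2(N) for two independent subsystems and for their concatenation:

    import math

    # I(S) = log2(N) bits for a system with N distinguishable states.
    def info_bits(n_states):
        return math.log2(n_states)

    N, M = 8, 32                    # arbitrary state counts for subsystems A, B
    i_a = info_bits(N)              # 3 bits
    i_b = info_bits(M)              # 5 bits
    i_ab = info_bits(N * M)         # the concatenated system has N*M states

    # Additivity: log2(N*M) = log2(N) + log2(M).
    assert math.isclose(i_ab, i_a + i_b)
    print(i_a, i_b, i_ab)           # 3.0 5.0 8.0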

The base of the logarithm used in this definition is arbitrary, since changing it affects the result by only a multiplicative constant, which determines the unit of information implied. If the log is taken base 2, the unit of information is the binary digit or bit (so named by John Tukey); if a natural logarithm is used instead, the resulting unit may be called the "nat." In magnitude, a nat is equal to Boltzmann's constant k (or, per mole, the ideal gas constant R), although these particular quantities are usually reserved for measuring physical information that happens to be entropy, and are expressed in physical units such as joules per kelvin or kilocalories per mole per kelvin.
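The unit conversions implied here follow from the change-of-base rule together with the identification of one nat of entropy with Boltzmann's constant. A minimal Python sketch (using the exact 2019 SI value of k):

    import math

    k_B = 1.380649e-23                 # Boltzmann's constant in J/K (exact, 2019 SI)

    bits = 1.0
    nats = bits * math.log(2)          # 1 bit = ln(2) ~ 0.6931 nat
    entropy_J_per_K = nats * k_B       # 1 nat of entropy corresponds to k_B

    print(f"{bits} bit = {nats:.4f} nat = {entropy_J_per_K:.3e} J/K")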

Physical Information and Entropy

An easy way to understand physical entropy itself is as follows: Entropy is simply that part of the (classical) physical information contained in a system whose identity (as opposed to amount) is unknown. This informal characterization fits von Neumann's formal definition of the entropy of a mixed quantum state, as well as Shannon's definition of the entropy of a probability distribution over classical states [2].
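Both formal definitions can be sketched in a few lines (Python with NumPy; the example states below are our own, chosen to show the two extremes). Shannon's entropy is computed from a probability distribution, and von Neumann's from the eigenvalues of a density matrix:

    import numpy as np

    # Shannon entropy H(p) = -sum_i p_i log2(p_i), in bits.
    def shannon_entropy(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                           # convention: 0 log 0 = 0
        return float(-(p * np.log2(p)).sum() + 0.0)   # + 0.0 normalizes IEEE -0.0

    # Von Neumann entropy S(rho) = -Tr(rho log2 rho) = H(eigenvalues of rho).
    def von_neumann_entropy(rho):
        return shannon_entropy(np.linalg.eigvalsh(rho))

    rho_mixed = np.eye(2) / 2                        # maximally mixed qubit
    rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # pure state |0><0|

    print(shannon_entropy([0.5, 0.5]))     # 1.0 bit: a fair coin flip
    print(von_neumann_entropy(rho_mixed))  # 1.0 bit: nothing known about the qubit
    print(von_neumann_entropy(rho_pure))   # 0.0 bits: the state is fully known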

Even when the exact state of a system is known, we can say that the information in the system is still effectively entropy if that information is effectively incompressible, that is, if there are no known or feasibly determinable correlations or redundancies between different pieces of information within the system. Note that this definition can be viewed as equivalent to the previous one (unknown information) if we take a meta-perspective and say that for observer A to know the state of system B means simply that there is a definite correlation between the state of A and the state of B; this correlation could be used by a meta-observer to compress its description of the joint system AB [3].
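The meta-observer's compression gain is precisely the mutual information I(A;B) = H(A) + H(B) - H(A,B). A minimal Python sketch (the joint distribution below, two perfectly correlated bits, is a contrived example of ours):

    import numpy as np

    def entropy_bits(p):
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    # Joint distribution over (A, B): two perfectly correlated bits.
    p_joint = np.array([[0.5, 0.0],
                        [0.0, 0.5]])

    h_a  = entropy_bits(p_joint.sum(axis=1))   # H(A)   = 1 bit
    h_b  = entropy_bits(p_joint.sum(axis=0))   # H(B)   = 1 bit
    h_ab = entropy_bits(p_joint)               # H(A,B) = 1 bit, not 2

    # Bits saved by describing AB jointly rather than A and B separately:
    print(h_a + h_b - h_ab)                    # 1.0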

References

  1. Michael A. Nielsen and Isaac L. Chuang, Quantum Computation and Quantum Information, Cambridge University Press, 2000.
  2. Michael P. Frank, "Physical Limits of Computing", Computing in Science and Engineering, 4(3):16-25, May/June 2002. http://www.cise.ufl.edu/research/revcomp/physlim/plpaper.
  3. W. H. Zurek, "Algorithmic randomness, physical entropy, measurements, and the demon of choice," in [4], pp. 393-410, and reprinted in [5], pp. 264-281.
  4. Anthony J. G. Hey, ed., Feynman and Computation: Exploring the Limits of Computers, Perseus, 1999.
  5. Harvey S. Leff and Andrew F. Rex, eds., Maxwell's Demon 2: Entropy, Classical and Quantum Information, Computing, Institute of Physics Publishing, 2003.


