
Information itself may be loosely defined as "that which distinguishes one thing from another." The information contained in a thing can thus be said to be the identity of that particular thing: all of its properties, everything that makes it distinct from other (real or potential) things.
Classical vs. Quantum Information
For physical systems, we must distinguish between classical information and quantum information. Quantum information specifies the complete quantum state vector (equivalently, wavefunction) of a system, whereas classical information, roughly speaking, only picks out a quantum state if one is already given a prespecified set of distinguishable (orthogonal) quantum states to choose from; such a set forms a basis for the vector space of all possible (pure) quantum states. Quantum information can thus be considered to consist of (1) a choice of basis such that the actual quantum state is equal to one of the basis vectors, plus (2) the classical information specifying which of these basis vectors is the actual one.
Note that the amount of classical information in a quantum system gives the maximum amount of information that can actually be measured and extracted from that quantum system for use by external classical (decoherent) systems, since only basis states are operationally distinguishable from each other. The impossibility of differentiating between nonorthogonal states is a fundamental principle of quantum mechanics, equivalent to Heisenberg's uncertainty principle. Because of its more general utility, the remainder of this article deals primarily with classical information, although quantum information theory also has potential applications (quantum computing, quantum cryptography, quantum teleportation) that are being actively explored by theoreticians and experimentalists alike [1].
An amount of (classical) information may be quantified as follows [2]. For a system S, defined abstractly in such a way that it has N distinguishable states (orthogonal quantum states) consistent with its description, the amount of information I(S) contained in the system's state can be said to be log(N). The logarithm is chosen for this definition because it makes the measure of information content additive when concatenating independent, unrelated subsystems; e.g., if subsystem A has N distinguishable states (so that I(A) = log(N)) and an independent subsystem B has M distinguishable states (so that I(B) = log(M)), then the concatenated system has NM distinguishable states and an information content I(AB) = log(NM) = log(N) + log(M) = I(A) + I(B). We expect information to be additive from our everyday associations with the meaning of the word, e.g., that two pages of a book can contain twice as much information as one page.
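The additivity just described can be checked numerically. The following is a minimal sketch (not part of the original article), using base-2 logarithms so that information is measured in bits; the function name `info_content` is illustrative:

```python
import math

def info_content(num_states: int) -> float:
    """Information content I(S) = log2(N) in bits, for N distinguishable states."""
    return math.log2(num_states)

N, M = 8, 4                     # distinguishable states of subsystems A and B
I_A = info_content(N)           # 3.0 bits
I_B = info_content(M)           # 2.0 bits
I_AB = info_content(N * M)      # the concatenated system has N*M states

# Additivity: I(AB) = log(NM) = log(N) + log(M) = I(A) + I(B)
assert math.isclose(I_AB, I_A + I_B)
```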
The base of the logarithm used in this definition is arbitrary, since changing it affects the result by only a multiplicative constant, which determines the implied unit of information. If the log is taken base 2, the unit of information is the binary digit or bit (so named by John Tukey); if we use a natural logarithm instead, we might call the resulting unit the "nat." In magnitude, a nat is identical to Boltzmann's constant k or the ideal gas constant R, although these particular quantities are usually reserved for measuring physical information that happens to be entropy, and are expressed in physical units such as joules per kelvin or kilocalories per mole per kelvin.
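As an illustrative sketch (not from the article), the unit conversions above can be made concrete: one bit is ln 2 ≈ 0.693 nats, and a nat of entropy, expressed in conventional thermodynamic units, corresponds to Boltzmann's constant k ≈ 1.380649e-23 J/K:

```python
import math

NATS_PER_BIT = math.log(2)       # ln 2 ~= 0.6931 nats per bit
BOLTZMANN_K = 1.380649e-23       # J/K: one nat of entropy in physical units

def bits_to_nats(bits: float) -> float:
    """Convert an information quantity from bits to nats (rescale by ln 2)."""
    return bits * NATS_PER_BIT

def nats_to_joules_per_kelvin(nats: float) -> float:
    """Express an entropy given in nats in thermodynamic units: S = k * nats."""
    return nats * BOLTZMANN_K

# The entropy of one fully unknown bit, in J/K (~9.57e-24)
one_bit_entropy = nats_to_joules_per_kelvin(bits_to_nats(1.0))
```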
Physical Information and Entropy
An easy way to understand physical entropy itself is as follows: Entropy is simply that part of the (classical) physical information contained in a system whose identity (as opposed to amount) is unknown. This informal characterization fits von Neumann's formal definition of the entropy of a mixed quantum state, as well as Shannon's definition of the entropy of a probability distribution over classical states [2].
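Shannon's definition, mentioned above, quantifies this unknown information as H = -Σ p_i log p_i over a probability distribution. A minimal sketch (hypothetical helper, base 2 so entropy comes out in bits) showing the two extremes, a fully unknown state and a fully known one:

```python
import math

def shannon_entropy(probs) -> float:
    """Shannon entropy H = -sum(p_i * log2(p_i)) in bits; p_i == 0 terms contribute 0."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Fully unknown state among 4 equally likely alternatives: maximal entropy, 2.0 bits.
h_unknown = shannon_entropy([0.25, 0.25, 0.25, 0.25])

# Fully known state: the identity of all the information is known, so 0.0 bits of entropy.
h_known = shannon_entropy([1.0, 0.0, 0.0, 0.0])
```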
Even when the exact state of a system is known, we can say that the information in the system is still effectively entropy if that information is effectively incompressible, that is, if there are no known or feasibly determinable correlations or redundancies between different pieces of information within the system. Note that this definition can be viewed as equivalent to the previous one (unknown information) if we take a metaperspective and say that for observer A to know the state of system B means simply that there is a definite correlation between the state of observer A and the state of system B; this correlation could be used by a metaobserver to compress his description of the joint system AB [3].
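The metaobserver's compression argument can be illustrated with a toy calculation (a sketch under assumed uniform distributions, not from the article): if observer A's bit is perfectly correlated with system B's bit, the joint system AB carries only 1 bit of entropy rather than 2, so its description compresses:

```python
import math

def shannon_entropy(probs) -> float:
    """Shannon entropy H = -sum(p_i * log2(p_i)) in bits; p_i == 0 terms contribute 0."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Observer A and system B each hold one uniformly distributed bit.
H_A = shannon_entropy([0.5, 0.5])                    # 1.0 bit
H_B = shannon_entropy([0.5, 0.5])                    # 1.0 bit

# Independent A and B: joint distribution over (00, 01, 10, 11) is uniform.
H_AB_independent = shannon_entropy([0.25] * 4)       # 2.0 bits; no compression possible

# A "knows" B (perfect correlation): only the outcomes 00 and 11 ever occur.
H_AB_correlated = shannon_entropy([0.5, 0.0, 0.5, 0.0])  # 1.0 bit

# A metaobserver exploiting the correlation saves H_A + H_B - H_AB_correlated = 1 bit.
saved = H_A + H_B - H_AB_correlated
```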