The number zero is the size of the empty set: if you don't have any apples, then you have zero apples. In fact, in certain axiomatic developments of mathematics from set theory, zero is defined to be the empty set.
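As an illustration (a minimal sketch, assuming the von Neumann construction of the natural numbers, in which each number is the set of all smaller numbers), zero can be modelled as the empty set:

    # Von Neumann naturals: 0 = {}, 1 = {0}, 2 = {0, 1}, ...
    # frozenset is used because Python set elements must be hashable.

    def successor(n: frozenset) -> frozenset:
        """Return n ∪ {n}, the next natural number in this encoding."""
        return n | frozenset({n})

    zero = frozenset()       # the empty set plays the role of 0
    one = successor(zero)    # {0}
    two = successor(one)     # {0, 1}

    assert len(zero) == 0    # having zero apples: a set with no elements
    assert len(two) == 2     # each encoded number has that many elements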
The numeral or digit zero is used in positional number systems, where the position of a digit determines its value, with successive positions having higher values, and the digit zero is used to skip a position. By about 300 BCE the Babylonians used two slanted wedges to mark an empty place within a sequence of positional digits; this placeholder did not, however, function as a number in its own right. The use of zero as a number in its own right entered mathematics relatively late, appearing in Indian mathematical texts in the early 800s.
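To make the placeholder role concrete, here is a minimal sketch (the digits and base are illustrative assumptions) showing that a zero digit contributes nothing itself but still shifts the other digits into higher-valued positions:

    def positional_value(digits: list[int], base: int = 10) -> int:
        """Interpret a digit sequence (most significant first) in the given base."""
        value = 0
        for d in digits:
            value = value * base + d
        return value

    # 2, 0, 5 in base 10: the zero adds nothing to the total,
    # but without it the 2 would land in the tens place instead of the hundreds.
    assert positional_value([2, 0, 5]) == 205
    assert positional_value([2, 5]) == 25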
In certain calendars it is common to omit the year zero when extending the calendar to years prior to its introduction: see proleptic Gregorian calendar and proleptic Julian calendar.
The following are some basic rules for dealing with the number zero. These rules apply to any complex number x, unless otherwise stated.
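As a sketch of the usual identities (the treatment of x/0 and of 0^0 varies by context and is only summarized here):

    \begin{align*}
    x + 0 &= x, & x - 0 &= x, & 0 - x &= -x,\\
    x \cdot 0 &= 0, & 0/x &= 0 \quad (x \neq 0), & x/0 &\ \text{is undefined},\\
    x^0 &= 1 \quad (0^0 \text{ is sometimes left undefined}), & 0^x &= 0 \quad (\text{real } x > 0).
    \end{align*}

Division by zero is left undefined because no complex number y satisfies 0 · y = x when x is nonzero.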
In an additive group, the identity element is called zero. In geometry, the dimension of a point is 0. A probability of 0 denotes no chance of an event occurring, although for a continuous random variable an event of probability 0 is not necessarily impossible; see continuous random variable.
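For instance (a minimal illustration; the uniform distribution is chosen only as an example):

    For $X$ distributed uniformly on $[0,1]$,
    \[
    P\bigl(X = \tfrac{1}{2}\bigr) = \int_{1/2}^{1/2} 1 \, dt = 0,
    \]
    yet the outcome $X = \tfrac{1}{2}$ is possible, so probability $0$ does not
    imply impossibility for continuous distributions.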
If your zero is centre-dotted and letter-O is not, or if letter-O looks almost rectangular but zero looks more like an American football stood on end (or the reverse), you're probably looking at a modern character display (though the dotted zero seems to have originated as an option on IBM 3270 controllers). If your zero is slashed but letter-O is not, you're probably looking at an old-style ASCII graphic set descended from the default typewheel on the venerable ASR-33 Teletype (Scandinavians, for whom Ø is a letter, curse this arrangement).
If letter-O has a slash across it and the zero does not, your display is tuned for a very old convention used at IBM and a few other early mainframe makers (Scandinavians curse this arrangement even more, because it means two of their letters collide). Some Burroughs/Unisys equipment displays a zero with a reversed slash. And yet another convention common on early line printers left zero unornamented but added a tail or hook to the letter-O so that it resembled an inverted Q or cursive capital letter-O.
The typeface used on European number plates for cars distinguishes the two symbols by making the O rather egg-shaped and the zero more rectangular, but most of all by opening the zero on the upper right side, so that its outline is no longer a closed curve.
Counting from one or zero? Human beings usually count things from 1, not 0, yet in computer science zero has become the usual starting point for counting. In many early programming languages an array starts at index 1 by default, which matches everyday counting; as programming languages evolved, starting arrays at index 0 became the more common default. The reasons are debated, but a commonly cited one is that a zero-based index corresponds directly to an offset from the start of the array in memory, with the first element at offset 0, as sketched below.
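A minimal sketch of the offset view (the array contents and element size here are illustrative assumptions):

    # With zero-based indexing, index i equals the number of elements that
    # precede element i, so its byte offset is simply i * element_size
    # with no correction term.

    element_size = 4                      # assume 4-byte elements
    values = [10, 20, 30, 40]

    for i, v in enumerate(values):        # enumerate counts from 0 by default
        offset = i * element_size
        print(f"index {i} -> byte offset {offset}, value {v}")

    # One-based indexing would instead need (i - 1) * element_size.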
As x approaches 0, the magnitude of 1/x grows without bound; compare with the concept of a pole.
To zero means to erase; to discard all data from something. This is often said of disks and directories, where "zeroing" need not involve actually writing zeroes throughout the area being zeroed. One may speak of something being "logically zeroed" rather than "physically zeroed".
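A minimal sketch of the distinction (the buffer contents and file name are illustrative assumptions): physically zeroing overwrites every byte, while logically zeroing merely discards the data without touching what may remain underneath.

    # Physically zeroing: overwrite every byte of the buffer with 0x00.
    buf = bytearray(b"secret data")
    for i in range(len(buf)):
        buf[i] = 0
    assert buf == bytearray(len(buf))     # all bytes are now zero

    # Logically zeroing: truncate the file to length 0; the old bytes are
    # no longer reachable through the file, though they may still sit on disk.
    with open("scratch.dat", "wb") as f:
        pass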
Many human cultures have created concepts that express zero-ness, such as: