| App. mag. | Celestial object |
|---|---|
| -26.8 | Sun |
| -12.6 | Full Moon |
| -4.4 | Maximum brightness of Venus |
| -2.8 | Maximum brightness of Mars |
| -1.5 | Brightest star: Sirius |
| -0.7 | Second brightest star: Canopus |
| +6.0 | Faintest stars observable with the naked eye |
| +12.6 | Brightest quasar |
| +30 | Faintest objects observable with the Hubble Space Telescope |

(see also List of brightest stars)
The scale on which magnitude is measured is a somewhat strange one. It has its roots in the tradition of dividing those stars visible to the naked eye into six magnitudes. The brightest stars are said to be of first magnitude, the next brightest are of second magnitude, and so on down to sixth magnitude, the limit of naked eye visibility. This somewhat crude method of indicating the brightness of stars was popularized by Ptolemy in his Almagest, and is generally believed to have originated with Hipparchus.
In 1856, Norman R. Pogson noticed that the traditional system could be approximated by assuming that a difference of one magnitude corresponds to a brightness ratio equal to the fifth root of 100, so that a typical first magnitude star is 100 times brighter than a typical sixth magnitude star. The fifth root of 100 used in this scale is known as Pogson's Ratio, and is approximately equal to 2.512. Pogson's scale was originally fixed by assigning Polaris a magnitude of exactly 2. Astronomers have since discovered that Polaris is slightly variable, so now Vega is the standard reference star, but the principle remains the same.
The apparent magnitude in the band x can be defined as

m_x = -2.5 \log_{10} F_x + C

where F_x is the observed flux in the band x, and C is a constant that depends on the units of the flux and on the band.
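This definition can be turned into a minimal Python sketch, assuming the constant C is expressed through a hypothetical zero-point flux (the flux of a magnitude-zero reference object in the same band and units); the function and parameter names are illustrative only:

```python
import math

def apparent_magnitude(flux, zero_point_flux):
    """Apparent magnitude from an observed flux in some band.

    zero_point_flux is the flux of the magnitude-zero reference object in the
    same band and units; it plays the role of the constant C above, since
    C = 2.5 * log10(zero_point_flux).
    """
    return -2.5 * math.log10(flux / zero_point_flux)

# An object 100 times fainter than the zero-point reference is magnitude +5.
print(apparent_magnitude(1.0, 100.0))  # -> 5.0
```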
The first thing to notice about this scale is that higher numbers correspond to dimmer objects, and the brightest objects have negative magnitudes. For example, Sirius, the brightest star in the night sky, has an apparent magnitude of -1.46.
The second thing to notice is that the scale is logarithmic: the relative brightness of two objects is determined by the difference of their magnitudes. For example, a difference of 3.2 means that one object is about 19 times as bright as the other, because Pogson's ratio raised to the power 3.2 is 19.054607... The logarithmic nature of the scale reflects the fact that the human eye itself has a roughly logarithmic response to brightness.
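The conversion between a magnitude difference and a brightness ratio can be sketched in a few lines of Python; the function name is an assumption made for the example:

```python
import math

POGSON_RATIO = 100 ** (1 / 5)  # ~2.512, the brightness ratio of one magnitude step

def brightness_ratio(mag_a, mag_b):
    """How many times brighter the object with magnitude mag_a is than the one with mag_b.

    Lower magnitudes mean brighter objects, so the exponent is mag_b - mag_a.
    """
    return POGSON_RATIO ** (mag_b - mag_a)

print(brightness_ratio(1.0, 6.0))  # five magnitudes apart -> ~100
print(brightness_ratio(0.0, 3.2))  # ~19.05, the example given in the text
```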
Magnitude is complicated by the fact that light is not monochromatic. The sensitivity of a light detector varies according to the wavelength of the light, and the way in which it varies depends on the type of light detector. For this reason, it is necessary to specify how the magnitude is measured in order for the value to be meaningful. For this purpose the UBV system is widely used, in which the magnitude is measured in three different wavelength bands: U (centred at about 350 nm, in the near ultraviolet), B (about 435 nm, in the blue region) and V (about 555 nm, in the middle of the human visual range). The V band was chosen so that it gives magnitudes closely corresponding to those seen by the human eye, and when an apparent magnitude is given without any further qualification, it is usually the V magnitude that is meant, also called visual magnitude.
Since cooler stars, such as red giants and red dwarfs, emit little energy in the blue and ultraviolet regions of the spectrum, their output is often under-represented by the UBV scale. Indeed, some L and T class stars would have a UBV magnitude of well over 100, since they emit extremely little visible light but are brightest in the infra-red.
Because the measured magnitude depends on the detector, it is extremely important to measure like with like. On photographic film, the relative brightnesses of the blue supergiant Rigel and the red supergiant Betelgeuse are reversed compared to what our eyes see, since film is more sensitive to blue light than it is to red light.
For an object with a given absolute magnitude, 5 is added to the apparent magnitude each time the distance to the object is multiplied by 10; equivalently, m = M + 5 \log_{10}(d / 10\,\mathrm{pc}), where d is the distance in parsecs.
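A short Python sketch of this relation, under the standard convention that absolute magnitude is the apparent magnitude at a distance of 10 parsecs (the function name is illustrative; the Sun's absolute visual magnitude of about 4.83 is used as the example value):

```python
import math

def apparent_from_absolute(absolute_mag, distance_pc):
    """Apparent magnitude of an object with the given absolute magnitude,
    seen from distance_pc parsecs.

    Absolute magnitude is referenced to 10 parsecs, so multiplying the
    distance by 10 adds exactly 5 to the apparent magnitude.
    """
    return absolute_mag + 5 * math.log10(distance_pc / 10.0)

print(apparent_from_absolute(4.83, 10.0))   # the Sun seen from 10 pc: 4.83
print(apparent_from_absolute(4.83, 100.0))  # ten times farther: 9.83
```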