I came across luminosity for the first time in an astronomical context whilst reading the paper I'd downloaded about HD 140283 (Howard E. Bond et al., found via http://asterisk.apod.com/viewtopic.php?f=8&t=30787), and I wonder why it was used rather than magnitude.

In astronomy, luminosity measures the total amount of energy emitted per unit time by a star or other astronomical object, in SI units of joules per second, i.e. watts. A watt is a unit of power, and just as a light bulb is rated in watts, so too is the Sun, which has a total power output of 3.846×10^26 W. It is this number which constitutes the basic metric used in astronomy, known as 1 solar luminosity...
Radiant power, however, is not the only way to conceptualize brightness, so other metrics are also used. The most common is apparent magnitude, which is the perceived brightness of an object as seen by an observer on Earth at visible wavelengths. Others are absolute magnitude, which is an object's intrinsic brightness at visible wavelengths, irrespective of distance, and bolometric magnitude, which measures the total power output across all wavelengths.
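In case a worked example helps, here is a minimal sketch of how luminosity and bolometric magnitude relate. It assumes the conventional zero point M_bol,Sun ≈ 4.74 for the Sun's bolometric magnitude; the function name and structure are just illustrative:

```python
import math

# Assumed zero point: the Sun's bolometric magnitude (IAU convention)
M_BOL_SUN = 4.74

def bolometric_magnitude(luminosity_in_solar_units):
    """Convert a luminosity (in units of solar luminosity) to a
    bolometric magnitude via M = M_sun - 2.5 * log10(L / L_sun).
    Note the scale is inverted: brighter objects have smaller magnitudes."""
    return M_BOL_SUN - 2.5 * math.log10(luminosity_in_solar_units)

print(bolometric_magnitude(1.0))    # the Sun itself: 4.74
print(bolometric_magnitude(100.0))  # 100 solar luminosities: -0.26
```

So a factor of 100 in power corresponds to exactly 5 magnitudes, which is why the logarithmic magnitude scale is convenient for the huge range of stellar brightnesses, while luminosity keeps the physics in plain watts.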
Could anyone give me some guidelines about when each of these metrics is used?
Many thanks
Margarita