Astronomers define two types of brightness when referring to the light emitted by a star or other celestial object. The apparent brightness is how bright the star appears to us in relation to other stars. In the 2nd century BC the Greek astronomer Hipparchus categorized all naked-eye stars into six brightness classes, numbered 1 through 6: a star of brightness 1 corresponds to the brightest stars visible, and a star of brightness 6 is just discernible to the naked eye. Despite the subjectivity of this system, it has remained in use to the present day. Apparent brightness depends strongly on distance: the Sun appears roughly four trillion times brighter than an identical star 30 light years away, simply because it is about two million times closer.
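As a quick check of that factor, here is a minimal sketch of the inverse-square arithmetic; the 1 AU Earth-Sun distance and the metre conversions are standard values, not given in the text:

```python
AU_M = 1.496e11      # one astronomical unit in metres (standard value)
LY_M = 9.461e15      # one light year in metres (standard value)

d_sun = 1.0 * AU_M   # Earth-Sun distance
d_twin = 30.0 * LY_M # distance to the hypothetical identical star

# Apparent brightness dilutes as 1/d^2, so the ratio is (d_twin/d_sun)^2.
ratio = (d_twin / d_sun) ** 2
print(f"The Sun appears {ratio:.1e} times brighter")  # ~3.6e12, roughly four trillion
```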
A more objective measure is the intrinsic brightness of a star, obtained by correcting the apparent brightness for the star's distance from the Earth. This correction is possible because we know exactly how the light from a given object is diluted as it spreads out in space: brightness falls off as the square of the distance. When each star is assigned an absolute brightness in this way, a direct and fair comparison between different stars is possible.
possible. Under this system of measurement the Sun has an absolute
brightness equivalent to 4.6 on Hipparchus' naked eye scale. This is
not particularly impressive in stellar terms but does constitute
4 x 10^26 Watts of energy continually bathing the earth. At this rate
of energy generation the Sun is expected to last for 10,000 million
years.
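The quoted lifetime can be checked with a simple energy budget: divide the Sun's usable nuclear energy reserve by its power output. A rough sketch, assuming the standard textbook figures that about 10% of the Sun's mass serves as core hydrogen fuel and that fusion converts about 0.7% of that mass into energy (neither figure is from the text):

```python
# Back-of-the-envelope check of the 10,000-million-year lifetime quoted above.
# The core fraction (10%) and fusion efficiency (0.7%) are standard textbook
# assumptions, not values given in the text.
M_SUN = 2.0e30       # solar mass in kg
C = 3.0e8            # speed of light in m/s
LUMINOSITY = 4.0e26  # power output in watts, from the text

fuel_mass = 0.10 * M_SUN                     # hydrogen available in the core
energy_reservoir = 0.007 * fuel_mass * C**2  # E = mc^2 at 0.7% efficiency

lifetime_s = energy_reservoir / LUMINOSITY
lifetime_yr = lifetime_s / 3.156e7           # seconds per year
print(f"Estimated lifetime: {lifetime_yr:.1e} years")  # ~1e10, i.e. 10,000 million
```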