
Absolute Magnitude


The science of stars has been studied for centuries. In the past, people relied only on their eyes to observe the stars; nowadays, we use telescopes and other modern technology to help us. When we look up at the stars from Earth, we can tell which ones appear brighter. However, appearance alone doesn't tell us how much radiation a star actually emits: we also need to consider the distance between us and the star, and the possibility of other matter getting in the way. To measure the intensity of a star's emissions as we see them, we use a quantity called "apparent magnitude". To compare the true brightness of stars, independent of their distance from us, we use "absolute magnitude".

Luminosity

Luminosity is the amount of electromagnetic radiation that a body emits in a certain amount of time. This includes all frequencies of the electromagnetic spectrum, not just the visible spectrum. Measuring luminosity can be tricky in astronomy because of two main factors.

First, the radiation emitted by a body spreads out and covers a large surface area. So, the radiation we receive at a specific point is only a small part of the total emitted. To measure luminosity, we need to know the distance between us and the emitting body to extrapolate the total amount of radiation emitted.
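As a quick illustration of this inverse-square spreading, here is a minimal Python sketch (the function names and example values are our own, not from the article) relating a star's total luminosity to the flux measured at a given distance, assuming no extinction:

```python
import math

def flux_from_luminosity(luminosity_w: float, distance_m: float) -> float:
    """Flux (W/m^2) received at distance_m from a source of total power
    luminosity_w, assuming the radiation spreads evenly over a sphere
    of radius distance_m (inverse-square law, no extinction)."""
    return luminosity_w / (4 * math.pi * distance_m**2)

def luminosity_from_flux(flux_w_m2: float, distance_m: float) -> float:
    """Invert the inverse-square law: extrapolate the total emitted
    power from the flux measured at a known distance."""
    return flux_w_m2 * 4 * math.pi * distance_m**2

# Example: the Sun's luminosity (~3.828e26 W) observed from 1 au
# (~1.496e11 m) gives the familiar "solar constant" of ~1361 W/m^2.
print(flux_from_luminosity(3.828e26, 1.496e11))  # ~1361 W/m^2
```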

Second, space is not empty. Over large distances, like those between planets, stars, and galaxies, radiation can be absorbed by dust and gas clouds. This causes the intensity of the radiation to decrease, which is called extinction. Higher frequencies are affected more than lower ones. See Figure 1 for a helpful illustration.

Figure 1: The radiation spread over spherical surfaces

Luminosity is measured in watts (W) and, assuming that stars emit as black bodies, depends on the surface area of the body and its temperature. To assume that stars emit as black bodies means that we consider their emission and absorption properties to be perfect, with no losses. This assumption turns out to be very accurate for stars.
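Under the black-body assumption, luminosity follows the Stefan-Boltzmann law, L = 4πR²σT⁴. Here is a short Python sketch of that relation (a minimal illustration of ours; the Sun's radius and temperature are standard approximate values, not figures from the article):

```python
import math

STEFAN_BOLTZMANN = 5.670374419e-8  # sigma, in W / (m^2 K^4)

def blackbody_luminosity(radius_m: float, temperature_k: float) -> float:
    """Stefan-Boltzmann law: total power radiated by a spherical
    black body of the given radius and surface temperature."""
    surface_area = 4 * math.pi * radius_m**2
    return surface_area * STEFAN_BOLTZMANN * temperature_k**4

# Example: the Sun (R ~ 6.957e8 m, T ~ 5772 K) comes out near its
# measured luminosity of ~3.8e26 W, showing the assumption is accurate.
print(blackbody_luminosity(6.957e8, 5772))  # ~3.8e26 W
```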

Apparent magnitude

As far back as the second century BCE, Hipparchus sorted stars based on their brightness in the sky. He used a scale from one to six, with one being the brightest and six being the dimmest. Then in 1856, the astronomer Norman Pogson formalized this system by defining a magnitude-one star to be exactly 100 times brighter than a magnitude-six star. This definition makes apparent brightness as seen from Earth a logarithmic scale: each step of one magnitude corresponds to a brightness ratio of 100^(1/5) ≈ 2.512. One benefit of using a logarithmic scale is that once we determine a fixed reference point, like the star Vega, we can use it to define the rest of the scale. Vega is assigned an apparent magnitude of 0, meaning it's 2.512 times brighter than a magnitude-one star.

The term "apparent" refers to the fact that we measure brightness from Earth, without worrying about extinction or radiation spread. We can simply compare the flux of radiation per unit of area on a logarithmic scale.

The formula for apparent magnitude, with Vega assigned a value of 0, is:

m = -2.5·log_10(F/F_V)

Here, m is the apparent magnitude, F is the flux of radiation received from the object per unit of time and area, and F_V is the corresponding flux received from Vega.

Using a logarithmic scale means that magnitudes can be negative. For example, Sirius, the brightest star in our sky, has an apparent magnitude of -1.46, which means it's 2.512^1.46 ≈ 3.8 times brighter than Vega.
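The Pogson relation is easy to play with in code. This Python sketch (our own illustration, not from the article) converts between flux ratios and apparent magnitudes:

```python
import math

def apparent_magnitude(flux: float, flux_vega: float) -> float:
    """Pogson relation with Vega as the zero point:
    m = -2.5 * log10(F / F_Vega)."""
    return -2.5 * math.log10(flux / flux_vega)

def flux_ratio(m1: float, m2: float) -> float:
    """How many times brighter an object of magnitude m1 is than one
    of magnitude m2 (smaller magnitude means brighter)."""
    return 10 ** (0.4 * (m2 - m1))

# Sirius (m = -1.46) versus Vega (m = 0):
print(flux_ratio(-1.46, 0.0))  # ~3.8, i.e. 2.512**1.46
# Magnitude 1 versus magnitude 6: the defining factor of 100.
print(flux_ratio(1.0, 6.0))    # ~100.0
```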

The benefits of a logarithmic scale

A logarithmic scale for magnitudes helps to recover a linear behavior when dealing with phenomena that involve multiplicative effects that generate large numbers. This approach also allows us to work with relatively small numbers, making data more manageable.

To remove the dependence on Earth's particular position in space, there is a useful measure of a star's intensity that involves the distance to the object under study. Computing it involves the large numbers typical of astronomical distances, which can again be tamed using a logarithmic scale.
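To see how the logarithm turns multiplicative steps into additive ones, consider this small Python demonstration (the flux values are round numbers we chose for illustration):

```python
import math

# Fluxes spanning six orders of magnitude (arbitrary units).
fluxes = [1, 100, 10_000, 1_000_000]

# Each factor of 100 in brightness is exactly 5 magnitudes, so equal
# multiplicative steps in flux become equal additive steps in magnitude.
for flux in fluxes:
    magnitude = -2.5 * math.log10(flux)
    print(f"flux {flux:>9} -> magnitude {magnitude:6.1f}")
# flux 1 -> 0.0, flux 100 -> -5.0, flux 10000 -> -10.0, ...
```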

A comparison between a linear scale and a logarithmic scale

Absolute magnitude

While apparent magnitude is a useful measure of the brightness of an astronomical object as seen from Earth, it depends on the observer's distance and therefore tells us little about the object's actual properties. Absolute magnitude, on the other hand, is the apparent magnitude an object would have if observed from a distance of 10 parsecs, and it is closely related to the object's luminosity. However, it does not take into account extinction factors, which can affect the accuracy of the measurement.

The formula for absolute magnitude is M = m - 5·log_10(d/10), where M is the absolute magnitude, m is the apparent magnitude, and d is the distance between the Earth and the object in parsecs.

For example, Sirius (the brightest star in our sky) has an apparent magnitude of -1.46, while Antares (an intermediate giant star with a huge luminosity, but much farther away from Earth) has an apparent magnitude of 1.09. However, their absolute magnitudes are 1.42 and -5.28, respectively, reflecting the much higher luminosity of Antares.
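This conversion is straightforward to reproduce in Python (a sketch of ours; the distances of about 2.64 pc for Sirius and 170 pc for Antares are approximate values we are assuming, so the results differ slightly from the quoted magnitudes):

```python
import math

def absolute_magnitude(m_apparent: float, distance_pc: float) -> float:
    """M = m - 5 * log10(d / 10), with the distance d in parsecs."""
    return m_apparent - 5 * math.log10(distance_pc / 10)

# Sirius: bright in our sky mostly because it is close.
print(absolute_magnitude(-1.46, 2.64))  # ~1.43 (the article quotes 1.42)

# Antares: appears dimmer, but is intrinsically far more luminous.
print(absolute_magnitude(1.09, 170.0))  # ~-5.1 (the article quotes -5.28)
```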

An illustration of the meaning of apparent magnitude (m) and absolute magnitude (M)

In summary, luminosity is the amount of electromagnetic radiation emitted by a body per unit of time. Since measuring luminosity directly is difficult, astronomers use magnitudes as logarithmic measures of brightness. Apparent magnitude measures the flux of radiation received at Earth, while absolute magnitude eliminates the dependence on the object's distance from Earth. Both magnitudes have biases, such as the absence of an extinction correction, but they are useful quantities in astronomical studies.

Absolute Magnitude

What does absolute magnitude mean?

Absolute magnitude is a measure of the luminosity of a star on a logarithmic scale, and it is defined as the apparent magnitude of an object measured from 10 parsecs away.

How do we calculate the luminosity from the absolute magnitude?

After choosing a reference value for our scale, we can invert the logarithmic relation and solve for the luminosity. Using the Sun as the reference, with absolute magnitude M_sun ≈ 4.83, the luminosity is L = L_sun·10^((M_sun - M)/2.5).
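As a minimal Python sketch of this inversion, using the Sun as the reference point (the value M_sun ≈ 4.83 is a standard approximate figure we are assuming, not taken from the article):

```python
SUN_ABSOLUTE_MAGNITUDE = 4.83  # visual band, approximate
SUN_LUMINOSITY_W = 3.828e26

def luminosity_from_absolute_magnitude(m_abs: float) -> float:
    """Invert M = M_sun - 2.5 * log10(L / L_sun) to get L in watts."""
    return SUN_LUMINOSITY_W * 10 ** ((SUN_ABSOLUTE_MAGNITUDE - m_abs) / 2.5)

# Sirius (M ~ 1.42) comes out roughly 23 times as luminous as the Sun.
print(luminosity_from_absolute_magnitude(1.42) / SUN_LUMINOSITY_W)  # ~23
```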

How do we calculate the absolute magnitude?

We have to know the object's apparent magnitude and its distance from the Earth in parsecs. Then, we apply the formula: M = m - 5·log_10(d) + 5.

What is the difference between absolute magnitude and apparent magnitude?

The difference is that apparent magnitude is defined as a measure taken from the Earth, while absolute magnitude is defined as a measure taken from a distance of 10 parsecs from the object.

Why do we need the distance to compute the absolute magnitude?

Because we need to take into account how the intensity of the electromagnetic radiation spreads out as it propagates through space: the flux we receive depends on how far away the object is.
