What is the relationship between apparent magnitude and radiative flux?

Given the radiative flux from a source, you should be able to calculate the source's apparent magnitude in the same spectral band, and vice versa. What is the equation for doing that?

2020-09-20T18:08:12Z

m = −2.5 log F − 18.99735, with F in W m⁻² (this uses the IAU 2015 bolometric zero point, F₀ = 2.518 × 10⁻⁸ W m⁻²)

check

The bolometric solar flux at 1 AU,

F = 1360.81 W m⁻²

So the bolometric apparent magnitude of the Sun at a distance of 1 AU is

m = −26.832

Looks right to me!
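
Here's a quick numeric check in Python (a sketch; the function name is mine, and the zero point is the IAU 2015 value):

import math

def bolometric_apparent_magnitude(flux_w_per_m2):
    # m = -2.5 log10(F / F0), with the IAU 2015 zero-point flux F0
    f0 = 2.518021002e-8  # W m^-2
    return -2.5 * math.log10(flux_w_per_m2 / f0)

print(bolometric_apparent_magnitude(1360.81))  # about -26.832, the Sun at 1 AU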

david · 2020-09-16T17:08:19Z

Favorite Answer

Good question. Back to the textbooks for me.

Anonymous · 2020-09-16T21:14:17Z

RF is an alternative fuel source for the Flux Capacitor.

? · 2020-09-16T19:46:36Z

Well, there are several quantities here.

There is the bolometric luminosity of the source --- the power of all the radiation emitted from the source in all directions over all wavelengths, measured in Watts.

Then there is the luminosity of the source at a particular wavelength --- the power of the radiation emitted from the source in all directions over a narrow band of wavelengths (or frequencies), measured in Watts per Angstrom or Watts per Hertz.
Call that "L". The integral of L over all frequencies (or wavelengths) is the bolometric luminosity. The "luminosity in a band of frequencies (or wavelengths)" is the average value of L within that band: the integral of L across the band, divided by the band's width. Because it's an average, the units do not change --- it's still Watts per Angstrom or Watts per Hertz.
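
To make the integral relation concrete, here is a sketch in Python (the blackbody model and all names are mine, purely illustrative): integrating the spectral luminosity of a blackbody sphere over frequency recovers the bolometric luminosity, 4 pi R^2 sigma T^4.

import numpy as np

h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23   # SI constants
sigma = 5.670374419e-8                                  # Stefan-Boltzmann constant

def spectral_luminosity(nu, radius_m, temp_k):
    # L_nu in W/Hz for a spherical blackbody: area 4 pi R^2 times pi B_nu(T)
    b_nu = (2 * h * nu**3 / c**2) / np.expm1(h * nu / (k * temp_k))
    return 4 * np.pi * radius_m**2 * np.pi * b_nu

R, T = 6.957e8, 5772.0                  # nominal solar radius (m) and temperature (K)
nu = np.logspace(12, 16.8, 200_000)     # Hz; spans the thermal spectrum
L_nu = spectral_luminosity(nu, R, T)
L_bol = np.sum(0.5 * (L_nu[1:] + L_nu[:-1]) * np.diff(nu))  # trapezoid rule
print(L_bol)                            # ~3.83e26 W ...
print(4 * np.pi * R**2 * sigma * T**4)  # ... matching the Stefan-Boltzmann value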

If you divide the source's output by its surface area, instead of considering the source as a whole, then you're talking about the power emitted per unit area of the source (its surface flux).

Now, consider what a distant observer sees --- that's the brightness, and it depends on how far away the observer is, since the radiation emitted falls off by the inverse-square law.  All of the luminosity of the source flows through the surface of a sphere whose radius is the distance between the source and the observer.  The observer captures a fraction of that luminosity that is proportional to the collecting area used by the observer, compared to the surface area of that sphere.  The "brightness" assumes a standard collecting area, say, a square meter.  So the brightness is measured in Watts per Hertz per square meter, the amount of the luminosity collected by the observer at a distance D from the source with a "unit" collecting area.  

B = L / (4 pi D^2), measured in Watts per Hertz per square meter. Since this tends to be a small number, astronomers often use the "Jansky": 1 Jy = 10^-26 Watts per Hertz per square meter --- the same kind of unit, only 10^26 times smaller. The "brightness" is also called the "spectral flux density".
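
A sketch of that conversion in Python (the function and the example luminosity are mine, just to show the scale):

import math

def flux_density_jy(l_nu_w_per_hz, distance_m):
    # B = L / (4 pi D^2) in W Hz^-1 m^-2, then convert to Jy (1 Jy = 1e-26 SI)
    b_si = l_nu_w_per_hz / (4 * math.pi * distance_m**2)
    return b_si * 1e26

pc = 3.0857e16  # metres per parsec
# a source with L_nu ~ 4e11 W/Hz (roughly the Sun near visible frequencies):
print(flux_density_jy(4e11, 10 * pc))  # ~33 Jy at 10 parsecs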

Now, the "apparent magnitude" (in the "AB" system) is simply the logarithm of the brightness.   If F is the brightness (or "spectral flux density" measured in Janskys (B * 10^26)), then

m_AB = -2.5 log(F / [1 Jy]) + 8.90 is the apparent magnitude of the source for the observer. (The +8.90 zero point corresponds to a flux density of 3631 Jy at m_AB = 0.)
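
The same relation in code (a sketch; the function name is mine):

import math

def ab_magnitude(flux_jy):
    # m_AB = -2.5 log10(F / 3631 Jy) = -2.5 log10(F / 1 Jy) + 8.90
    return -2.5 * math.log10(flux_jy) + 8.90

print(ab_magnitude(3631.0))  # ~0.0, the zero point of the AB scale
print(ab_magnitude(1.0))     # 8.90, a 1 Jy source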

The "absolute magnitude" is simply what the apparent magnitude would be, if the distance to the source were 10 parsecs.

All the above works for "point sources" that are too small for the observer to resolve.  If the object is resolved (a galaxy, for example), then the photometry measures the brightness per unit area of sky --- "Janskys per square arcsecond" or "magnitude per square arcsecond".   As a practical matter, that is determined by measuring the detector response to calibrator sources that are points, and then comparing that to the response of a square arcsecond's worth of pixels.
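
For resolved sources the same zero point applies per unit solid angle. A sketch with illustrative numbers:

import math

def surface_brightness_mag(flux_jy_per_arcsec2):
    # AB magnitudes per square arcsecond, same zero point as the point-source case
    return -2.5 * math.log10(flux_jy_per_arcsec2) + 8.90

# ~1e-5 Jy per square arcsecond works out to ~21.4 mag/arcsec^2,
# the same order as the dark night-sky background
print(surface_brightness_mag(1e-5))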
