What standardized measure defines a star’s true brightness by hypothetically placing it at a fixed distance of 10 parsecs?

Answer

Absolute Magnitude

Astronomers distinguish between a star's perceived brightness and its intrinsic power output. Apparent Magnitude measures how bright a star looks from Earth, which depends on its distance; Absolute Magnitude removes that dependence. Absolute Magnitude is defined as the apparent magnitude a star *would* have if it were placed at a standard distance of 10 parsecs (about $32.6$ light-years) from Earth. This standardization is crucial because it allows a direct, distance-independent comparison of the total energy radiated by different stars.
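The two scales are connected by the standard distance-modulus relation, where $m$ is the apparent magnitude, $M$ is the absolute magnitude, and $d$ is the star's distance in parsecs:

$$m - M = 5 \log_{10}\!\left(\frac{d}{10\,\text{pc}}\right)$$

For example, a star at $100$ parsecs appears exactly $5$ magnitudes fainter than its absolute magnitude, since $5\log_{10}(100/10) = 5$; a star closer than 10 parsecs appears *brighter* than its absolute magnitude.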

Tags: astronomy, star, celestial body, characteristic