How is Absolute Magnitude (M) standardized for comparison between stars?

Answer

By calculating how bright the star would appear if placed at 10 parsecs.

Absolute magnitude, symbolized as 'M', is a standardized measure of a star's true, intrinsic brightness that removes the confounding effect of distance. To standardize the measurement, astronomers pose a hypothetical scenario: they calculate what the star's apparent magnitude *would be* if the star were placed at a fixed reference distance from Earth. That reference distance is set universally at 10 parsecs, equivalent to approximately 32.6 light-years. By comparing stars according to how bright they would appear at this uniform separation, astronomers can directly compare their actual energy output and inherent luminosity, regardless of where they actually lie in the galaxy.
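This standardization can be written as the distance modulus relation, M = m - 5 * log10(d / 10 pc), where m is the apparent magnitude and d is the star's actual distance in parsecs. The short Python sketch below illustrates the conversion; the function name `absolute_magnitude` is just an illustrative choice, and the Sirius figures (m ≈ -1.46 at roughly 2.64 pc) are approximate.

```python
import math

def absolute_magnitude(apparent_mag: float, distance_pc: float) -> float:
    """Absolute magnitude via the distance modulus:
    M = m - 5 * log10(d / 10 pc)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10.0)

# Example: Sirius appears at m ~ -1.46 from its distance of ~2.64 pc.
# "Moved" out to the 10-parsec reference distance, it would dim to M ~ +1.4.
print(round(absolute_magnitude(-1.46, 2.64), 2))  # -> 1.43
```

Note that because Sirius is closer than 10 parsecs, its absolute magnitude is fainter (numerically larger) than its apparent magnitude; for a star farther than 10 parsecs the correction goes the other way.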
