How do distance errors grow as the target star moves farther away when using parallax?

Answer

Distance errors grow with the square of the distance

Stellar parallax determines distance from a fixed geometric relationship: the radius of Earth's orbit (1 AU) serves as the baseline, so the distance in parsecs is the reciprocal of the parallax angle in arcseconds, d = 1/p. (Observations taken six months apart span the orbit's full 2 AU diameter; the parallax angle is half the total apparent shift.) Because the angle is inversely proportional to distance, an object twice as far away shows half the parallax. Propagating a fixed measurement uncertainty σ_p through d = 1/p gives σ_d = σ_p/p² = σ_p·d², so the same instrumental noise (e.g., a 10 micro-arcsecond uncertainty) produces an absolute distance error that grows with the square of the distance and a *relative* error that grows in direct proportion to it. This severely limits the direct utility of optical parallax even within our own galaxy, and makes it useless for intergalactic distances.
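A minimal sketch of that error propagation, assuming a fixed 10 micro-arcsecond parallax uncertainty (the function names and sample distances below are illustrative, not from any library):

```python
# Sketch: propagate a fixed parallax uncertainty into a distance error
# using d = 1/p and first-order propagation sigma_d = sigma_p / p**2 = sigma_p * d**2.

SIGMA_P_ARCSEC = 10e-6  # assumed instrumental noise: 10 micro-arcseconds

def parallax_arcsec(distance_pc: float) -> float:
    """Parallax angle in arcseconds for a distance in parsecs: p = 1/d."""
    return 1.0 / distance_pc

def distance_error_pc(distance_pc: float, sigma_p: float = SIGMA_P_ARCSEC) -> float:
    """Absolute distance error: sigma_d = sigma_p * d**2 (grows quadratically)."""
    return sigma_p * distance_pc**2

if __name__ == "__main__":
    for d in (10, 100, 1_000, 10_000, 100_000):  # distances in parsecs
        sigma_d = distance_error_pc(d)
        rel = sigma_d / d  # relative error grows linearly with distance
        print(f"d = {d:>7,} pc  p = {parallax_arcsec(d):.2e}\"  "
              f"sigma_d = {sigma_d:>10,.1f} pc  relative = {rel:.1%}")
```

With these assumed numbers, a star at 100 pc has its distance pinned down to about 0.1%, but by 100,000 pc (roughly the distance to the nearest satellite galaxies) the relative error reaches 100% and the measurement carries no information.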
