How should orbital velocities behave far from a galactic core based on visible matter and Newtonian gravity?

Answer

Velocities should decrease as distance increases, specifically scaling according to $v \propto 1/\sqrt{R}$.

When considering only the luminous matter in a galaxy—the stars, dust, and gas—orbital speeds should follow established Newtonian principles, analogous to planetary orbits around the Sun. Because most of this visible, or baryonic, mass is concentrated in the bulge and inner disk, a star far out in the disk effectively orbits a central point mass $M$. Equating the gravitational and centripetal forces, $GMm/R^2 = mv^2/R$, gives $v = \sqrt{GM/R}$: for a fixed enclosed mass, the orbital velocity at a large radius $R$ must fall off as $v \propto 1/\sqrt{R}$. This expected drop-off is the standard Keplerian prediction for centrally concentrated mass distributions, and it stands in direct contradiction to the observed flatness of galactic rotation curves.
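The Keplerian falloff can be illustrated numerically. The sketch below assumes a point-mass approximation for the galaxy, with an illustrative enclosed mass of $10^{11}$ solar masses (a value chosen for demonstration, not taken from any particular galaxy):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
KPC = 3.086e19     # meters per kiloparsec

def keplerian_velocity(M, R):
    """Circular orbital speed from GMm/R^2 = m v^2 / R, i.e. v = sqrt(GM/R)."""
    return math.sqrt(G * M / R)

# Illustrative galaxy: treat 1e11 solar masses as centrally concentrated
M = 1e11 * M_SUN

v_10 = keplerian_velocity(M, 10 * KPC)   # speed at 10 kpc
v_40 = keplerian_velocity(M, 40 * KPC)   # speed at 40 kpc

# Quadrupling R should halve v, since v ∝ 1/sqrt(R)
print(v_10 / v_40)  # ≈ 2.0
```

Observed rotation curves instead stay roughly flat out to large radii, which is why the enclosed mass inferred from dynamics exceeds the luminous mass.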
