In a galaxy undergoing steady-state star formation, what relationship should ideally exist between $L_{\text{UV}}$ and total $L_{\text{IR}}$?
They should ideally match.
The principle of energy balance dictates how the energy generated by young stars should be accounted for across the electromagnetic spectrum in a system in equilibrium. If star formation is proceeding at a steady state, the total energy emitted directly by the young stars (captured primarily in the ultraviolet, $L_{\text{UV}}$) must equal the total energy that has been absorbed by the surrounding dust and re-radiated at longer wavelengths (captured in the infrared, $L_{\text{IR}}$). If the UV luminosity is significantly less than the total IR luminosity ($L_{\text{UV}} \ll L_{\text{IR}}$), this discrepancy is a critical diagnostic: it indicates that a substantial portion of the energy output from current star formation remains hidden, deeply embedded within thick layers of obscuring dust.
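The energy-balance argument can be made quantitative with the dust-obscured fraction $L_{\text{IR}} / (L_{\text{UV}} + L_{\text{IR}})$. Below is a minimal sketch; the function name `obscured_fraction` and the luminosity values are illustrative assumptions, not from the source.

```python
def obscured_fraction(l_uv, l_ir):
    """Fraction of star-formation energy absorbed by dust and
    re-radiated in the infrared, assuming energy balance."""
    return l_ir / (l_uv + l_ir)

# Balanced case: L_UV ~ L_IR, so half the energy is dust-reprocessed.
print(obscured_fraction(1e10, 1e10))  # 0.5

# Dusty case: L_UV << L_IR, so most of the star formation is hidden.
print(obscured_fraction(1e10, 9e10))  # 0.9
```

When $L_{\text{UV}} \ll L_{\text{IR}}$ the fraction approaches unity, which is exactly the "hidden star formation" regime described above.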
