The SN rate within a typical local starburst galaxy is $\sim 0.05\ \mathrm{yr}^{-1}$ (Mattila & Meikle 2001), which implies a total number of SNe exceeding $\sim 2 \times 10^{6}$ over the lifetime of a starburst event. How much of the kinetic energy from all these SNe is available to drive the wind? This is of particular interest with respect to the possible role SN-driven winds play in imparting additional heating to the IGM and ICM (see various discussions in this volume).
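For orientation, this total follows directly from the quoted rate if one assumes a burst duration of order $4 \times 10^{7}$ yr (a fiducial value adopted here purely for illustration, not stated explicitly above):
$$N_{\rm SN} \sim 0.05\ \mathrm{yr}^{-1} \times 4 \times 10^{7}\ \mathrm{yr} \approx 2 \times 10^{6}.$$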
It is useful to split the problem into two parts, firstly radiative losses within the starburst region, and secondly radiative losses within the larger-scale superwind. The second of these can be assessed relatively straightforwardly with observations of local superwinds, and will be addressed in Section 6. The radiative losses young SNRs suffer within the starburst region are extremely difficult to determine observationally, as these regions are heavily obscured. Consequently arguments about radiative energy losses are based purely on theory and a wide range of opinions exist. I shall present the situation as I see it, and refer the reader to the contribution by Recchi (this volume) for a different point of view.
A single isolated SNR, evolving in a uniform medium of number density $\sim 1\ \mathrm{cm}^{-3}$, will lose a fraction $f \sim 90\%$ of its initial kinetic energy to radiation over $\sim 4 \times 10^{5}$ years (Thornton et al. 1998). Adopting the terminology of Chevalier & Clegg (1985), this corresponds to a thermalization efficiency of $\epsilon_{\rm therm} = 1 - f \sim 10\%$. Cooling depends sensitively on the local density, scaling as $n^{2}\,dV$.
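Written out as a reminder (this is the standard expression, not a model-specific one), the luminosity radiated by gas with cooling function $\Lambda(T)$ is
$$L_{\rm rad} = \int n_{e}\,n_{\rm H}\,\Lambda(T)\,dV,$$
so the losses are dominated by the densest material the remnant encounters.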
Many authors assume that because bursts of star formation occur
in regions with large amounts of dense gas, virtually all
the energy from SNe is lost due to radiation (e.g.
Steinmetz 1999).
This ignores both the multiphase nature of the ISM, in which the filling factor of dense gas is low, and the fact that the phase structure is itself determined by the local SN rate (Rosen & Bregman 1995).
In a starburst region such as the center of M82 or NGC 253, the SN rate per unit volume is a few $\times 10^{-9}\ \mathrm{yr}^{-1}\ \mathrm{pc}^{-3}$, about 5 orders of magnitude higher than in the disk of the Milky Way ($\sim 4 \times 10^{-14}\ \mathrm{yr}^{-1}\ \mathrm{pc}^{-3}$; Slavin & Cox 1993).
The average individual SNR or wind-blown bubble in a starburst does not survive long enough to radiate away 90% of its energy before it runs into another remnant or a pre-blown low-density cavity. Once it is expanding into a low-density medium, radiative losses cease to be significant (see Mac Low & McCray 1988).
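A rough illustrative estimate makes the point. Taking an assumed fiducial remnant radius of $R \sim 50$ pc and the starburst SN rate per unit volume quoted above, the waiting time before another SN explodes within a given remnant's volume is
$$t_{\rm overlap} \sim \left[ 3 \times 10^{-9}\ \mathrm{yr}^{-1}\ \mathrm{pc}^{-3} \times \tfrac{4\pi}{3}\,(50\ \mathrm{pc})^{3} \right]^{-1} \sim 10^{3}\ \mathrm{yr},$$
orders of magnitude shorter than the $\sim 4 \times 10^{5}$ yr required to radiate away most of its energy.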
As a consequence the thermalization
efficiency in starbursts must
be considerably higher than the 10%
value applicable to "normal" star-forming disks.
Numerical simulations investigating thermalization efficiency as a function
of SN rate/volume support this argument (Strickland, in preparation).
Some SNe may occur within molecular cores, and suffer significant radiative losses, but on average SNe in the starburst do not lose a large fraction of their energy. Thermalization efficiencies $\epsilon_{\rm therm} \gtrsim 50\%$ are quite possible.
In principle, observationally measuring the temperature of the very hot, tenuous gas in the starburst region (i.e. the thermalized SN ejecta) can directly provide the thermalization efficiency. Using the rates of mass and energy input from the Starburst99 models (Leitherer et al. 1999), $T_{\rm gas} = 1.2 \times 10^{8}\,\epsilon_{\rm therm}\,\beta^{-1}$ K, where $\beta \geq 1$ is a measure of mass-loading (Suchkov et al. 1996).
The faint X-ray emission from this hot gas can, in principle, be detected in nearby starburst galaxies. Unfortunately starburst regions also host large numbers of X-ray binaries, and possibly low-luminosity AGN, making this measurement extremely difficult.
The first believable detection of this very hot gas makes use of Chandra's high spatial resolution to resolve out the X-ray binaries. Griffiths et al. (2000) claim to detect diffuse emission from $T \sim 5 \times 10^{7}$ K gas within M82's starburst region, which if confirmed implies $\epsilon_{\rm therm} \sim 40\,\beta\,\%$, i.e. $\gtrsim 40\%$ given $\beta \geq 1$.
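For reference, this value follows directly from rearranging the temperature relation above and inserting the measured temperature:
$$\epsilon_{\rm therm} = \beta\,\frac{T_{\rm gas}}{1.2 \times 10^{8}\ \mathrm{K}} \approx \beta\,\frac{5 \times 10^{7}}{1.2 \times 10^{8}} \approx 0.4\,\beta,$$
so any mass-loading ($\beta > 1$) can only raise the inferred efficiency above $\sim 40\%$.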