
6. X-RAY ABSORPTION VERSUS OPTICAL AND INFRARED ABSORPTION

Various studies have shown that X-ray and optical nuclear absorption do not match in AGNs. In particular, the measured optical dust extinction is systematically lower than that inferred from the column density NH measured in the X-rays, assuming a Galactic gas-to-dust ratio [Maiolino et al. (2001a)]. An important consequence of this effect is the mismatch between optical and X-ray classifications, and in particular the existence of type 1, broad-line AGNs with significant X-ray absorption [Wilkes et al. (2002), Hall et al. (2006), Szokoly et al. (2004), Barger et al. (2005), Fiore et al. (2003), Silverman et al. (2005)]. Extreme cases of this kind of mismatch are Broad Absorption Line (BAL) QSOs, whose X-ray spectra are heavily absorbed, and in some cases even Compton thick, although their optical spectra show little or no dust extinction [Gallagher et al. (2006), Braito et al. (2004), Maiolino et al. (2001c)].
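To make the size of the effect concrete, the following minimal Python sketch (an illustration, not taken from the papers cited above) converts an X-ray column density into the optical extinction expected for a Galactic gas-to-dust ratio; the coefficient NH/AV ~ 1.9 x 10^21 cm^-2 mag^-1 is a commonly quoted Galactic value and varies by a few tens of percent in the literature.

    # Expected optical extinction A_V for a given X-ray column density N_H,
    # assuming a Galactic gas-to-dust ratio. The coefficient below is an
    # assumed, commonly quoted Galactic value.
    GALACTIC_NH_OVER_AV = 1.9e21  # cm^-2 mag^-1

    def expected_av_from_nh(n_h_cm2: float) -> float:
        """A_V (mag) predicted from an X-ray column N_H (cm^-2) for Galactic dust."""
        return n_h_cm2 / GALACTIC_NH_OVER_AV

    # Example: a type 1 AGN with N_H = 1e22 cm^-2 measured in the X-rays should
    # show A_V ~ 5 mag if the gas-to-dust ratio were Galactic; essentially
    # unreddened broad lines instead imply A_V / N_H well below the Galactic value.
    print(f"expected A_V = {expected_av_from_nh(1e22):.1f} mag")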

As for the origin of the mismatch between X-ray and optical absorption, there are two possible physical reasons. The BLR is dust free, because it lies inside the dust sublimation radius; therefore, if a large fraction of the X-ray absorbing column density is located within the BLR, as discussed in the previous section, this naturally yields a reduced AV / NH ratio. Additionally, the circumnuclear dusty medium of AGNs is very dense (n ~ 10^5 cm^-3), and in such dense environments dust grains tend to be larger, and therefore less effective in absorbing the optical and UV radiation [Maiolino et al. (2001b)].
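For concreteness, the size of this dust-free zone can be estimated with the widely used Barvainis (1987) sublimation-radius scaling; this formula and the fiducial sublimation temperature T_sub ~ 1500 K are assumptions for illustration, not taken from the text above.

    # Dust sublimation radius (Barvainis 1987 scaling):
    # r_sub ~ 1.3 * (L_UV / 1e46 erg s^-1)^0.5 * (T_sub / 1500 K)^-2.8 pc
    def sublimation_radius_pc(l_uv_erg_s: float, t_sub_K: float = 1500.0) -> float:
        """Dust sublimation radius in parsecs."""
        return 1.3 * (l_uv_erg_s / 1e46) ** 0.5 * (t_sub_K / 1500.0) ** -2.8

    # Example: a Seyfert-like UV luminosity of 1e44 erg/s gives r_sub ~ 0.13 pc.
    # Gas within this radius (including the BLR) can absorb X-rays while
    # contributing no dust extinction, lowering the observed A_V / N_H.
    print(f"r_sub = {sublimation_radius_pc(1e44):.2f} pc")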

Similar results have been obtained from the comparison between infrared and X-ray absorption. Using Spitzer mid-IR spectra, [Shi et al. (2006)] found that the depth of the dust silicate feature at ~ 9.7 µm correlates with the X-ray absorption, though with a large scatter. However, in most cases the silicate feature is much shallower than expected from the NH inferred from the X-rays by assuming a Galactic gas-to-dust ratio. The intensity and shape of the mid-IR continuum are also little affected by the presence of X-ray absorption along the line of sight. More specifically, [Silva et al. (2004)] found that the shape and intensity of the nuclear mid-IR continuum of AGNs are essentially unchanged and independent of the column density NH measured in the X-rays (Fig. 3). Only for Compton thick sources does the mid-IR SED appear reddened at wavelengths shorter than ~ 10 µm, but even then the inferred extinction is more than one order of magnitude lower than expected from the NH inferred from the X-rays. Similar results were obtained by [Lutz et al. (2004)] and [Krabbe et al. (2001)]. The reasons for the reduced IR absorption relative to the X-ray absorption are partly the same as for the optical absorption. However, an important additional factor contributing to the IR/X-ray absorption mismatch is that the mid-IR emission is extended, at least on the pc scale, i.e. on a scale comparable to or larger than that of the dense X-ray absorbing medium.
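The two extinction estimates can be compared with a similar back-of-the-envelope sketch; both conversion factors below (AV / tau_9.7 ~ 18 for local ISM dust, and the Galactic NH/AV used above) are assumed illustrative values, not taken from the papers cited in this section.

    # Compare the extinction implied by the apparent 9.7 um silicate depth
    # with the extinction predicted from the X-ray column for Galactic dust.
    GALACTIC_NH_OVER_AV = 1.9e21   # cm^-2 mag^-1 (assumed Galactic value)
    AV_OVER_TAU_SIL = 18.0         # local-ISM conversion (assumed)

    def av_from_silicate(tau_97: float) -> float:
        """A_V (mag) implied by the apparent 9.7 um silicate optical depth."""
        return AV_OVER_TAU_SIL * tau_97

    def av_from_xray_nh(n_h_cm2: float) -> float:
        """A_V (mag) predicted from the X-ray column for Galactic gas-to-dust."""
        return n_h_cm2 / GALACTIC_NH_OVER_AV

    # Example: tau_9.7 ~ 0.5 implies A_V ~ 9 mag, whereas N_H = 1e23 cm^-2
    # predicts A_V ~ 53 mag -- the "much shallower than expected" silicate
    # feature discussed above.
    print(f"A_V(silicate) = {av_from_silicate(0.5):.0f} mag")
    print(f"A_V(X-ray)    = {av_from_xray_nh(1e23):.0f} mag")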


Figure 3. Nuclear infrared spectral energy distributions observed in local AGNs, averaged in bins of absorbing NH inferred from the X-rays. The IR SEDs are normalized to the absorption-corrected hard X-ray luminosity. There is little variation of the nuclear IR SED for different values of NH, except for Compton thick sources, which show a significant depression shortward of 10 µm.
