Abstract Fast radio bursts show dispersion measures much larger than the Galactic dispersion measure foreground; they therefore evidently have an extragalactic origin. We investigate possible contributions to the dispersion measure from host galaxies. We simulate the spatial distribution of fast radio bursts within their hosts and calculate the dispersion measures along the sightlines from the bursts to the edge of the host galaxy, using a scaled NE2001 model for the thermal electron density distribution. We find that the host-galaxy contributions to the dispersion measures of fast radio bursts follow a skewed Gaussian distribution. The peak and the full width at half maximum of the distribution increase with the inclination angle of a spiral galaxy, reaching large values once the inclination exceeds 70°. The largest dispersion measure produced by an edge-on spiral galaxy can reach a few thousand pc cm−3, whereas dwarf galaxies and elliptical galaxies contribute at most a few tens of pc cm−3. High-density clumps in host galaxies can, however, add further dispersion measures of tens to hundreds of pc cm−3. As examples, simulations including the dispersion measure contributions from the Large Magellanic Cloud and the Andromeda Galaxy demonstrate how the dispersion measure from the intergalactic medium can be extracted.
Keywords galaxies: ISM — radio continuum: ISM — ISM: general
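The abstract's central quantity is the dispersion measure, DM = ∫ nₑ dl, integrated along the sightline through the host galaxy's thermal electron distribution. The sketch below is a minimal, hedged illustration of that integral: it replaces the full NE2001 model with a toy axisymmetric double-exponential disk (the density normalization `n0` and scale lengths `h_r`, `h_z` are illustrative assumptions, not values from the paper) and shows the qualitative trend reported in the abstract, namely that the host DM grows with inclination angle.

```python
import numpy as np

def electron_density(x, z, n0=0.03, h_r=3.0, h_z=1.0):
    """Toy axisymmetric double-exponential disk model (cm^-3).

    Lengths are in kpc. This is an illustrative stand-in for the
    scaled NE2001 model used in the paper, NOT the model itself;
    n0, h_r and h_z are assumed values chosen for demonstration.
    """
    r = abs(x)  # in-plane galactocentric radius along the sightline
    return n0 * np.exp(-r / h_r) * np.exp(-abs(z) / h_z)

def dm_through_disk(inclination_deg, path_kpc=30.0, n_steps=3000):
    """DM (pc cm^-3) along a sightline from the disk mid-plane outward.

    The sightline is tilted from the disk normal by the inclination
    angle: 0 deg is face-on (shortest path through the ionized gas),
    90 deg is edge-on (longest path).
    """
    theta = np.radians(inclination_deg)
    s = np.linspace(0.0, path_kpc, n_steps)  # path length, kpc
    x = s * np.sin(theta)                    # in-plane coordinate
    z = s * np.cos(theta)                    # height above mid-plane
    n_e = electron_density(x, z)
    ds = s[1] - s[0]
    # Trapezoidal rule for DM = ∫ n_e dl, converting kpc -> pc.
    return (n_e.sum() - 0.5 * (n_e[0] + n_e[-1])) * ds * 1000.0
```

With these toy parameters the face-on sightline gives DM ≈ n0·h_z ≈ 30 pc cm−3 while the edge-on sightline gives DM ≈ n0·h_r ≈ 90 pc cm−3, reproducing the qualitative inclination dependence; the much larger edge-on values quoted in the abstract arise from the full NE2001 density structure, which this smooth toy disk does not capture.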