How brief detector “glitches” can push inferred properties of merging stars and black holes off-track
Gravitational‑wave detectors often record short bursts of non‑Gaussian noise called “glitches.” This paper studies how such glitches can change the parameters scientists infer for compact‑binary coalescences — mergers of black holes or neutron stars — when a glitch overlaps the signal in time. The authors show that common glitch types can produce statistically significant biases in estimated mass, spin, and sky position.
To study the effect, the team used real detector data from LIGO’s third observing run (O3). They picked examples of three glitch classes known to cause trouble: “blip” glitches (brief, under a second), “thunder” glitches (a few seconds), and “fast‑scattering” glitches (minutes). Into those stretches of real strain data they injected many simulated compact‑binary signals, moving the signal time across the glitch to see how the inferred source properties changed. They ran standard Bayesian parameter estimation with the Bilby software, and compared the resulting probability distributions to baseline injections into glitch‑free data using a numerical “cost function” to quantify shifts.
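The paper's exact cost function is not reproduced here, but the idea of quantifying a posterior shift can be sketched simply: measure how far each parameter's posterior has moved from the glitch-free baseline, in units of the baseline's spread. This is a hypothetical illustration, not the authors' formula; the function name and the median-over-sigma metric are assumptions.

```python
import numpy as np

def posterior_shift_cost(baseline_samples, glitch_samples):
    """Hypothetical bias metric: per-parameter shift of the posterior median,
    measured in units of the baseline posterior's standard deviation.

    Both inputs: dict mapping parameter name -> 1-D array of posterior samples.
    Returns (per-parameter shifts, total cost summed over parameters).
    """
    shifts = {}
    for name, base in baseline_samples.items():
        sigma = np.std(base)
        shift = abs(np.median(glitch_samples[name]) - np.median(base)) / sigma
        shifts[name] = shift
    return shifts, sum(shifts.values())

# Toy demonstration with synthetic "posteriors": the glitched run's
# chirp-mass posterior is displaced by about two baseline sigma.
rng = np.random.default_rng(0)
baseline = {"chirp_mass": rng.normal(30.0, 1.0, 5000)}
glitched = {"chirp_mass": rng.normal(32.0, 1.0, 5000)}
shifts, cost = posterior_shift_cost(baseline, glitched)
```

A metric like this is zero when the glitch leaves the posterior untouched and grows with the displacement, which is what lets the authors compare many injections on a common scale.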
The main finding is that glitches can pull posterior distributions for many parameters away from their true values. Masses, spins and sky coordinates were among the most affected; in most of the cases tested, nearly every parameter could suffer a significant bias when a glitch overlapped the signal. The study also finds that glitches falling inside the signal’s time prior — the window of merger times the analysis allows, so the glitch overlaps data the model can interpret as signal — tend to cause larger biases than glitches outside that window. The authors use these results to estimate how much time separation between signal and glitch can be treated as “safe,” so that parameter estimates can be trusted without explicit glitch subtraction.
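Estimating a “safe” separation amounts to scanning the bias metric over signal–glitch time offsets and finding the largest offset at which the bias still exceeds a tolerance. A minimal sketch, with an illustrative threshold and made-up cost values (not the paper's numbers):

```python
import numpy as np

def safe_separation(offsets_s, costs, threshold):
    """Largest |signal - glitch| time offset (seconds) at which the bias
    cost still exceeds the threshold; beyond this, offsets are "safe".

    offsets_s: array of signal-glitch time offsets in seconds.
    costs:     bias cost measured at each offset.
    """
    offsets_s = np.asarray(offsets_s, dtype=float)
    costs = np.asarray(costs, dtype=float)
    unsafe = np.abs(offsets_s)[costs > threshold]
    return unsafe.max() if unsafe.size else 0.0

# Illustrative scan: bias peaks when the glitch sits on top of the signal
# (offset 0) and dies away as the separation grows.
offsets = np.array([-4.0, -2.0, -1.0, 0.0, 1.0, 2.0, 4.0])
costs   = np.array([0.1,  0.3,  1.2,  3.0, 0.9, 0.2, 0.1])
sep = safe_separation(offsets, costs, threshold=0.5)  # -> 1.0
```

With these toy numbers, offsets beyond one second leave the cost below threshold, so separations larger than that would be declared safe without glitch subtraction.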