The Advanced Laser Interferometer Gravitational-wave Observatory (LIGO) and Virgo detectors have directly observed gravitational waves from binary systems of neutron stars and black holes in recent years.
The time series of dimensionless strain, defined as the differential change in the lengths of the two orthogonal arms divided by the average arm length, is used to detect gravitational-wave signals and to infer the properties of the astrophysical source.
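As an order-of-magnitude illustration of this definition (the specific numbers below are illustrative values consistent with a LIGO-like detector, not measurements from the source):

```python
# Dimensionless strain h = delta_L / L: the differential arm-length
# change divided by the average arm length.
delta_L = 4e-19   # differential arm-length change in metres (illustrative)
L = 4000.0        # average arm length in metres (LIGO-like 4 km arms)

h = delta_L / L   # dimensionless strain, here 1e-22
print(h)
```

Even a displacement thousands of times smaller than a proton's radius, spread over kilometre-scale arms, yields a strain of order 1e-22, which is why measurement noise and calibration accuracy matter so much.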
Due to the presence of noise, and the need to maintain the resonance condition of the optical cavities, the detectors do not measure the strain directly. The differential arm displacement is suppressed by the control force applied to the test masses, and the residual differential arm displacement within the control loop is converted into digitised photodetector output signals. The strain must therefore be reconstructed from the raw digitised electrical output of each detector using an accurate and precise model of the detector's response to strain. This reconstruction process is referred to as detector calibration. The accuracy and precision of the estimated detector response, and hence of the reconstructed strain data, are important for detecting gravitational-wave signals and crucial for estimating their astrophysical parameters.
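The reconstruction described above can be sketched schematically: the residual (error) and control signals are combined through frequency-domain sensing and actuation transfer functions to recover the free differential arm length, which is then divided by the arm length to give strain. The function below is a minimal sketch of that idea only; the transfer-function values are made-up scales, and real detector response models are far more detailed than this.

```python
import numpy as np

L = 4000.0  # average arm length in metres (LIGO-like)

def reconstruct_strain(d_err, d_ctrl, C, A):
    """Schematic frequency-domain strain reconstruction.

    d_err  : residual (error) photodetector signal per frequency bin
    d_ctrl : control signal per frequency bin
    C      : sensing transfer function (output counts per metre)
    A      : actuation transfer function (metres per control count)
    """
    # Undo the loop suppression: inverse-sensing the residual signal
    # and adding back the displacement removed by the control force
    # recovers the free differential arm length.
    delta_L_free = d_err / C + A * d_ctrl
    return delta_L_free / L  # dimensionless strain

# Single-frequency-bin example with made-up scales.
C = np.array([1e12 + 0j])    # counts per metre (illustrative)
A = np.array([1e-12 + 0j])   # metres per count (illustrative)
d_err = np.array([2.0 + 0j])
d_ctrl = np.array([1.0 + 0j])

h = reconstruct_strain(d_err, d_ctrl, C, A)
print(h)
```

The two-path structure (inverse sensing of the residual signal plus the modelled actuation of the control signal) is the key point: errors in either modelled transfer function propagate directly into the reconstructed strain, which is why calibration accuracy limits astrophysical parameter estimation.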
As the sensitivity of the global gravitational-wave detector network increases, the accuracy and precision of detector calibration will play an increasingly important role. Research is being conducted at the Centre for Gravitational Astrophysics to further improve the accuracy and precision of detector calibration, and to better integrate calibration bias and uncertainty into astrophysical analyses.
Sun et al., Class. Quantum Grav. 37, 225008 (2020)