This work package will address galaxy shape measurement, a central task for weak gravitational lensing. If galaxy orientations are intrinsically random, their ellipticities average to zero. A non-vanishing mean ellipticity is therefore a local estimate of the shear induced by gravitational lensing from the intervening large-scale structure. Measuring shear provides unbiased information about the dark-matter distribution in the Universe.

A crucial task of shape measurement is calibration, since the estimated shear is generally biased. The science requirements for LSST/Rubin, Euclid, and WFIRST/Roman are very challenging, demanding calibration at the 0.1% level. A major source of bias is blended objects (Euclid Collaboration et al. 2019), and one of the main tasks of this WP is to quantify the residual biases as a metric of the deblending techniques developed in WP3.
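Shear calibration is commonly quantified with the linear bias model g_meas = (1 + m) g_true + c, where m is the multiplicative and c the additive bias; the 0.1% requirement above corresponds to |m| of order 1e-3. The following toy sketch (not the WP's actual pipeline; the injected bias values and noise level are illustrative assumptions) shows how m and c are recovered from a simulated catalogue with a straight-line fit:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical simulated catalogue: true input shears, and "measured"
# shears produced by a deliberately biased estimator plus shape noise.
g_true = rng.uniform(-0.05, 0.05, size=1_000_000)
m_injected, c_injected = 0.012, 3e-4  # assumed biases, for illustration
g_meas = (1.0 + m_injected) * g_true + c_injected \
         + rng.normal(0.0, 0.01, size=g_true.size)

# Recover (m, c) with a least-squares straight-line fit:
# slope = 1 + m, intercept = c.
slope, intercept = np.polyfit(g_true, g_meas, deg=1)
m_fit, c_fit = slope - 1.0, intercept

print(f"m = {m_fit:+.4f} (injected {m_injected:+.4f})")
print(f"c = {c_fit:+.1e} (injected {c_injected:+.1e})")
```

The residual m and c measured on blended versus isolated samples would then serve as the deblending metric described above.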

We will explore the recently introduced metacalibration technique (Sheldon and Huff 2017), which allows the measured shear to be calibrated directly from the images, without the need for image simulations. We will quantify the performance of this technique in the presence of blended objects at different redshifts. Further, we will develop Bayesian inference techniques that do not require the measurement of individual galaxy shapes, as in Bernstein and Armstrong (2014) and Bernstein et al. (2016).
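The core idea of metacalibration can be sketched in one dimension: treat the shape measurement as a black box, re-run it on counter-factually sheared versions of the data, and divide out the finite-difference response. In the real method the sheared "images" are built by deconvolving the PSF, shearing, and reconvolving; here the black-box estimator, its hidden bias, and all parameter values are illustrative assumptions, not the WP pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
HIDDEN_BIAS = 0.85  # unknown to the calibration step

def measure(e_intrinsic, g):
    """Toy biased black-box shear estimator acting on a galaxy sample."""
    return HIDDEN_BIAS * (e_intrinsic + g)

e_int = rng.normal(0.0, 0.2, size=200_000)  # intrinsic ellipticities
g_true = 0.02                               # shear to recover
dg = 0.01                                   # artificial metacal shear step

# Naive estimate: average measured ellipticity (biased by HIDDEN_BIAS).
g_naive = measure(e_int, g_true).mean()

# Metacalibration: re-measure after applying +/- dg artificial shear,
# form the finite-difference response R = d<e>/dg, and divide it out.
e_plus = measure(e_int, g_true + dg)
e_minus = measure(e_int, g_true - dg)
R = (e_plus - e_minus).mean() / (2.0 * dg)
g_metacal = g_naive / R

print(f"naive:   {g_naive:.4f}")
print(f"metacal: {g_metacal:.4f} (true {g_true})")
```

Because the response is measured on the same galaxies as the shear, no external image simulations are needed, which is the property the WP will stress-test on blended objects.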

Similar methods have been implemented in the field of mining agronomic landscapes, where external pressures “shear” the plot mosaic. When a medium-sized territory encounters external pressures such as biodiversity preservation or, a contrario, gene fluxes, the crop mosaic is distorted by the farmers' new decisions (Schaller et al. 2012). Stochastic models combined with an a priori set of decision rules have given interesting results in this domain.

There have been early attempts at using neural networks to address bias in shear measurements (Gruen et al. 2010), and at building a low-bias shear estimator for individual galaxies from measured features (Tewes et al. 2019). Although pixel-level deep learning is starting to be used for photo-z (see WP5) and for measuring galaxy features (Tuccillo et al. 2018; Huertas-Company et al. 2018), it remains largely uncharted territory for weak lensing.


This work will be led by DAp with contributions from LORIA and APC.