X-ray grating interferometry is one of several methods that allow the extraction of so-called phase and visibility contrasts in addition to the well-known transmission images. Crucial to achieving high image quality are the absorption gratings employed. Here, we present an in-depth analysis of how the grating type and lamella heights influence the final images. Benchmarking gratings of two different designs, we show that a frequently used proxy for image quality, a grating's so-called visibility, is insufficient to predict contrast-to-noise ratios (CNRs). Presenting scans of an excised rat lung, we demonstrate that the CNRs obtained for transmission and visibility images anti-correlate. This is explained by the stronger attenuation caused by gratings that are engineered to provide high visibilities by means of an increased lamella height. We show that even the visibility contrast can suffer from this effect when the associated reduction in photon flux on the detector is not outweighed by a corresponding gain in visibility. Since this results in an inevitable trade-off between the quality of the two contrasts, the question of how an optimal grating should be designed can only be answered in terms of Pareto optimality.
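
As a first-order illustration of this trade-off (the notation here is ours and serves only as a sketch based on standard Poisson noise propagation, not as the quantitative model of this work), one may write

\[
\mathrm{CNR}_{\mathrm{trans}} \;\propto\; \sqrt{N(h)}\,, \qquad
\mathrm{CNR}_{\mathrm{vis}} \;\propto\; V(h)\,\sqrt{N(h)}\,,
\]

where \(h\) denotes the lamella height, \(N(h)\) the photon count reaching the detector (which decreases with \(h\) owing to the additional absorption in the grating), and \(V(h)\) the visibility (which typically increases with \(h\)). Increasing \(h\) therefore always degrades the transmission CNR, whereas the visibility CNR improves only as long as the relative gain in \(V(h)\) outweighs the relative loss in \(\sqrt{N(h)}\).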