In 50 Gbps+ links, the signal-to-noise-and-distortion ratio (SNDR) is an essential figure of merit for device and link-component performance. SNDR measures the ratio of signal energy to the combined impairments in the link: nonlinearity, uncompensated ISI, and random (unbounded) noise. While the SNDR methodology is defined in IEEE 802.3 and OIF-CEI, and is soon to be included in the PCI-Express Gen 6 standard, little literature is available that explains the SNDR parameters (e.g., Dp, the linear fit pulse delay; Np, the linear fit pulse length; and the test pattern), the device/instrument configurations (e.g., equalization usage and waveform capture), or how SNDR manifests itself in overall link performance. In this paper, we explain and investigate these factors through simulations and designed experiments with lab data.
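To make the linear-fit terminology concrete, the following is a minimal sketch of the linear-fit SNDR computation in the spirit of the IEEE 802.3 methodology. The parameter names `Np` (fit pulse length) and `Dp` (fit delay) mirror the abstract; the function name, default values, and synthetic test pattern are illustrative assumptions, not the standard's exact procedure. Note that the standard separately measures the RMS noise σ_n and the fit error σ_e; this sketch lumps both into the residual of a single least-squares fit.

```python
import numpy as np

def linear_fit_sndr(x, y, Np=10, Dp=0):
    """Illustrative SNDR estimate from a captured waveform y driven by a
    repeating symbol pattern x, via an Np-tap linear fit with delay Dp.
    (Hypothetical helper; parameter names follow IEEE 802.3 usage.)"""
    N = len(y)
    # Pattern matrix: row k holds the Np symbols influencing sample k,
    # offset by the fit delay Dp; the extra column absorbs a DC offset.
    X = np.ones((N, Np + 1))
    for k in range(N):
        for j in range(Np):
            X[k, j] = x[(k + Dp - j) % len(x)]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    p = coef[:Np]                 # linear fit pulse
    e = y - X @ coef              # residual: distortion + random noise
    p_max = np.max(np.abs(p))
    # SNDR = 10*log10(p_max^2 / (sigma_e^2 + sigma_n^2)); here the
    # residual power stands in for both error terms combined.
    return 10 * np.log10(p_max**2 / np.mean(e**2)), p

# Synthetic check: a known short pulse plus additive noise.
rng = np.random.default_rng(0)
x = rng.choice([-1.0, 1.0], size=511)     # stand-in for a PRBS test pattern
true_pulse = np.array([0.1, 0.8, 0.3, -0.05])
y = np.array([sum(true_pulse[j] * x[(k - j) % 511] for j in range(4))
              for k in range(511)])
y += rng.normal(0.0, 0.01, 511)           # random noise floor
sndr_db, p = linear_fit_sndr(x, y, Np=10, Dp=0)
```

With these settings the fit recovers the injected pulse, and the SNDR lands near 10·log10(0.8² / 0.01²) ≈ 38 dB, illustrating how the fit pulse peak and the residual power jointly set the reported number.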