The Sharpe ratio is the most widely used metric for comparing performance across investment managers and strategies, and the information ratio is just as commonly used to evaluate performance relative to a benchmark. Although it is widely recognized that non-linearities arising from the inclusion of options or the use of dynamic trading rules can distort these performance metrics, most analysts are unaware of another, perhaps more serious, source of distortion: they assume, consciously or not, that standard deviations scale with the square root of time and that correlations are invariant to the estimation interval. The evidence does not support these assumptions. Instead, non-zero lagged autocorrelations and cross-correlations render these performance metrics highly sensitive to the return interval used to estimate them. As a consequence, an investment manager who ranks in the top quartile based on performance metrics estimated from monthly returns may rank in the bottom quartile, within the same measurement period and universe, based on the same metrics estimated from longer-horizon returns. Of particular note, the popular investment strategy known as risk parity, contrary to prior evidence, is shown to have significantly underperformed a 60/40 stock and bond portfolio once lagged autocorrelations and cross-correlations are taken into account. Finally, the evidence suggests that high-frequency variability arises from changes in discount rates, whereas low-frequency variability is related to differences in cash flows.
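The interval sensitivity described above can be illustrated with a minimal simulation. The sketch below is not drawn from the paper itself: it assumes AR(1) monthly excess returns with a hypothetical persistence parameter `phi`, mean `mu`, and shock volatility `sigma`, and compares the annualized Sharpe ratio obtained from monthly returns (scaled by the square root of time) with the Sharpe ratio computed directly from non-overlapping annual returns. With positive autocorrelation, the square-root-of-time rule understates long-horizon volatility, so the two estimates diverge.

```python
import numpy as np

# Hypothetical parameters for illustration only (not from the paper).
rng = np.random.default_rng(0)
phi, mu, sigma = 0.3, 0.005, 0.02   # AR(1) persistence, mean, shock volatility
n_months = 12 * 10_000              # long sample so estimates are stable

# Simulate AR(1) monthly excess returns: r_t = mu + phi*(r_{t-1} - mu) + eps_t
eps = rng.normal(0.0, sigma, n_months)
r = np.empty(n_months)
r[0] = mu + eps[0]
for t in range(1, n_months):
    r[t] = mu + phi * (r[t - 1] - mu) + eps[t]

# Sharpe ratio from monthly returns, annualized with the sqrt-of-time rule
sharpe_monthly = (12 * r.mean()) / (np.sqrt(12) * r.std(ddof=1))

# Sharpe ratio from non-overlapping annual returns (sums of monthly returns)
annual = r.reshape(-1, 12).sum(axis=1)
sharpe_annual = annual.mean() / annual.std(ddof=1)

print(f"annualized Sharpe from monthly returns: {sharpe_monthly:.3f}")
print(f"Sharpe from annual returns:             {sharpe_annual:.3f}")
# With phi > 0, the positive lagged autocovariances inflate annual volatility
# beyond sqrt(12) times monthly volatility, so sharpe_annual < sharpe_monthly.
```

The same mechanism operates in reverse for negatively autocorrelated returns, which appear riskier at short horizons than they are at long horizons; cross-correlations between assets distort interval-dependent correlation (and hence information ratio) estimates analogously.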