On the Justification and Validity of the Kennard Inequality
American Journal of Modern Physics
In 1927, Earle Hesse Kennard derived an inequality expressing Heisenberg's uncertainty principle, and the standard deviation has since been the traditional measure of uncertainty in quantum mechanics. Jan Hilgevoord, however, asserts that the standard deviation is neither a natural nor a generally adequate measure of quantum uncertainty. Specifically, he asserts that standard deviations are inadequate as the quantum uncertainties in the single- and double-slit diffraction experiments. He even claims that these examples make clear that the standard deviation is the wrong concept for expressing the uncertainty principle in general, and that the Kennard relation has little to do with the uncertainty principle. We investigate which quantities are adequate as measures of quantum uncertainty. Beyond that, we investigate the effects of multiplying the two uncertainties, namely the characteristics hidden deep within the Kennard inequality. These investigations lead naturally to the conclusion that his assertions are wrong. Our discussion should deepen the understanding of the Heisenberg uncertainty principle and afford an opportunity to reflect on the essence of the Fourier transform. The aim of this paper is to draw a conclusion about whether the Kennard inequality is justified.
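As a numerical illustration (a sketch of my own, not taken from the paper), the Kennard inequality σ_x σ_p ≥ ħ/2 can be checked directly for a Gaussian wave packet, for which the bound is saturated. The sketch below works in units where ħ = 1, obtains the momentum-space wave function via a discrete Fourier transform, and computes both standard deviations; the grid sizes and the width parameter `a` are arbitrary choices for the demonstration.

```python
import numpy as np

# Sketch (units with hbar = 1): check sigma_x * sigma_p >= 1/2
# for a Gaussian wave packet, which saturates the Kennard bound.
N = 4096
L = 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

a = 1.3  # arbitrary width parameter of the Gaussian
psi = np.exp(-x**2 / (4 * a**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)  # normalize to unit probability

# Position uncertainty: standard deviation of |psi(x)|^2
prob_x = np.abs(psi)**2
mean_x = np.sum(x * prob_x) * dx
sigma_x = np.sqrt(np.sum((x - mean_x)**2 * prob_x) * dx)

# Momentum-space wave function via FFT (angular wavenumber p = 2*pi*f)
p = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
dp = p[1] - p[0]
phi = np.fft.fftshift(np.fft.fft(psi)) * dx / np.sqrt(2 * np.pi)
prob_p = np.abs(phi)**2
prob_p /= np.sum(prob_p) * dp  # renormalize (Parseval holds up to grid error)

# Momentum uncertainty: standard deviation of |phi(p)|^2
mean_p = np.sum(p * prob_p) * dp
sigma_p = np.sqrt(np.sum((p - mean_p)**2 * prob_p) * dp)

product = sigma_x * sigma_p
print(product)  # close to 0.5, the minimum allowed by Kennard's inequality
```

For a Gaussian, σ_x = a and σ_p = 1/(2a), so the product is exactly ħ/2; any other wave packet on the same grid yields a strictly larger product.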