
A Beginner’s Guide to Quantum Physics (Courtesy of a Nobel Prize Winner)


Nobel laureate in Physics Hans Bethe offered lectures explaining the basic principles of quantum physics, the theory that revolutionised the way we understand a fragment of reality.

In the 1990s, quantum physics unexpectedly entered the sphere of everyday knowledge, a phenomenon perhaps explained by the unsettling discoveries the field had made about what we usually refer to as 'reality'.

Theoretically at first, and then through complex experiments, quantum physics began to reveal that at the subatomic level, properties like mass and energy behave in very different, at times seemingly almost magical, ways. Scientists unexpectedly discovered, for example, that observing a phenomenon affects its outcome, a paradoxical situation that, under the laws of Newtonian physics, had for centuries been considered impossible.

Nonetheless, the popularization of certain principles and phenomena does not convey the full complexity behind their development. It is perhaps wiser to seek a reliable source that explains quantum matters as clearly as possible, sacrificing some of the technical difficulty but none of the scientific rigour.

This is precisely where Hans Bethe's classes come into play. The German-born nuclear physicist, who later became an American citizen, collaborated on the Manhattan Project alongside Richard Feynman and other equally important physicists, and received the Nobel Prize in Physics in 1967.

In 1999, when Bethe was 93 years old, he gave a series of quantum physics lectures at Kendal, a retirement community in Ithaca that was undoubtedly very different from the classrooms at Cornell University, where he had been a professor.

In his lectures, the scientist explains quantum physics by turning to its history and showing why its study remains relevant today. He also explains Heisenberg's uncertainty principle and the Pauli exclusion principle.
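For readers unfamiliar with the first of these, Heisenberg's uncertainty principle is usually stated as a limit on how precisely a particle's position and momentum can be known at the same time. The standard textbook form (given here for reference, not taken from Bethe's lectures) is:

```latex
% Heisenberg's uncertainty principle: the product of the
% uncertainties in position (x) and momentum (p) is bounded below
% by half the reduced Planck constant.
\Delta x \, \Delta p \ge \frac{\hbar}{2}
```

In words: the more precisely the position is pinned down, the less precisely the momentum can be known, and vice versa. This is one of the results that made observation itself part of the physics, as described above.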
