A Short History Of Superconductors
The Dutch physicist Heike Kamerlingh Onnes discovered superconductivity in 1911. He observed that mercury, when cooled below 4.2 kelvin (about -452 °F), lost all of its electrical resistance. Onnes labeled this phenomenon superconductivity.
It was already known at this time that the resistance of a metal decreases with its temperature, and this was the case with mercury. However, the decrease in the resistance of mercury did not proceed toward zero in the expected linear fashion. As the temperature of the mercury decreased, its resistance fell as expected, but then at about 4.2 K the resistance disappeared entirely. Onnes had found the critical temperature (4.2 K) at which the resistance of mercury suddenly vanishes. The critical temperature of a superconductor is denoted Tc.
While studying the superconductive state of mercury, Onnes observed that even a weak magnetic field could quench the superconductivity. This sensitivity to magnetic fields limits the amount of current that can pass through a superconductor, since the magnetic field generated by the current itself will extinguish the superconductivity if it grows too large. The critical magnetic field that quenches superconductivity is denoted Hc.
Scientists continued to study superconductivity and found twelve metals that become superconductive. Even common metals like lead and tin become superconducting if sufficiently cooled.
In 1933, Walther Meissner and Robert Ochsenfeld discovered that superconductors are strongly diamagnetic, meaning they expel magnetic fields and are repelled by them. In 1945, the Russian physicist V. Arkadiev demonstrated this property elegantly by levitating a small bar magnet above the surface of a superconductor. This has become the classic experiment demonstrating the Meissner effect.
The largest impediment to superconductors is the extremely low temperature needed to make a material superconductive. In the 75 years after the discovery in 1911, the critical temperature of superconductors was raised only slightly; by 1986 it had reached only 23 K, in a niobium-germanium compound. Then a series of breakthroughs began in 1986, starting with K. Alex Mueller and J. Georg Bednorz of IBM Zurich, who created a ceramic superconductor that achieved superconductivity at 30 K. Several other labs began work on ceramic superconductors, and the critical temperature was soon raised above the boiling point of liquid nitrogen (77 K / -196 °C / -321 °F).
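The temperature figures quoted above mix kelvin, Celsius, and Fahrenheit. As a minimal sketch, the standard conversions can be checked with a couple of helper functions (the function names here are illustrative, not from any particular library):

```python
def kelvin_to_celsius(k):
    """Convert a temperature in kelvin to degrees Celsius."""
    return k - 273.15

def kelvin_to_fahrenheit(k):
    """Convert a temperature in kelvin to degrees Fahrenheit."""
    return kelvin_to_celsius(k) * 9 / 5 + 32

# Mercury's critical temperature (Onnes, 1911): 4.2 K is roughly -452 °F
print(kelvin_to_fahrenheit(4.2))

# Boiling point of liquid nitrogen: 77 K is roughly -196 °C / -321 °F
print(kelvin_to_celsius(77))
print(kelvin_to_fahrenheit(77))
```

Running this confirms that 4.2 K sits near -452 °F and that 77 K corresponds to about -196 °C and -321 °F, matching the values in the text.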
Being able to use liquid nitrogen for superconductivity is a tremendous boon to the technology: liquid nitrogen cools about 20 times more effectively than liquid helium and costs only about one-tenth as much. This has made it far more economical to use superconductors in equipment and for experimentation.
The ceramic superconductors in use today still need to be cooled with liquid nitrogen to become superconductive.