Before the invention of the thermometer, concepts like “heat” and “cold” were considered fundamental qualities of the four elements: earth, air, fire, and water. What we now think of as degrees were associated subjectively with physical experience; a physician, for example, might describe a high body temperature simply as “fever heat.”
With the seventeenth-century movement to explain nature in mathematical terms, investigators began to puzzle over how to measure heat accurately. The first measuring instrument is usually attributed to Galileo, who created the thermoscope. This device, a thin glass tube with a closed, air-filled bulb at the top and an open lower end, was lowered into a vessel of liquid, often wine. Heating the bulb caused the air inside to expand, pushing the liquid level in the tube down; letting the air cool caused it to contract, drawing the liquid back up.
These instruments were not marked with degrees and were not standardized. The liquid level also varied with air pressure, making the thermoscope partly a barometer. A more accurate device would be needed before any real scientific study of temperature could be carried out.
The Liquid-in-glass Thermometer
Experimenters in the 1630s began developing the precursors of the modern thermometer by filling such tubes with liquid. Ferdinand II, Grand Duke of Tuscany, sealed colored alcohol in an enclosed glass tube, which marked rises in temperature when placed near or in hot liquids. The accuracy of such instruments was questionable: little was understood about the expansion of liquids, and glass production was not standardized.
Calibration in degrees was first introduced by Robert Fludd around 1638, transforming the thermoscope into the first air thermometer. The first use of the word “thermometer” is attributed to the French Jesuit priest and mathematician Jean Leurechon, who described a device with a scale of eight degrees.
The Mercury Thermometer
In 1714, Gabriel Fahrenheit created a mercury-filled thermometer. Because mercury expands more dramatically and predictably than alcohol as temperature rises, the combination of this heavy metal with improved glassmaking made his thermometer the most accurate of its time, and the design remained in common use until only recently.
Fahrenheit’s scale was based on the melting point of ice in salt water and on human body temperature. In 1742, the Swedish scientist Anders Celsius developed a scale that divided the interval between the freezing and boiling points of water into 100 equal degrees.
A further development in scaling came in 1848, when Sir William Thomson (later Lord Kelvin), working in Scotland, placed the zero of his scale at the lowest possible temperature, absolute zero, the point of minimal molecular motion. This scale, the kelvin, is the standard in scientific temperature measurement today.
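The three scales above differ only by a shift and a scale factor, so converting between them is simple arithmetic. As a quick illustration, here is a minimal sketch using the modern definitions of the scales (the function names are our own, not anything from the original instruments):

```python
# Conversions between the modern Celsius, Fahrenheit, and Kelvin scales.
# These use today's definitions, not the historical calibrations.

def celsius_to_fahrenheit(c: float) -> float:
    """Water freezes at 32 F and boils at 212 F: a 180-degree span."""
    return c * 9 / 5 + 32

def celsius_to_kelvin(c: float) -> float:
    """Absolute zero (0 K) sits at -273.15 C; kelvin just shifts the zero."""
    return c + 273.15
```

For example, `celsius_to_fahrenheit(100)` gives `212.0` (the boiling point of water), and `celsius_to_kelvin(0)` gives `273.15`.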
With the advent of electronics, mercury thermometers have slowly been phased out, and even prohibited in some countries, because mercury is toxic and hazardous as waste. Modern thermometers also read temperatures far more rapidly, taking seconds rather than the minutes required by even the best mercury designs, and they can store multiple readings. Though registering thermometers existed as far back as 1782, when James Six built a mechanical register of highest and lowest temperatures, modern electronic thermometers can save a wide range of data.
Joshua Evanston is a freelance writer who typically focuses on gadgets and gadget accessories.