History of the Thermometer
Ewa Grodzinsky and Märta Sund Levander
Understanding Fever and Body Temperature (2019). DOI: 10.1007/978-3-030-21886-7_3

The temperature of the human body has been used as a diagnostic sign since the earliest days of clinical medicine. The earliest thermal instruments were developed during the sixteenth and seventeenth centuries. In 1665, it was suggested that the melting point of ice and the boiling point of water should serve as standards. The most common scales today are the Fahrenheit, Centigrade (Celsius), and Kelvin scales. Since the earliest days of medicine, physicians have recognized that the human body can exhibit an abnormal rise in temperature, usually defined as fever, as an obvious symptom of illness. In 1868, Wunderlich established that the temperature of a healthy person is constant and that variation of temperature occurs in disease. The Allbutt thermometer was the first practical device to become commercially available. The technology has since improved to provide highly accurate devices, for example thermal imaging, whose use in medicine is still growing.

The earliest thermal instruments were developed during the sixteenth and seventeenth centuries. These simple instruments were constructed to trap air in glass tubes, with the open end of the tube submersed in a reservoir of water. These open thermometers were termed thermoscopes. In 1610, Galileo used wine instead of water and was one of the first to use an alcohol thermometer. When such a device was carried up a mountain to a different altitude, the level in the tube was found to be affected by the changing atmospheric pressure. These devices illustrated changes in sensible heat before the concept of temperature had been recognized. While it is sometimes claimed that Galileo was the inventor of the thermometer, what he actually produced was a thermoscope. He did discover that glass spheres filled with aqueous alcohol of different densities would rise and fall with changing temperature. Today, this is the principle of the Galilean thermometer, which is calibrated with a temperature scale.

The first illustration of a thermoscope showing a scale, which can therefore be described as a thermometer, was by Robert Fludd in 1638. However, around 1612, Santorio Santorio calibrated the tube and went on to attempt to measure human temperature with his thermoscope. At the end of the sealed tube, he had a bulb blown of the optimal size to be inserted in the mouth. The open end was submersed in fluid. As the air expanded due to the oral temperature, fluid was expelled from the tube. After a fixed period of time, the bulb was removed and the air cooled, causing the fluid to rise in the calibrated tube (Fig. 3.1) [1].

The Thermometer

In 1654, Ferdinand II de' Medici, Grand Duke of Tuscany, produced sealed tubes with a bulb and stem that were partly filled with alcohol. This was the first thermometer to depend on the expansion and contraction of a liquid, independent of barometric pressure. Many variants of this concept appeared, each unique, as there was no standard scale. Christiaan Huygens in 1665 suggested using the melting point of ice and the boiling point of water as standards. The Danish astronomer Ole Rømer in Copenhagen used these upper and lower limits for a thermometer with which he recorded the weather.
There was still uncertainty about how well these parameters would work at different geographical latitudes. In 1694, Carlo Renaldini suggested that the ice and boiling water limits should be adopted as a universal scale. In England, Isaac Newton proposed in 1701 a scale of 12 degrees between melting ice and body temperature.

In 1724, a German instrument-maker named Gabriel Fahrenheit produced the temperature scale that now bears his name. He manufactured high-quality mercury thermometers (mercury has a high coefficient of expansion) with an inscribed scale of greater reproducibility, and it was this that led to their general adoption. Fahrenheit first calibrated his thermometer using a mixture of ice and sea salt as zero. Since salt water has a much lower freezing point than ordinary water, the freezing point of ordinary water fell at 30 °F on his scale. The temperature inside the healthy human mouth was 96 °F, and he established the boiling point of water, measured at sea level, at 212 °F. He later adjusted his freezing point to 32 °F, thereby establishing 180 degrees between freezing and boiling [2].

In Uppsala, Sweden, Anders Celsius (1701-1744) had been involved in meteorological observations as an astronomy student. There were at that time a large number of different thermometers, all with different scales. He may already at that early stage of his career have realized that there was a need for a common international scale. He was appointed professor of astronomy at Uppsala (as his father had been before him) and was involved in meteorological surveys. Celsius was the first to perform and publish careful experiments leading to the establishment of an international temperature scale based on scientific data. (He was for many years secretary of the Royal Society of Sciences at Uppsala.) His paper 'Observations of two persistent degrees on a thermometer' described his detailed experiments to check that the freezing point is independent of latitude and atmospheric pressure. He also determined the dependence of the boiling point of water on atmospheric pressure and gave a rule for determining the boiling point when the barometric pressure deviates from a standard pressure [3].

Why would a student of astronomy be interested in scales of temperature measurement?

The position of zero was much discussed. The scale used by Ole Rømer placed zero at the lower temperature. Celsius had also used a thermometer created by the French astronomer Joseph-Nicolas Delisle with zero at the boiling point, giving a reversed scale with increasing numbers for decreasing temperatures, which avoided negative values. The reversal of this centigrade scale, placing zero at the freezing point, was inevitable and occurred a few years after Celsius's death. Various names are associated with this change. While Linnaeus is often credited, the history of thermometers in the proceedings of the Royal Swedish Academy of Sciences for 1749 mentions Celsius, his successor Strömer, and the instrument-maker Ekström in connection with the direct scale; no single person was given credit. A century later, Carl Reinhold August Wunderlich stated in the English translation of his treatise 'Temperature in Diseases' that he preferred to retain all his measurements in the centigrade scale, because the convenience of this scale would probably shortly lead to its general adoption by all scientific men. Celsius is now internationally recognized for his major contribution through his careful experiments and his use of fixed points for calibration.
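Since both scales are linear and anchored to the same two fixed points, the 180 Fahrenheit degrees between freezing and boiling correspond to 100 centigrade degrees, which gives the familiar conversion:

\[
T_{\mathrm{F}} = \tfrac{9}{5}\,T_{\mathrm{C}} + 32,
\qquad
T_{\mathrm{C}} = \tfrac{5}{9}\,\bigl(T_{\mathrm{F}} - 32\bigr).
\]

For example, the mean body temperature of 37 °C cited later in this chapter corresponds to (9/5) × 37 + 32 = 98.6 °F.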
Celsius's contribution was recognized by the adoption in 1948, by an international conference on weights and measures, of the preferred scale for temperature, now referred to as degrees Celsius (°C).

Imagine the setting for scientific discussions and the dissemination of new knowledge at the time of Linnaeus and Celsius.

In Scotland in 1848, Lord Kelvin realized in his study of heat that a much greater range of temperature could be considered, far beyond the centigrade scale. Absolute zero, the level at which all molecular motion stops, gives the lowest conceivable temperature. This corresponds to −273.15 degrees on the centigrade scale and −459.67 degrees on the Fahrenheit scale. Therefore, the lowest temperature on the Kelvin scale is 0, and the units are the same size as those of the centigrade (Celsius) scale. While this scale is not used in clinical medicine, it may sometimes be used to define a temperature calibration source or similar scientific system.

Thomas Seebeck, who was born in Estonia in 1770, is the person most closely associated with the thermocouple as a temperature-measuring device. In 1820, at the Berlin Academy of Sciences, he studied the magnetic influence of an electrical current. A year later, he announced his discovery that two different metals forming a closed circuit display magnetic properties when there is a difference of temperature between the two points of contact. This, the Seebeck effect, is the basis of all thermoelectricity and led to the development of thermocouples for contact temperature measurement. In recent years, this technology has been improved to provide highly accurate heat-measuring devices capable of measurement from a few degrees above absolute zero to temperatures above 1600 °C (2912 °F). Their main applications generally fall outside the temperature range of the human body, but some patient-monitoring devices used in critical care employ thermocouples taped to the skin for continuous measurement over time. Thermocouples and thermistors are also used in sealed catheters for internal body temperature measurements [4].

The first non-contact radiometer designed to measure body temperature in the inner ear canal was invented in 1964 by Theodor Benzinger. While doing research on human temperature regulation at the US Naval Medical Research Institute in Bethesda, Benzinger developed a small radiometer to measure as close as possible to the brain. This was intended as a non-invasive procedure, avoiding the attachment of electrodes to the hypothalamus [5]. The first commercial systems were produced in the United States, Europe, and Japan in the early 1990s and have been increasingly adopted as routine instruments for clinical thermometry (Fig. 3.2).

Since the earliest days of medicine, physicians have recognized that the human body can exhibit an abnormal rise in temperature, usually defined as fever, as an obvious symptom of certain illnesses. For example, the Bible has early references to fever in the Book of Job, and there are descriptions of 'burning bones' in the Book of Psalms. Physicians were aware of the use of the hand as a standard means of estimating temperature. Hippocrates noted that the temperature of the body was important and insisted that physicians should be able to recognize the signs of abnormal temperature. He taught that steps should be taken to raise the temperature where it is depressed and to lower it when raised. Galen (AD 131-201) described fever as calor praeter naturam, or preternatural heat.
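For reference, the quantitative core of the instruments described above can be summarized in two relations: the Kelvin scale is the Celsius scale shifted so that absolute zero falls at 0 K, and the voltage developed by a thermocouple is, to a first approximation, proportional to the temperature difference between its two junctions (practical thermocouple standards use polynomial fits rather than this linearized form):

\[
T_{\mathrm{K}} = T_{\mathrm{C}} + 273.15,
\qquad
V \approx S\,\bigl(T_{\mathrm{hot}} - T_{\mathrm{cold}}\bigr),
\]

where S, the Seebeck coefficient, depends on the pair of metals used. On the Kelvin scale, normal body temperature is roughly 310 K.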
As already noted, the first attempts to measure the temperature of the human body seem to have occurred in the sixteenth and seventeenth centuries, first in Italy. Giovanni Borelli, who had the support of Queen Christina of Sweden, was a pioneer of biomechanics and studied movement in animals. He is reputed to have attempted many different measurements of the inner organs of live animals long before anaesthetics were available [6]. Santorio Santorio made an elaborate form of oral thermoscope to study human body temperature, although probably with only limited success.

Herman Boerhaave (1668-1738) and his pupils Gerard van Swieten and Anton de Haen noted the value of Fahrenheit's thermometer after it became available in 1714. Van Swieten became professor of medicine at the University of Vienna and recommended that fever should be measured with a thermometer rather than with the hand. He applied the mercury thermometer to both the mouth and the axilla, as recommended by Fahrenheit. Anton de Haen taught clinical practice at the Vienna General Hospital and emphasized to all his students the importance of measuring body temperature in fever. He pointed out that a physician's touch was inadequate, especially when a shivering patient complained of extreme coolness while registering a temperature three or more degrees above normal. Unfortunately, his studies were scattered throughout his 15 volumes of publications, Ratio Medendi (1757-1773). These included observations on diurnal fluctuations of temperature, on temperature in the elderly, and on the action of certain drugs. De Haen's detailed observations, just part of his extensive work, went largely unheeded [7].

Excellent work on the temperature of healthy people and animals was published by George Martine (1702-1741), a physician who had studied in Edinburgh and Leiden. He theorized that animal heat was the result of the velocity of blood moving through the vessels. His work inspired many others, including John Lining, who studied temperature in malaria sufferers in 1748, and John Hunter (1728-1793), one of the great surgeons and pioneers of the circulatory system. Hunter subsequently disagreed with Martine, claiming that 'warmth depends on a different principle, which is intimately connected with life itself, and is a power which maintains and regulates the machine, independent alike of the circulation, the will, and of sensation' [2].

Many of the early thermometers were of doubtful accuracy, and frequently they were inconveniently large. However, by 1835 Becquerel and Breschet were able to establish the mean temperature of a healthy adult to be 37 °C (98.6 °F). By the 1860s, the use of the thermometer had become more common, and the physiological significance of body temperature was becoming clearer. By 1863, John Davy had noted the variations in temperature resulting from exercise, the intake of food and drink, and the influence of external temperature, as well as the differences in body processes in children. By this time, it was recognized that in many situations temperature was a better clinical indicator than the pulse, because it was not affected by nervous activity or excitement.

Reflect on the statement that body temperature is not affected by nervous activity or excitement.

During this period of increasing interest in thermometry, Carl Reinhold August Wunderlich (1815-1877) published his major work on 'Temperature in Diseases' when at Leipzig in 1868. This was published in an English translation in 1871 [8].
His treatise was based on regular temperature measurements made on all his patients over the course of 15 years, some as many as four to six times daily. After some 100,000 observations, Wunderlich showed that when the temperatures were plotted on charts, a disease could be shown to follow certain laws, characterized by the trend in temperature. Overall, he studied some 25,000 specific cases. This was clearly a significant contribution to the subject and places Wunderlich at the forefront of discovery in this aspect of clinical observation. He established that the temperature of a healthy person is constant and that variation of temperature occurs in disease. From this, Wunderlich laid down a code based on principles derived from his large set of observations. By this time it was considered that 'a physician who carried on his profession without the thermometer was like a blind man endeavouring to distinguish colours by feeling'.

In the first chapter of his book, Wunderlich lists 40 precepts of human body temperature, most of which remain unchallenged in modern medicine. Here are some examples:

- The temperature of a healthy person is almost constantly the same, although not absolutely so; there are spontaneous variations in the course of every twenty-four hours, but these seldom exceed half a degree of the centigrade scale.
- A normal temperature does not necessarily indicate health, but all those whose temperature either exceeds or falls short of the normal range are unhealthy.
- The range of temperature in the most severe diseases is between 35 °C (95 °F) and 42.5 °C (108.5 °F), and it is very seldom that it exceeds 43 °C (109.4 °F) or sinks below 33 °C (91.4 °F).
- Alterations of temperature may be confined to special regions of the body which are the seat of disease actions (local inflammation), while the general temperature remains more or less normal.
- A rapid increase in the temperature of the body from a chill, or in the normal warmth of the hands, feet, nose, or forehead, is commonly associated with strong feelings of chilliness and convulsive movements ('cold shivers', rigors, 'fever-frost').
- A more or less permanent and noticeable rise in temperature amounting to 38.5 °C (101.3 °F) or more is generally accompanied by subjective feelings of heat and lassitude, as well as by thirst, headache, and rapidity of the pulse ('feverishness', pyrexia, fever).
- When there are extremes of temperature, we know that there is great danger. High fever is indicated by temperatures above 39.5 °C (103.1 °F) in the morning and above 40.5 °C (104.9 °F) in the evening. Temperatures of 42 °C (107.6 °F) or more (hyperpyretic temperatures) in every known disease except relapsing fever in all probability indicate a fatal termination.
- Abnormally low temperatures may seriously disturb the various functions of the body; and when the fall is very considerable, it may render the continuation of life impossible [8].

These extracts are abridged from the very detailed description of the differing types of fevers accepted in nineteenth-century medicine. In the full text of 'Temperature in Diseases', Wunderlich provides a most comprehensive list of investigators, mainly German and European, who had studied the role of thermometry in man and animals. He also discusses the various sites on the human body where thermometry may be applied. Of the many potential sites, he showed that measurement in the hand or between the fingers and toes was too unreliable.
Rectal and vaginal sites were also criticized, the former being affected by masses of faeces and the latter lacking clinical evidence of reliability. The axilla and the mouth were advocated, with warnings about the effects of the ingestion of food and drink, and of oral breathing when suffering from nasal congestion.

Much of this work by Wunderlich and others was performed with large, slow thermometers, sometimes requiring 20 minutes to register fully. The need for a clinical thermometer with a narrow temperature range was obvious. It should also be a maximum-registering thermometer, small in size and able to be fitted into a protective case. In this way a physician could carry a stethoscope and a thermometer in his personal kit, thus increasing the use of temperature measurement in diagnosis. While different names have been associated with the arrival of the 'clinical thermometer', the Allbutt clinical thermometer was the first practical device to become commercially available.

Sir Thomas Clifford Allbutt (1836-1925) was a celebrated British physician. He spent 20 years working in Leeds, during which time he devised the small clinical thermometer. A local company, Harvey and Reynolds, first manufactured this special thermometer in 1867, followed by Thackeray in London. Allbutt made the design of his thermometer freely available to others, and it was quickly taken up by British physicians. It was notable in that the instrument, 15 centimetres long, had a constriction in the capillary tube that held the mercury at its reading after use, until shaken down to the lower limit of calibration (Fig. 3.3). The temperature reading was available in 5 minutes, and the scale was initially calibrated to 90-110 degrees Fahrenheit (32-43.3 °C). Later clinical thermometers were marked with the centigrade scale. Thomas Allbutt made several other significant contributions to medicine, including the ophthalmoscope. He received royal recognition in England, being awarded a knighthood in 1907, and was made president of the British Medical Association in 1920 [9, 10].

Although William Herschel in Britain had identified the existence of infrared radiation in 1800, it took many years for remote heat sensing to be developed. Throughout the 1930s and 1940s, this technology came into practical use, accelerated by the needs of the military during the Second World War. In the late 1950s, once infrared technology had been declassified, thermal imaging became available to medicine and industry. Although the early systems were slow scanners, it became clear that it was possible to record the temperature distribution of a human subject or an object. An important conference in 1964 at the New York Academy of Sciences revealed the true potential of this technology in the study of human body temperature [11].

Also in 1964, the German physician Dr Theodor Benzinger, who had moved to the United States, developed a small radiometric device to measure the temperature of the inner ear (tympanic membrane). In contrast to the very expensive early thermal imaging systems, this device promised a low-cost and reliable means of measuring temperature close to the brain, but without the invasive contact of thermocouples. Initially used only for military and space technology, the tympanic radiometer came into medicine some 30 years later. This was undoubtedly stimulated by concerns about the use of mercury in thermometers and its subsequent banning.
The radiometer was further developed in the United States for measuring temperature over the temporal artery and was also used to measure forehead temperature. The latter application is not always successful, as the forehead may be a site of profuse sweating, whether from physical exertion or from fever. After 50 years of ever-improving and cheaper thermal imaging, its use is still growing in medicine [12].

A significant chain of events during the severe acute respiratory syndrome (SARS) outbreak, and the subsequent pandemic threats from the haemagglutinin and neuraminidase (HN) viruses, has resulted in trials using thermal imaging of the face for airport screening of the travelling public. This has led the International Organization for Standardization to publish documents highlighting the essential requirements of thermal imaging cameras and their optimal use in this application. From this work, it is now established that a close-up thermogram of the subject's frontal face can be used to measure the temperature of the inner canthus of the eye and so detect fever by remote sensing (see Chap. 3) [13, 14].

Thus, the study of human body temperature continues to evolve, and the technology applied to it is still being developed [15]. Many pioneers in medicine, physiology, and the physical sciences have contributed to this story, which, inevitably, cannot be said to be over. Our knowledge of the science of the human body will doubtless continue to grow, yet the long centuries of battles against human disease have not yet come to an end.

References

Medical thermometry: a short history.
Swedish astronomers 1477-1900. Acta Universitatis Upsaliensis.
The early history of the thermocouple.
Tympanic thermometry in anaesthesia and surgery.
Physicists and physicians.
Ratio Medendi in Nosocomio Practico Vindobonensi. Vienna: Kruckten.
Medical thermometry and human temperature.
Medical thermometry.
Medical thermometry.
Thermography and its clinical applications.
Infrared thermal imaging in medicine. Topical review.
Medical electrical equipment. Part 2-56: Particular requirements for basic safety and essential performance of clinical thermometers for body temperature measurement. Switzerland: ISO/TC 121/SC 3 Lung ventilators and related equipment.
Medical electrical equipment: deployment, implementation and operational guidelines for identifying febrile humans using a screening thermography.
Radiometric temperature measurement.