# Temperature – Definition & Units

Temperature is defined as the degree of hotness of a substance. A difference in temperature is required for heat to flow, and heat flows from the higher temperature to the lower temperature. This is closely analogous to water flowing from higher pressure to lower pressure: the difference in temperature corresponds to the difference in pressure, and the flow of heat corresponds to the flow of water. It is worth remembering that temperature and heat are related but distinct quantities.

Temperature is measured on temperature scales using an instrument called a thermometer. Mercury thermometers are the conventional type, though digital thermometers are now more widely used.

Celsius Scale – The Celsius scale reads 0 °C when ice melts at atmospheric pressure and 100 °C when water boils at atmospheric pressure. The interval between the melting point of ice and the boiling point of water is divided into 100 equal divisions, each equal to 1 °C.

Fahrenheit Scale – The Fahrenheit scale reads 32 °F at the melting point of ice and 212 °F when water boils at atmospheric pressure. The interval between the melting and boiling points is divided into 180 equal divisions, each equal to 1 °F.

To convert a temperature expressed in °C to °F, use the formula below:
°F = 1.8 × °C + 32

To convert a temperature in °F to °C:
°C = (°F − 32)/1.8
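The two formulas above can be sketched as a pair of small Python helpers (the function names `c_to_f` and `f_to_c` are illustrative choices, not part of the original text):

```python
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit: °F = 1.8 × °C + 32."""
    return 1.8 * celsius + 32


def f_to_c(fahrenheit):
    """Convert degrees Fahrenheit to degrees Celsius: °C = (°F − 32) / 1.8."""
    return (fahrenheit - 32) / 1.8


# The fixed points of the two scales check out:
print(c_to_f(0))    # 32.0  (melting point of ice)
print(c_to_f(100))  # 212.0 (boiling point of water)
print(f_to_c(212))  # 100.0
```

Because each function is the exact inverse of the other, converting a value one way and then back recovers the original temperature.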

Some important temperature values used in practice are tabulated below in both the Celsius and Fahrenheit scales:

| Reference point | °C | °F |
|---|---|---|
| Melting point of ice | 0 | 32 |
| Typical room temperature | 20 | 68 |
| Normal human body temperature | 37 | 98.6 |
| Boiling point of water | 100 | 212 |

To measure temperature, keep the thermometer bulb in the air stream or liquid whose temperature is to be measured. With a digital thermometer, place the sensor at the measurement point, read the value directly, and note the unit as well.
A good understanding of the desired temperature for each application is essential: adopting the most suitable temperature often helps reduce excess safety margins and lowers the capital and operating costs of many applications.