ID: 1939459 • Letter: A
Question
A resistance-based thermometer is commonly used in industry to measure temperature. When the temperature of the sample under test increases, so does the resistance of the thermometer. Thus, by measuring the resistance, the temperature can be determined. For a typical thermometer, the resistance is known to increase by 10 Ohms for every 1 K rise in temperature. If the temperature of the sample changes from 60 °C to 50 °C, what is the expected change in resistance? What if the temperature changes from 60 °F to 50 °F?
Explanation / Answer
Since the thermometer's resistance increases by 10 Ω for every 1 K rise, a temperature drop produces a proportional resistance drop.

A change from 60 °C to 50 °C is a change of −10 °C. Because a step of one degree Celsius equals a step of one kelvin, ΔT = −10 K, so ΔR = (10 Ω/K) × (−10 K) = −100 Ω. The resistance decreases by 100 Ω.

A change from 60 °F to 50 °F is −10 °F. A Fahrenheit step is 5/9 of a kelvin step, so ΔT = −10 × (5/9) ≈ −5.56 K, giving ΔR ≈ (10 Ω/K) × (−5.56 K) ≈ −55.6 Ω.

Background: the basic differentiator between metals used as resistive elements is the linear approximation of the R-vs-T relationship between 0 and 100 °C, referred to as alpha (α). The equation below defines α; its units are Ω/Ω/°C:

α = (R100 − R0) / (100 · R0)

where R0 is the resistance of the sensor at 0 °C and R100 is the resistance of the sensor at 100 °C. Resistance thermometers are constructed in a number of forms and, in some cases, offer greater stability, accuracy, and repeatability than thermocouples. While thermocouples use the Seebeck effect to generate a voltage, resistance thermometers use electrical resistance and require a power source to operate. Ideally, the resistance varies linearly with temperature.
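The conversion above can be sketched in a short helper. This is an illustrative function (the name `delta_resistance` and its parameters are not from the original problem), assuming the stated sensitivity of 10 Ω per kelvin and the fact that a Fahrenheit degree step is 5/9 of a kelvin step:

```python
def delta_resistance(delta_temp, unit="K", ohms_per_kelvin=10.0):
    """Expected resistance change (ohms) for a temperature change.

    delta_temp -- temperature change in the given unit
    unit       -- 'K', 'C', or 'F' (a 1 degC step equals a 1 K step;
                  a 1 degF step equals 5/9 K)
    ohms_per_kelvin -- sensor sensitivity, 10 ohm/K in this problem
    """
    if unit in ("K", "C"):
        delta_kelvin = delta_temp          # same step size
    elif unit == "F":
        delta_kelvin = delta_temp * 5.0 / 9.0
    else:
        raise ValueError("unit must be 'K', 'C', or 'F'")
    return ohms_per_kelvin * delta_kelvin

print(delta_resistance(50 - 60, unit="C"))  # -100.0 ohms
print(delta_resistance(50 - 60, unit="F"))  # about -55.6 ohms
```

Note that only temperature *differences* are converted here; no 32 °F offset is needed because the offset cancels when subtracting two Fahrenheit readings.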