Best Practice in Relative Humidity Calibration: Principles, Methods and Implementation

The Importance of Measuring Humidity

Relative humidity (RH) is a critical environmental parameter across numerous domains. It plays a key role in food and seed preservation, influences pharmaceutical product shelf life, and is vital for climate control in many manufacturing plants.

Errors in humidity readings can result in compromised product quality, unstable processes, or non-compliance with regulatory thresholds. In any context where environmental control is critical, reliable RH measurement is essential.

Understanding Relative Humidity and its Dependencies

Relative humidity is defined as the ratio of the partial pressure of water vapor in the air to the saturation vapor pressure of water at the same temperature, expressed as a percentage. Put more simply, it describes how much water vapor the air holds compared with how much it could hold before condensation occurs.

It is highly dependent on temperature and dew point, meaning that even minor fluctuations in these variables can cause significant changes in RH. This sensitivity underlines the necessity of stable environmental conditions when calibrating humidity sensors.
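This temperature sensitivity can be illustrated with the widely used Magnus approximation for saturation vapor pressure (a sketch only; coefficients vary slightly between published forms, and these values are illustrative, not calibration-grade):

```python
import math

def saturation_vp(t_c):
    """Magnus approximation of saturation vapor pressure (hPa)
    over water, using one common set of coefficients."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def rh_from_dew_point(t_c, td_c):
    """Relative humidity (%) from air temperature and dew point (degC)."""
    return 100.0 * saturation_vp(td_c) / saturation_vp(t_c)

# Hold the dew point fixed at 10 degC and vary air temperature slightly:
# RH shifts by roughly 3 %rh for each 1 degC change.
for t in (20.0, 21.0, 22.0):
    print(f"{t:.1f} degC -> {rh_from_dew_point(t, 10.0):.1f} %rh")
```

A one-degree temperature drift in a calibration chamber therefore produces an RH shift larger than the uncertainty of many reference methods, which is why thermal stability matters so much.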

Calibration: Definition and Engineering Relevance

Calibration is the process of comparing the output of a measurement instrument with a known reference under defined, controlled conditions where equilibrium has been reached. It does not involve altering or adjusting the instrument under test. Instead, it quantifies any deviations from the reference, allowing users to determine performance, apply corrections if necessary, and assess whether the instrument is suitable for continued use.

Through calibration, important characteristics such as measurement error, repeatability, temperature coefficients, and long-term drift can be evaluated. This process forms the basis for assigning uncertainty values and validating instrument performance, particularly in regulated environments where accuracy and traceability are critical.
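As a sketch of how such characteristics might be quantified, the following uses hypothetical readings (not from any real instrument) to compute the systematic error and repeatability of repeated measurements at a single reference point:

```python
import statistics

def characterize(readings, reference):
    """Summarize basic calibration statistics for repeated readings
    taken against a known reference value (all in %rh)."""
    mean = statistics.fmean(readings)
    return {
        "error": mean - reference,                    # systematic offset
        "repeatability": statistics.stdev(readings),  # spread of repeats
    }

# Hypothetical: five probe readings at a 50.0 %rh reference point
result = characterize([50.8, 50.6, 50.9, 50.7, 50.8], reference=50.0)
print(result)
```

In practice these figures, evaluated across multiple points and conditions, feed into the instrument's uncertainty budget.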

Overview of Humidity Calibration Methods

Several established techniques are used for humidity calibration, depending on the desired accuracy, operational conditions, and sensor types.

  1. Non-saturated salt solutions are commonly used in compact stainless-steel chambers to produce fixed humidity points. These solutions are based on aqueous salts with known vapor pressure properties and are typically supplied with traceable certificates. They are temperature dependent and usually achieve uncertainties in the range of 0.5 %rh to 1.3 %rh.

  2. Saturated salt solutions produce stable reference environments over time. The resulting vapor pressure – and thus RH – is constant under stable temperature conditions. However, care must be taken to maintain the presence of undissolved crystals, as drying or over-wetting can affect performance.

  3. Humidity-only calibrators, such as the HC100A from Rotronic, are portable devices that allow multiple sensors to be assessed simultaneously at ambient temperature. These systems cover a broad RH range and are well suited to field applications or batch checks.

  4. Humidity and temperature calibrators, such as the Rotronic HygroGen2, offer enhanced capability, enabling RH calibration across a range of controlled temperature conditions.

These systems support high-stability environments and are critical for reducing uncertainties during calibration. Their flexibility makes them particularly useful for in-situ or process-related calibrations.
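As an illustration of salt fixed-point behavior, the sketch below encodes a few commonly cited equilibrium values for saturated salt solutions at 25 degC (approximate literature values; real calibration work must rely on certified, traceable data for the actual temperature of use):

```python
# Approximate equilibrium RH (%rh) of common saturated salt solutions
# at 25 degC. Illustrative values only; consult certified data.
SALT_FIXED_POINTS_25C = {
    "LiCl": 11.3,
    "MgCl2": 32.8,
    "NaCl": 75.3,
    "K2SO4": 97.3,
}

def nearest_salt(target_rh):
    """Pick the salt whose 25 degC fixed point is closest to a target %rh."""
    return min(SALT_FIXED_POINTS_25C,
               key=lambda s: abs(SALT_FIXED_POINTS_25C[s] - target_rh))

print(nearest_salt(70.0))  # NaCl
```

Because these fixed points are temperature dependent, the chamber temperature must be known and stable for the certified RH value to apply.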

Chilled Mirror Hygrometers: A High-Precision Reference

A chilled mirror hygrometer offers a reliable and repeatable reference for RH calibration. It operates by cooling a mirror surface until condensation occurs. At equilibrium – where condensation and evaporation rates are equal – the mirror temperature is equal to the dew point. This is measured directly using a platinum resistance thermometer, without the need for calculated or inferred variables.

Due to their low drift and inherent repeatability, chilled mirror hygrometers are widely accepted as a high-quality reference standard, particularly in laboratory settings where precision and consistency are paramount. Michell offers a range of chilled mirrors, including the Optidew and the S8000 Remote.

Verifying Instruments: The Role of Single-Point Checks

It is important to distinguish between a single-point check and a full calibration. A single-point check involves comparing a humidity sensor to a calibrated reference at one prevailing condition. While this can be useful as a quick validation or functional test – especially in mechanical or HVAC systems – it does not qualify as a calibration.

True calibration requires comparison at a series of conditions, conducted under equilibrium, in order to fully characterize the instrument's performance. Single-point checks should never replace formal calibration where accurate and traceable data is required.
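A minimal sketch of how such multi-point data could be used, assuming hypothetical reference/reading pairs and a simple least-squares linear correction (actual correction schemes depend on the instrument and the calibration procedure):

```python
# Hypothetical multi-point calibration data, all in %rh
references = [10.0, 35.0, 50.0, 80.0]  # reference standard values
readings   = [10.6, 35.9, 51.1, 81.5]  # instrument-under-test readings

# Ordinary least-squares fit: reference = slope * reading + offset
n = len(references)
mx = sum(readings) / n
my = sum(references) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(readings, references))
         / sum((x - mx) ** 2 for x in readings))
offset = my - slope * mx

def corrected(reading):
    """Apply the fitted correction to a raw reading."""
    return slope * reading + offset

print(f"slope={slope:.4f}, offset={offset:+.2f}")
```

A single-point check could never reveal the slope component of such an error, which is precisely why characterization across several points matters.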

Conclusion

Relative humidity calibration is a fundamental part of maintaining data quality, system reliability, and regulatory compliance. By implementing proven calibration methods, working under controlled conditions, and understanding the physical dependencies of RH, engineers can ensure the long-term accuracy and functionality of their measurement systems.

Author: Kasia Szewczyk, Chilled Mirror and Calibration Product Manager
