Measuring Dew Point is Critical to High-Quality Compressed Air
Dew point – the temperature at which water vapor in the air begins to condense into a liquid – is directly related to compressed air quality. If the dew point is too high, there will be too much moisture in the compressed air system, which can degrade the quality of the produced air, harm finished products and undermine equipment reliability. For this reason, measuring dew point is critical to achieving high-quality compressed air and to the efficiency and effectiveness of the compressed air system and downstream equipment.

What is Dew Point?
To operate, an air compressor pulls in ambient air, which contains water vapor. During compression the air temperature increases and the moisture is concentrated; as the air cools on leaving the compressor, the water vapor condenses into liquid. If that liquid is not properly addressed with air dryers or air coolers, it can enter the compressed air system and cause a host of issues, such as corrosion and contamination that degrade the quality of the produced air and finished products and/or result in equipment failure.
Dew point is a factor here because it is the temperature at which the water vapor in the compressed air transforms from a vapor into liquid condensation. The lower the dew point temperature, the drier the air; conversely, the higher the dew point temperature, the more moisture the compressed air will contain.
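To make that relationship concrete, the short sketch below estimates dew point from air temperature and relative humidity using the widely published Magnus approximation (the constants 17.62 and 243.12°C are one common parameter set, and the example readings are illustrative only):

```python
import math

# Magnus approximation constants (one common parameter set).
A = 17.62
B = 243.12  # degrees C

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Estimate the dew point (deg C) from temperature and relative humidity."""
    gamma = math.log(rh_percent / 100.0) + (A * temp_c) / (B + temp_c)
    return B * gamma / (A - gamma)

# Moist air: 25 C (77 F) at 60% relative humidity.
print(round(dew_point_c(25.0, 60.0), 1))  # ~16.7 C (~62 F) -- high dew point
# Drier air: same temperature at 20% relative humidity.
print(round(dew_point_c(25.0, 20.0), 1))  # ~0.5 C (~33 F) -- low dew point
```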
The problems caused by high dew point temperatures and moist air include:
- Moisture-induced corrosion within the air compression system that may result in damage or failure.
- Corrosion or contamination that can degrade compressed air quality.
- Damage to the tools and equipment operated by the compressed air.
- Contamination such as corrosion, rust and bacteria entering finished products or creating issues with paints and coatings applied via air-operated equipment.
- Condensed water freezing within the air delivery system and causing machine and equipment failure.
- False process control instrumentation readings due to corrosion, rust and other contaminants captured within the system.
But what is a high dew point temperature? While dew point requirements vary from system to system, for most the dew point of the compressed air should be maintained below 50°F. So, in general industry, a high dew point temperature is anything above 50°F; however, in some applications the threshold for a high dew point is much lower than the general 50°F. See more on this below.
Dew Point vs. Humidity
In addition to dew point temperatures, air compressor users should also be concerned with relative humidity because, although the two are different, they are related.
Relative humidity is the measure of how much water vapor is in the air relative to the maximum amount the air can hold at its current temperature.
But why is relative humidity important in relation to dew point temperature? The amount of moisture in the compressed air system is affected by the humidity of the intake air: the higher the humidity, the more water vapor is drawn into the system. If the system then operates at a high dew point temperature, the saturated air will condense, producing excess moisture that can enter the air compression system and cause corrosion and contamination that may result in low-quality compressed air and equipment failure.
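To see why humid intake air is such a concern, the sketch below scales the water vapor's partial pressure by the compression ratio and inverts the same Magnus relation used above to estimate the pressure dew point of the compressed air; the 7.8:1 ratio (atmospheric to roughly 100 psig) and the intake conditions are assumed example values:

```python
import math

A, B = 17.62, 243.12  # Magnus approximation constants (deg C)

def saturation_pressure_hpa(temp_c: float) -> float:
    """Saturation vapor pressure of water (hPa), Magnus approximation."""
    return 6.112 * math.exp(A * temp_c / (B + temp_c))

def pressure_dew_point_c(intake_c: float, rh_percent: float,
                         compression_ratio: float) -> float:
    """Estimate the dew point of compressed air from intake conditions.

    Assumes the water vapor's partial pressure scales with the absolute
    pressure ratio, which holds until condensation actually begins.
    """
    partial_hpa = saturation_pressure_hpa(intake_c) * rh_percent / 100.0
    partial_hpa *= compression_ratio
    x = math.log(partial_hpa / 6.112)
    return B * x / (A - x)

# Intake air at 25 C (77 F) and 60% RH, compressed ~7.8:1 (to ~100 psig):
print(round(pressure_dew_point_c(25.0, 60.0, 7.8), 1))  # ~53.7 C (~129 F)
# Any surface cooler than ~129 F will now condense water out of this air,
# which is why humid intake air demands drying downstream of the compressor.
```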
Measuring Dew Point to Ensure Quality Compressed Air
In many industries, measuring dew point is essential to preventing corrosion and contamination within the air compression system. And, in industries such as food and beverage, electronics, medical device and pharmaceutical manufacturing where compressed air quality testing is required to guarantee the purity of the air, dew point monitors are critical to ensuring compliance, as well as high-quality compressed air and safe finished products.
In applications that must meet ISO classes for air purity, it is crucial to meet the dew point requirements for allowable water content as defined by ISO 8573-1. Under the standard, compressed air in purity Classes 1 through 6 must meet strict pressure dew point limits. For example, in Class 6 applications the pressure dew point must be at or below 50°F (10°C), while in Class 1 applications it must be at or below -94°F (-70°C).
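For reference, here is a minimal class check against the ISO 8573-1:2010 pressure dew point limits for water vapor (the helper function is our own illustration; the limit values are from the published standard):

```python
# ISO 8573-1:2010 maximum pressure dew points for water vapor, Classes 1-6.
ISO_8573_PDP_LIMITS_C = {
    1: -70.0,  # -94 F
    2: -40.0,  # -40 F
    3: -20.0,  # -4 F
    4: 3.0,    # ~37 F
    5: 7.0,    # ~45 F
    6: 10.0,   # 50 F
}

def meets_iso_class(measured_pdp_c: float, purity_class: int) -> bool:
    """True if a measured pressure dew point satisfies the given ISO class."""
    return measured_pdp_c <= ISO_8573_PDP_LIMITS_C[purity_class]

print(meets_iso_class(-45.0, 2))  # True: -45 C is below the Class 2 limit of -40 C
print(meets_iso_class(-45.0, 1))  # False: Class 1 requires -70 C or lower
```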
Fortunately, measuring dew point is relatively simple. To ensure quality compressed air in any industry, a continuous dew point monitor reports the dew point at which the system is currently running, allowing users to be certain they are maintaining the appropriate moisture level. Another option is a dew point sensor, which measures temperature and trace moisture to determine the point at which condensation begins.
However, if the application is very sensitive, a standalone dew point meter, such as a hygrometer, can precisely measure moisture levels in the process air. Some hygrometers measure moisture in the air using evaporative cooling or a material that contracts and expands with changes in humidity. Digital dew point meters and hygrometers use sensors to measure moisture levels and accurately pinpoint the dew point.
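In practice, a continuous monitor usually feeds an alarm threshold. The loop below is purely illustrative: read_dew_point_f() is a hypothetical stand-in for whatever read call your instrument's actual interface provides, and the 50°F threshold is the general-industry guideline discussed above:

```python
import time

ALARM_THRESHOLD_F = 50.0  # general-industry guideline; adjust per application
POLL_INTERVAL_S = 60      # assumed polling interval

def read_dew_point_f() -> float:
    """Hypothetical stand-in: replace with your instrument's actual read call."""
    raise NotImplementedError

def monitor_dew_point() -> None:
    # Poll the monitor and flag any excursion above the alarm threshold.
    while True:
        dew_point = read_dew_point_f()
        if dew_point > ALARM_THRESHOLD_F:
            print(f"ALERT: dew point {dew_point:.1f} F exceeds {ALARM_THRESHOLD_F:.0f} F")
        time.sleep(POLL_INTERVAL_S)
```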
By measuring dew point, it is possible to prevent moisture-induced corrosion that can impact the integrity of the treated air. And in applications governed by standards that require dew point monitoring as part of compressed air quality testing, measuring dew point is essential for compliance and safety.
For assistance measuring dew point in your compressed air system or lowering the moisture content in your treated air, contact the pros at JHFOSTER, a Tavoron company, who can help you find the right dew point monitor, dew point meter or air-drying solution for your application.