We've had a number of design discussions and done extensive testing on enclosure heating/cooling solutions for panel builds installed outdoors in various climates (with likely much more testing to come). Overall panel dissipation is not high (~100 W), and the enclosures are NEMA 4 rated, around 2x3 ft.
For now we have settled on a solution using a programmable relay and a combined analog temperature/relative humidity sensor to set trigger temperatures for switching the heater and cooling system on and off. This is nice because we can make remote adjustments if needed, as opposed to using mechanical thermostats. We try to maintain a max temp of around 97F (as recommended in numerous enclosure manufacturers' manuals) and a min temp of around 65F (chosen somewhat arbitrarily based on information from a Rittal manual).
Currently we are only using temperature for control; the relative humidity is read out for reference only. However, there has been some debate about whether there is value in using humidity for heater control as well. Condensation is really the major issue I see here; the problem is we don't have access to dew point data in the relay. One thing I considered was using one of the dew point correlation equations to calculate dew point from temperature and relative humidity, then using that value to turn on the heat and keep the enclosure above the calculated dew point. However, this may be overkill, and I'm a little concerned about the reliability of that calculated value.
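For reference, the kind of correlation I had in mind is the Magnus approximation, which gives dew point from temperature and relative humidity to within roughly a degree over ordinary ambient conditions. A minimal sketch (the 5F safety margin and function names here are just illustrative assumptions, not anything from our actual relay program):

```python
import math

def dew_point_f(temp_f: float, rh_percent: float) -> float:
    """Dew point (F) from dry-bulb temperature (F) and relative humidity (%),
    using the Magnus correlation with coefficients a=17.62, b=243.12 C."""
    a, b = 17.62, 243.12
    t_c = (temp_f - 32.0) / 1.8                   # convert to Celsius
    gamma = math.log(rh_percent / 100.0) + a * t_c / (b + t_c)
    dp_c = b * gamma / (a - gamma)
    return dp_c * 1.8 + 32.0                      # convert back to Fahrenheit

# Hypothetical control rule: run the heater whenever enclosure temperature
# falls within a chosen margin of the calculated dew point.
MARGIN_F = 5.0

def heater_should_run(temp_f: float, rh_percent: float) -> bool:
    return temp_f < dew_point_f(temp_f, rh_percent) + MARGIN_F
```

For example, at 77F and 50% RH this gives a dew point of about 57F, so the heater would stay off; at 60F and 95% RH the dew point is about 59F, within the margin, so the heater would run. Whether logic like this can be expressed in the programmable relay itself is a separate question, since many smart relays only support simple setpoint comparisons.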
So my questions:
1) Do you feel it's worth using the relative humidity for heater control? Or just set it by temp and forget it?
2) Is the lower-end temp of 65F reasonable, or should it be cranked up to simply maintain, say, 80F+ at all times (which would better mitigate any condensation issues)?
I know this is very general (I have a rather specific climate in mind for the above application, and I know any answers will probably vary considerably by climate), but any input would be appreciated.