
RECOMMENDED DATA CENTER TEMPERATURE & HUMIDITY: PREVENTING COSTLY DOWNTIME CAUSED BY ENVIRONMENT CONDITIONS


NOTE: An updated article with recommendations from the 2016 report can now be found
online at this link.

By Rick Grundy
November 14, 2005

Monitoring the environment conditions in a computer room or data center is critical to ensuring
uptime and system reliability. A report from the Gartner Group in late 2003 estimated that the
average hourly cost of downtime for a computer network at that time was $42,000. It has likely gone
up dramatically. At these high costs, even companies with 99.9% uptime lose hundreds of thousands
of dollars each year in unplanned downtime. Maintaining recommended temperature and humidity
levels in the data center can reduce unplanned downtime caused by environment conditions and save
companies thousands or even millions of dollars per year.

Recommended Computer Room Temperature

Operating expensive IT computer equipment for extended periods of time at high temperatures
greatly reduces its reliability and the longevity of its components, and will likely cause unplanned downtime.
Maintaining an ambient temperature range of 68 to 75F (20 to 24C) is optimal for system
reliability. This temperature range provides a safe buffer for equipment to operate in the event of air
conditioning or HVAC equipment failure while making it easier to maintain a safe relative humidity
level.

It is a generally agreed upon standard in the computer industry that expensive IT equipment should
not be operated in a computer room or data center where the ambient room temperature has
exceeded 85F (30C).

In today's high-density data centers and computer rooms, measuring the ambient room temperature
is often not enough. The temperature of the air entering a server can be measurably higher
than the ambient room temperature, depending on the layout of the data center and the
concentration of heat-producing equipment such as blade servers. Measuring the temperature of the
aisles in the data center at multiple heights can give an early indication of a potential
temperature problem. For consistent and reliable temperature monitoring, place a temperature sensor
at least every 25 feet in each aisle, with sensors placed closer together if high-temperature equipment
like blade servers is in use. We recommend installing TemPageR, Room Alert 7E or Room Alert 11E
rack units at the top of each rack in the data center. As the heat generated by the components in the
rack rises, TemPageR and Room Alert units will provide an early warning and notify staff of
temperature issues before critical systems, servers or network equipment are damaged.

Recommended Computer Room Humidity

Relative humidity (RH) is defined as the amount of moisture in the air at a given temperature in
relation to the maximum amount of moisture the air could hold at the same temperature. In a data
center or computer room, maintaining ambient relative humidity levels between 45% and 55% is
recommended for optimal performance and reliability.
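
For readers who want to put this definition into numbers, here is a minimal sketch using the Magnus approximation for saturation vapor pressure. The coefficients are one commonly published parameter set and the function names are our own, not from any monitoring product:

```python
# Minimal sketch of the relative humidity definition above.
# Saturation vapor pressure via the Magnus approximation
# (17.62 / 243.12 is one common coefficient set; illustrative only).
import math

def saturation_vapor_pressure_hpa(temp_c: float) -> float:
    """Approximate maximum (saturation) vapor pressure in hPa at temp_c."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(actual_vapor_pressure_hpa: float, temp_c: float) -> float:
    """RH (%) = actual moisture / maximum moisture the air could hold at the same temperature."""
    return 100.0 * actual_vapor_pressure_hpa / saturation_vapor_pressure_hpa(temp_c)

# Example: air at 22C holding 13.2 hPa of water vapor sits at about 50% RH.
print(round(relative_humidity(13.2, 22.0)))  # -> 50
```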

When relative humidity levels are too high, water condensation can occur, resulting in hardware
corrosion and early system and component failure. If the relative humidity is too low, computer
equipment becomes susceptible to electrostatic discharge (ESD), which can damage sensitive
components. When monitoring the relative humidity in the data center, we recommend early warning
alerts at 40% and 60% relative humidity, with critical alerts at 30% and 70% relative humidity. It is
important to remember that relative humidity is directly related to the current temperature, so
monitoring temperature and humidity together is critical. As the value of IT equipment increases, the
risk and associated costs of an environment-related failure increase dramatically.
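
As an illustration of those thresholds, here is a minimal sketch; the function name and return labels are our own, not a Room Alert API:

```python
# Hedged sketch of the humidity alert thresholds recommended above:
# early warning at 40% / 60% rH, critical at 30% / 70% rH.
def humidity_alert_level(rh_percent: float) -> str:
    if rh_percent <= 30 or rh_percent >= 70:
        return "CRITICAL"   # critical alert thresholds
    if rh_percent <= 40 or rh_percent >= 60:
        return "WARNING"    # early warning thresholds
    return "OK"             # no alert per the thresholds above

print(humidity_alert_level(52))  # OK
print(humidity_alert_level(38))  # WARNING
print(humidity_alert_level(72))  # CRITICAL
```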

About AVTECH Software

AVTECH Software, a private corporation founded in 1988, is a computer software and hardware
manufacturer specializing in network-enabled monitoring hardware and Windows
NT/2K/XP/2K3 based software to monitor multi-OS computers and network issues throughout a
department or an entire enterprise. When issues or events occur, AVTECH Software products use
today's most advanced alerting technologies to communicate critical and important status information
to remote system managers and IT professionals via mobile phones, pagers, PDAs, email, the web
and more. Automatic corrective actions can also be taken to immediately resolve issues, run scripts,
and shut down or restart servers or applications.

AVTECH Software is now the premier worldwide manufacturer of environment monitoring equipment
specifically designed to monitor today's advanced computer rooms and data centers. Our Room Alert
and TemPageR products are used by the Fortune 1000, the Pentagon, the United Nations, the U.S.
Government, universities, and organizations large and small, in all types of industries.
DATA CENTER & SERVER ROOM MONITORING RECOMMENDED STANDARDS & BEST PRACTICES

On this page you will find the standards recommended by ASHRAE for monitoring the environment in
your data center or server room. The settings below apply to A1-A4 class data centers and server
rooms. Environmental standards are provided for rack level monitoring, ambient monitoring
and water leak detection.

| Application | Location Of Sensors | Settings | Recommended Sensors |
| Ambient Room Temperature & Humidity | small rooms: center; data centers: potential hot zones | 18-27C / 64-80F; 40% - 60% rH | Temperature & Humidity Sensor |
| Rack Level Intake Temperature | ASHRAE recommends 3 per rack: front (top, middle, bottom) | 18-27C / 64-80F | Temperature Sensor |
| Rack Level Outtake Temperature | ASHRAE recommends 3 per rack: back (top, middle, bottom) | less than 20C / 35F difference from inlet temperature (typically <40C / 105F) | Temperature Sensor |
| HVAC & Airco Monitoring | next to each HVAC unit, to monitor its working state; water leak sensor under each HVAC unit, to detect leaks | settings depend on the room, to ensure 18-27C at rack level and 40-60% rH at room level | Temperature & Humidity Sensor; Water Leak & Flooding Sensor |

1. Rack Level Monitoring

Summary: ASHRAE recommends no less than 6 temperature sensors per rack; however, Gartner
suggests that 3 may already be enough. Intake temperature should be between 18-27C / 64-80F.
Outtake temperature should be no more than 20C / 35F above the intake temperature.
Background Info: Based on a recent Gartner study, the cost of running a Wintel rack averages
around $70,000 USD per year. This excludes the business cost of a rack. Risking business
continuity or even your infrastructure due to environmental issues is not an option. But what are the
environmental threats at a rack level?

A mistake often made is to rely only on monitoring conditions at a room level and not at a rack
level. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)
recommends no less than 6 temperature sensors per rack in order to safeguard the equipment.
They should be mounted at the top, middle and bottom, at both the front and back of the rack.
When a heat issue arises, air conditioning units will initially try to compensate for the problem. If you
are monitoring temperature at a room level only, the issue will only be detected once
the air conditioning units are no longer capable of compensating for the heat problem. By then it may
be too late. Keep in mind that rack temperature is also not server temperature.
As Gartner notes, you can start monitoring rack temperature with 3 measurement
points: one at the bottom front of the rack to verify the temperature of the cold air arriving at the rack
(combined with airflow monitoring); one at the top front of the rack to verify that all cold air gets to the top
of the rack; and finally one at the top back of the rack, which is typically the hottest point of the rack.
Intake temperature should be between 18-27C / 64-80F. Outtake temperature should be no more
than 20C / 35F above the intake temperature.
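
As a quick illustration, here is a hedged sketch of those two rules; the function name and limits come from the summary above, not from any vendor API:

```python
# Hedged sketch of the rack temperature rules above: intake within
# 18-27C, outtake no more than 20C above intake, and typically
# below 40C absolute.
def rack_temps_ok(intake_c: float, outtake_c: float) -> bool:
    intake_ok = 18.0 <= intake_c <= 27.0      # ASHRAE intake range
    delta_ok = (outtake_c - intake_c) < 20.0  # intake/outtake difference
    absolute_ok = outtake_c < 40.0            # typical outtake ceiling
    return intake_ok and delta_ok and absolute_ok

print(rack_temps_ok(22.0, 35.0))  # True
print(rack_temps_ok(22.0, 44.0))  # False: delta and absolute limit both exceeded
```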
What is the impact of temperature on your systems? High-end systems have auto-shutdown
capabilities to safeguard themselves against failures when the temperature is too high. But did you know
that CPU-level errors, and hence errors in your applications, can be temperature induced? On top of
that, heat stresses fan equipment even more, which reduces equipment life expectancy. All this
affects your systems' availability and your business continuity.
For small rooms it is recommended to use wired sensors. In larger data centers where you
need many monitoring points, wireless sensors can offer a cheaper alternative. With wireless
sensors no cabling is required. You'll also need substantially fewer IP addresses, as 30
temperature sensors can talk to 1 base unit.

2. Ambient Room Monitoring

Summary: For room monitoring, temperature should be maintained between 18-27C / 64-80F and
humidity between 40% and 60% rH. Dew point temperature should stay between 5.5C DP and
15C DP, with relative humidity not exceeding 60%.
Background Info: Ambient server room monitoring or data center monitoring is the environmental
monitoring of the room for its humidity and temperature levels. Temperature and humidity
sensors are typically deployed in:

air conditioning units, to detect failure of such systems.

When multiple air conditioning systems are available in a room, a failure of one system will
initially be compensated by the others, before it leads to a total failure of the cooling system due
to overload. As a result, temperature / airflow sensors are recommended near each unit to provide
early failure detection.
Monitoring humidity is as important as monitoring temperature, yet it is often omitted. Did you know that the
relative humidity (rH) in server rooms and data centers should be between 40% and 60% rH? Air that is too dry
will result in the build-up of static electricity on the systems. Too humid, and corrosion will slowly start
damaging your equipment, resulting in permanent equipment failures.
When using cold corridors inside the data center, the ambient air temperature outside the
corridor may be at higher levels. Air temperatures of 37C / 99F are not uncommon in such setups.
This significantly reduces the energy cost. However, it also means that temperature
monitoring is of the utmost importance, as a failing air conditioning unit will have a much faster impact on
the systems' lifetime and availability (fan stress, CPU overheating, etc.), and running a room at higher
temperatures may also affect non-rack-mounted equipment.
When using hot corridors it is important to monitor temperature across the room to ensure that
sufficient cold air gets to each rack. In this case one can also rely on rack-based
temperature sensors in addition to temperature and humidity sensors close to each air conditioning
unit.

3. Water & Flooding Monitoring

Summary: Water leak sensors should be placed around the perimeter of the room, under each AC unit
and under each pipe running through the server room or data center.
Background Info: Water leakage is a lesser-known threat for server rooms & data centers. The fact
that most data centers and server rooms have raised floors makes the risk even greater, as water
seeks the lowest point.
Two types of water leakage sensors are commonly found: spot sensors and water snake cable sensors.
Spot sensors trigger an alert when water touches the unit. Water rope or water snake cable
sensors use a conductive cable whereby contact at any point on the cable triggers an alert. The
latter type is recommended over the former due to its greater coverage and higher accuracy.
If using a raised floor, one should consider putting the sensor under the raised floor, as water
seeks the lowest point.
The four main sources of water in a server room are:

leaking air conditioning systems: a water sensor should be placed under each AC unit;
water leaks in floors or the roof above the room: water sensors should be placed around the
perimeter of the room, at around 50cm (about 20in) from the outer walls, and under the raised floor;
leaks of water pipes running through server rooms: a water sensor should be placed under
the raised floors;
traditional flooding: the same guidance as for water leaks from the roof or floors above applies.

The water leak sensors provided by ServersCheck can be daisy chained. This gives you a rope of up
to 30m to monitor for water ingress. Need more than 30m? Simply add another sensor to your
configuration. A base unit with the optional Sensorhub can support up to 8 water leak sensors for a
total of 240m of leak detection.

4. Implementing standards with ServersCheck's SNMP and Modbus sensors

All sensors connect to our SensorGateway (base unit). A base unit supports up to 2 wired sensors,
or up to 8 with the optional sensor hub.
| Application | Location | Setting | Sensor Name | Part Number |

Rack Level Monitoring

| Sensors to monitor intake temperature | Front - bottom of rack for room or floor cooling, top of rack for top cooling | 18-27C / 64-80F | Wired or Wireless Temperature probes* | ENV-TEMP or ENV-W-TEMP |
| Sensors to monitor outtake temperature | Back - top of rack (hot air climbs) | less than 20C / 35F difference from inlet temperature (typically <40C / 105F) | Wired or Wireless Temperature probes* | ENV-TEMP or ENV-W-TEMP |

Ambient Monitoring

| Temperature & humidity monitoring in server room | small server rooms: center of the room; data centers: potential hot zones, furthest away from airco units | Temperature depends on type of room setup; Humidity: 40-60% rH | Temperature & Humidity Sensor Probe* | ENV-THUM |

Airconditioning Monitoring

| Early detection of failing air conditioning units | next to airco units | Temperature depends on setting of airco; Humidity: 40-60% rH | Temperature & Humidity Sensor Probe* | ENV-THUM |

Water Leaks / Flooding

| Detecting water leaks coming from outside of room | Around outside walls of server room / data center and under raised floor; best is to keep 30-50cm / 10-20" from outer wall | - | Flooding Sensor Probe* with 6m/20ft water sensitive cable | ENV-W-LEAK-COMBO |
| Detecting water leaks from air conditioning units | Under each air conditioning unit | - | Flooding Sensor Probe* with 6m/20ft water sensitive cable | ENV-W-LEAK-COMBO |
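
As an illustration of how such readings could be polled over SNMP, here is a hedged sketch using the open-source pysnmp library. The host address, community string, and OID below are placeholders, not ServersCheck values; the actual temperature OID must be taken from the sensor's MIB, and the gateway is assumed to speak SNMP v2c:

```python
# Hedged sketch: polling one temperature value over SNMP with pysnmp.
# All addresses and OIDs are placeholders for illustration only.
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

TEMP_OID = "1.3.6.1.4.1.99999.1.1.0"  # placeholder; consult the sensor's MIB

error_indication, error_status, error_index, var_binds = next(
    getCmd(SnmpEngine(),
           CommunityData("public"),                     # read community (assumed)
           UdpTransportTarget(("192.168.1.100", 161)),  # base unit address (assumed)
           ContextData(),
           ObjectType(ObjectIdentity(TEMP_OID)))
)

if error_indication or error_status:
    print("SNMP poll failed:", error_indication or error_status.prettyPrint())
else:
    for oid, value in var_binds:
        print(f"{oid} = {value}")  # raw reading as reported by the gateway
```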

As defined by ASHRAE:
Class A1: Typically a data center with tightly controlled environmental parameters (dew point,
temperature, and relative humidity) and mission critical operations; types of products typically
designed for this environment are enterprise servers and storage products.
Class A2: Typically an information technology space or office or lab environment with some control of
environmental parameters (dew point, temperature, and relative humidity); types of products
typically designed for this environment are volume servers, storage products, personal computers,
and workstations.
Class A3/A4: Typically an information technology space or office or lab environment with some
control of environmental parameters (dew point, temperature, and relative humidity); types of
products typically designed for this environment are volume servers, storage products, personal
computers, and workstations.
***********
DATA CENTER TEMPERATURE AND HUMIDITY RANGE RECOMMENDATIONS

By Carol Baroudi, Jeffrey Hill, Arnold Reinhold, Jhana Senxian

Part of Green IT For Dummies Cheat Sheet

In consulting with computer manufacturers, the American Society of Heating, Refrigerating and Air-
Conditioning Engineers (ASHRAE) changed its recommendations for air temperature and humidity in
data centers and computer rooms. The following chart shows the new recommendations for safely
maintaining IT equipment.

| Parameter | Fahrenheit | Celsius |
| High limit | 80.6F | 27C |
| Low limit | 64.4F | 18C |
| Maximum relative humidity | 60 percent | |
| Maximum dew point | 59F | 15C |
| Minimum dew point | 41.9F | 5.5C |
| Dew point depression at 60 percent R.H., 20C | 14.3F | 8.5C |


UNDERSTANDING DATA CENTER TEMPERATURE GUIDELINES

BY JULIUS NEUDORFER - MARCH 15, 2016


It is important to note that while they are closely followed by the industry, the TC9.9 data center temperature
guidelines are only recommendations for the environmental operating ranges inside the data center;
they are not a legal standard. ASHRAE also publishes many standards, such as 90.1, "Energy Standard
for Buildings Except Low-Rise Residential Buildings," which is used as a reference and has been adopted by
many state and local building departments. Prior to 2010, the 90.1 standard virtually exempted data
centers. In 2010 the revised 90.1 standard included and mandated highly prescriptive methodologies
for data center cooling systems. This concerned many data center designers and operators, especially
the Internet and social media sites which utilize a wide variety of leading-edge cooling systems
designed to minimize cooling energy. These designs broke with traditional data center cooling designs
and could potentially conflict with the prescriptive requirements of 90.1, thus limiting rapidly
developing innovations in the more advanced data center designs. We will examine 90.1 and 90.4 in
more detail in the Standards section.


This article is the second in a series on data center cooling taken from the Data Center Frontier
Special Report on Data Center Cooling Standards (Getting Ready for Revisions to the ASHRAE
Standard)

Power Usage Effectiveness (PUE)


While the original version of the PUE metric became well known, it was criticized by some because
power (kW) is an instantaneous measurement at a point in time, and some facilities claimed very
low PUEs based on a power measurement made during the coldest day, which minimized cooling
energy. In 2011 it was updated to PUE version 2, which focuses on annualized energy rather than
power.

The revised 2011 version is also recognized by ASHRAE, as well as the US EPA and DOE, became part
of the basis of the Energy Star program, and has become a globally accepted metric. It defined four
PUE categories (PUE0-3) and three specific points of measurement. Many data centers do not have
energy meters at the specified points of measurement. To address this issue, PUE0 was still based on
power, but required the highest power draw, typically during the warmer weather (highest PUE),
rather than a best-case, cold-weather measurement, to negate incorrect PUE claims. The next three
PUE categories were based on annualized energy (kWh). In particular, PUE Category 1 (PUE1)
specified the output of the UPS and was the most widely used point of measurement. The points of
measurement for PUE2 (PDU output) and PUE3 (at the IT cabinet) represented more accurate
measurement methods of the actual IT loads, but were harder and more expensive to implement.
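
As a minimal illustration of the metric itself (the function and the figures below are ours, not from The Green Grid):

```python
# Minimal sketch of the PUE calculation described above, using
# annualized energy (kWh) as in PUE version 2; numbers are illustrative.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (always >= 1.0)."""
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 5,000,000 kWh/yr overall, with IT gear (measured at
# the UPS output, i.e. PUE Category 1) consuming 3,200,000 kWh/yr:
print(round(pue(5_000_000, 3_200_000), 2))  # -> 1.56
```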

The Green Grid clearly stated that the PUE metric was not intended to compare data centers; it was
meant only as a method to baseline and track changes to help data centers improve
their own efficiency. The use of a mandatory PUE for compliance purposes in the 90.1-2013 building
standard, and in the proposed ASHRAE 90.4 Data Center Energy Efficiency standard, was in conflict with
its intended purpose. The issue is discussed in more detail in the section on ASHRAE standards.

You can see Data Center Frontier's Rich Miller, Compass Datacenters' Chris Crosby, DLB Associates'
Don Beaty and moderator Yevgeniy Sverdlik of Data Center Knowledge in a lively discussion about
the warming of the data center, including how we got here, the impact of current acceptable and
allowable temperature ranges on operations and performance, and the controversy surrounding the
proposed 90.4 standard, at Data Center World on March 17, 2016.

Understanding Temperature References


In order to discuss evolving operating temperatures, it is important to examine the differences between
dry bulb, wet bulb and dew point temperatures.

Dry Bulb
This is the most commonly used type of thermometer referenced in the specification of IT equipment
operating ranges. Readings from a dry bulb thermometer (analog or digital) are unaffected by the
humidity level of the air.

Wet Bulb
In contrast, there is also a wet bulb thermometer, wherein the bulb (or sensing element) is
covered with a water-saturated material such as cotton wick and a standardized velocity of air flows
past it to cause evaporation, cooling the thermometer bulb (a device known as a sling psychrometer).
The rate of evaporation and related cooling effect is directly affected by the moisture content of the
air. As a result, at 100% RH the air is saturated and the water in the wick will not evaporate and will
equal the reading of a dry bulb thermometer. However, at lower humidity levels, the dryer the air,
the faster the moisture in the wick will evaporate, causing a lower reading by the wet bulb
thermometer, when compared to a dry bulb thermometer. Wet bulb temperatures are commonly
used as a reference for calculating the cooling units capacity (related to latent heat load. i.e.
condensation- see Dew Point below), while dry bulb temperatures are used to specify sensible
cooling capacity. Wet bulb temperatures are also used to project the performance of the external
heat rejection systems, such as evaporative cooling towers, or adiabatic cooling systems. However,
for non-evaporative systems, such as fluid coolers or refrigerant condensers, dry bulb temperatures
are used.
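
For readers who want an approximate conversion without a psychrometer, Stull's (2011) published empirical formula estimates wet bulb temperature from dry bulb temperature and RH; a hedged sketch, valid roughly for RH between 5% and 99% and temperatures of about -20C to 50C:

```python
# Hedged sketch: Stull's (2011) empirical wet bulb approximation from
# dry bulb temperature (C) and relative humidity (%). Not a substitute
# for an actual psychrometer reading.
import math

def wet_bulb_c(temp_c: float, rh_percent: float) -> float:
    t, rh = temp_c, rh_percent
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh) - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

# At 100% RH the wet bulb matches the dry bulb; drier air reads lower,
# exactly as described above.
print(round(wet_bulb_c(25, 100), 1))  # -> ~25.0
print(round(wet_bulb_c(25, 40), 1))   # -> ~16.4
```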

Dew Point
Dew point temperature represents the point at which water vapor has reached the saturation point
(100% RH). This temperature varies, and its effect can commonly be seen when condensation forms
on an object that is colder than the dew point. This is an obvious concern for IT equipment. When
reviewing common IT equipment operating specifications, note that the humidity range
is specified as "non-condensing."
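
A hedged sketch of estimating the dew point from dry bulb temperature and RH, using the common Magnus approximation (the coefficients are one published parameter set, chosen here purely for illustration):

```python
# Hedged sketch: dew point from temperature (C) and RH (%) via the
# Magnus approximation (17.62 / 243.12 coefficient set); illustrative only.
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    gamma = math.log(rh_percent / 100.0) + 17.62 * temp_c / (243.12 + temp_c)
    return 243.12 * gamma / (17.62 - gamma)

# Condensation risk: any surface colder than the dew point will sweat.
# Air at 27C / 60% RH has a dew point near 18.6C.
print(round(dew_point_c(27.0, 60.0), 1))  # -> 18.6
```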

Dew point considerations are also important for addressing and minimizing latent heat loads on cooling
systems, such as the typical CRAC/CRAH unit whose cooling coil operates below the dew point and
therefore inherently dehumidifies while cooling (latent cooling requiring energy). This in turn requires
the humidification system to use more energy to replace the moisture removed by the cooling coil.
Newer cooling systems can avoid this double-sided waste of energy by implementing dew point control.

Recommended vs Allowable Temperatures


As of 2011, the recommended temperature range remained unchanged at 64.4-80.6F (18-27C).
While the new A1-A2 allowable ranges surprised many IT and Facility personnel, it was the upper
ranges of the A3 and A4 temperatures that really shocked the industry.

While meant to provide more information and options, the new expanded allowable data center
classes significantly complicated the decision process for the data center operator when trying to
balance the need to optimize efficiency, reduce total cost of ownership, address reliability issues, and
improve performance.

Temperature Measurements: Room vs IT Inlet


As indicated in the summary of the Thermal Guidelines, the temperature of the room was originally
used as the basis for measurement. However, room temperatures were never truly meaningful,
since temperatures could vary greatly in different areas across the whitespace. Fortunately, in
2008, there was an important but often overlooked change in where the temperature was measured.
The 2nd edition referenced the temperature of the air entering the IT equipment. This highlighted the
need to understand and address airflow management issues in response to higher IT equipment
power densities, and the recommendation of the Cold-Aisle / Hot-Aisle cabinet layout.

The 2012 guidelines added recommendations for the locations for monitoring
temperatures in the cold aisle. These covered placing sensors inside the face of the cabinet, and
the position and number of sensors per cabinet (depending on the power density of the cabinets and
IT equipment). While this provided better guidance on where to monitor temperatures, very few
facility managers had temperature monitoring in the cold aisles, much less inside the racks. Moreover,
it did not directly address how to control the intake temperatures of the IT hardware.

ASHRAE vs NEBS Environmental Specifications


Although the ASHRAE Thermal Guidelines are well known in the data center, the telecommunications
industry created environmental parameters long before TC9.9 released the first edition in 2004. The
NEBS* environmental specifications provide a set of physical, environmental, and electrical
requirements for local exchanges of telephone system carriers. The NEBS specifications have evolved
and been revised many times, and their ownership has changed as telecommunications companies
reorganized. Nonetheless, NEBS and its predecessors effectively defined the standards for ensuring
reliable equipment operation of the US telephone system for over a hundred years.

In fact, NEBS is referenced in the ASHRAE Thermal Guidelines. The NEBS recommended
temperature range of 64.4-80.6F (18-27C) existed well before the original TC9.9 guidelines, but it
was not until 2008, in the 2nd edition, that the Thermal Guidelines were expanded to the same
values. More interestingly, in 2011 the new TC9.9 A3 specifications matched the long-standing
NEBS allowable temperature range of 41-104F. However, it is the NEBS allowable humidity range that
would shock most data center operators: 5%-85% RH. The related note in the ASHRAE Thermal
Guidelines states: "Generally accepted telecom practice; the major regional service providers have
shut down almost all humidification based on Telcordia research."
REVISED LOW HUMIDITY RANGES AND RISK OF STATIC DISCHARGE

In 2015 TC9.9 completed a study of the risk of electrostatic discharge (ESD) and discovered that
lower humidity did not significantly increase the risk of damage from ESD, as long as proper
grounding was used when servicing IT equipment. It is expected that the 2016 edition of the Thermal
Guidelines will expand the allowable low humidity level down to 8% RH. This will allow substantial
energy savings by avoiding the need to use humidification systems to raise humidity unnecessarily.

*NEBS Footnote: NEBS (previously known as Network Equipment-Building System) is currently owned
and maintained by Telcordia, which was formerly known as Bell Communications Research, Inc., or
Bellcore. It was the telecommunications research and development company created as part of the
break-up of the American Telephone and Telegraph Company (AT&T).

Next week we will explore controlling supply and IT air intake temperatures. If you prefer you can
download the Data Center Frontier Special Report on Data Center Cooling Standards in PDF format
from the Data Center Frontier White Paper Library courtesy of Compass Data Centers. Click here for
a copy of the report.

Most equipment is rated for a wide range of humidity (5 to 95% non-condensing, for instance).

However, what is the ideal humidity? Higher humidity carries heat away from equipment a little
better, but may also be more corrosive, for instance.

I've always heard 40%, though I can't back that up. I will say though that you need some humidity
to reduce static electricity build up.

EDIT:

Ah, I found my documentation, good old Sun Microsystems Part No. 805-5863-13, "Sun Microsystems
Data Center Site Planning Guide: Data Centers Best Practices"

Temperature and relative humidity conditions should be maintained at levels that allow for the
greatest operational buffer in case of environmental support equipment down-time. The goal levels
for the computer room should be determined in a manner that will achieve the greatest operational
buffer and the least possibility of negative influence. The specific hardware design, room
configuration, environmental support equipment design and other influencing factors should be taken
into consideration when determining the specific relative humidity control appropriate for a particular
room. Psychrometrics can affect hardware through thermal influences, Electrostatic Discharge (ESD),
and increases in environmental corrosivity.

And:

Under most circumstances, air conditioners should be set at 72 F (22 C) with a sensitivity range of
+/- 2 F (+/-1 C). Humidifiers, in most cases, should be set at 48% RH with a sensitivity range of
+/- 3% RH. The set-points of the air conditioners should always be chosen in an effort to maintain
the optimal recommended temperature and relative humidity levels for the room environment. These
set points should maintain appropriate conditions, while allowing wide enough sensitivity ranges to
help avoid frequent cycling of the units. While these tight ranges would be difficult to maintain in a
loosely controlled office environment, they should be easily attained in a controlled data center.

Numerous factors, such as heat-load and vapor barrier integrity, will influence the actual set-points. If
the room lacks adequate vapor barrier protection, for instance, it may be necessary to adjust
humidifier set points to accommodate seasonal influences. Ideally, all inappropriate influences on the
data center environment will be eliminated, but in the event that they are not, minor adjustments,
made by trained personnel, can help alleviate their effects on the environment.
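
As an illustration of how a set point with a sensitivity range avoids frequent cycling, here is a minimal sketch. The 72F set point and +/- 2F band come from the Sun guidance quoted above; the control logic itself is our own simplification:

```python
# Hedged sketch of the set-point + sensitivity-range idea above: a simple
# deadband keeps a cooling unit from rapidly cycling on/off around 72F.
SET_POINT_F = 72.0
DEADBAND_F = 2.0  # the +/- 2F sensitivity range quoted above

def should_cool(temp_f: float, currently_cooling: bool) -> bool:
    """Start cooling above set point + band; stop below set point - band."""
    if temp_f >= SET_POINT_F + DEADBAND_F:
        return True
    if temp_f <= SET_POINT_F - DEADBAND_F:
        return False
    return currently_cooling  # inside the deadband: keep the current state

print(should_cool(74.5, False))  # True  -> starts cooling
print(should_cool(71.0, True))   # True  -> keeps cooling until 70F is reached
```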

And on Electrostatic Discharge:

The maintenance of appropriate relative humidity levels is probably the most universal and easiest
means of addressing ESD concerns. Appropriate moisture levels will help ease the dissipation of
charges, lessening the likelihood of catastrophic failures. The following chart illustrates the effect
moisture levels can have on electrostatic charge generation. (Source: Simco, "A Basic Guide to
an ESD Control Program for Electronics Manufacturers")

TABLE 6-3 Electrostatic Voltage At Workstations

| Means Of Static Generation | Static Voltage at 10-20% RH | Static Voltage at 65-90% RH |
| Walking across carpet | 35,000 | 1,500 |
| Walking over vinyl floor | 12,000 | 250 |
| Worker at bench | 6,000 | 100 |
| Vinyl envelopes for work instructions | 7,000 | 600 |
| Common poly bag picked up from bench | 20,000 | 1,200 |
| Work chair padded with urethane foam | 18,000 | 1,500 |


AN UPDATED LOOK AT RECOMMENDED DATA CENTER TEMPERATURE AND HUMIDITY

In late 2005, we published an article outlining recommended data center temperature and
humidity levels. Over the years, that article has been one of our most popular pages and has
helped thousands of customers establish, monitor and maintain appropriate data center environment
conditions. Since that article was written, ASHRAE has updated its recommendations for data center
temperature and humidity.

Given the newly updated and revised ranges, we wanted to update our recommendations to reflect
the temperature and humidity guidelines you should be following and monitoring in your data
centers. Room Alert provides multiple ways to proactively monitor these environment conditions to
make sure your most important resources and assets are always protected. Gartner Research
estimates the average cost of downtime at around $5,600 per minute; that's $336,000 per hour.
Don't allow temperature, humidity or other environment conditions to cost your company hundreds
of thousands of dollars in lost revenue!

Recommended Computer Room Temperature

Server rooms and data centers contain a mix of both hot and cool air: server fans push out hot air
while running, while air conditioning and other cooling systems bring in cool air to counteract all the
hot exhaust air. Maintaining the right balance between hot and cool air has always been foremost in
maintaining data center uptime. If a data center gets too hot, equipment runs a higher risk of failure.
That failure often leads to downtime, lost data, and lost revenue.

When our article was first published in 2005, the recommended data center temperature range was
68 to 75F (20 to 24C). This is the range that the American Society of Heating, Refrigerating and
Air-Conditioning Engineers (ASHRAE) advised was optimal for maximum uptime and hardware life.
This range allowed for best usage and provided enough buffer room in the event of an air
conditioning failure.

Since 2005, newer standards and better equipment have become available, as have improved
tolerances for higher temperature ranges. ASHRAE has in fact now recommended an acceptable
operating temperature range of 64 to 81F (18 to 27C).

Keeping in line with those revised temperatures, server manufacturer Dell states that the
temperature sweet spot for its servers is 80F. Higher allowable temperatures mean data
centers and companies with dedicated server rooms don't need to cool them as much as they used
to. This helps conserve both power and money in many instances.

However, it's important to keep in mind that higher standard operating temperatures mean less time
to react to rapidly escalating temperatures if a cooling unit breaks down. A data center filled with
servers operating at higher temperatures runs the risk of quickly hitting hardware failure in that
instance. These newer ASHRAE guidelines make it even more crucial that any data center or
business with a server room proactively monitors its environment conditions. The higher the
temperatures, the greater the risk of server failure and data loss, and the more important it is to have
proactive monitoring equipment in place that can quickly notify you when an environment issue occurs.

Room Alert helps provide that proactive monitoring by sending alerts immediately when a
temperature or other environment threshold is reached. We recommend placing Room Alert at the
top of each server rack for optimal temperature readings. Digital temperature sensors should be
placed every 25 feet (closer if appliances such as blade servers, which generate more heat, are
used). By proactively monitoring your environment, any critical temperature threshold that is
breached will result in immediate notification. This will allow you to address the temperature issue as
soon as possible.

Recommended Computer Room Humidity

To quote from our original article: "Relative humidity (RH) is defined as the amount of moisture in the
air at a given temperature in relation to the maximum amount of moisture the air could hold at the
same temperature. In a data center or computer room, maintaining ambient relative humidity levels
between 45% and 55% is recommended for optimal performance and reliability."

ASHRAE's newer 2016 guidelines for data center humidity remain largely the same, with a
recommendation of around 50% relative humidity. The minimum is 20%, while the maximum is 80%.

Ambient cooling always creates humidity in the air of a data center; it's very important to make sure
the humidity stays in the recommended range. If humidity is too low, the dry air will lead to
electrostatic discharge (ESD), which can damage critical server components. Too much humidity will
cause condensation, leading to hardware corrosion and equipment failure.

Our recommendations on humidity alerts remain the same as they were in 2005. Early warning
thresholds of 40% and 60% relative humidity should trigger alerts. Critical alerts should be sent if
relative humidity reaches either 30% or 70%.

Our Digital Temperature & Humidity Sensor easily monitors both temperature and humidity with
superior accuracy; these sensors are available in 25-, 50- and 100-foot cable lengths and are
compatible with all Room Alert models.

It is absolutely critical that data center temperature and humidity are monitored closely to provide for
both maximum uptime and maximum hardware life. If your company is not proactively monitoring
both temperature and humidity in your data center or server room, please contact us today. One of
our dedicated product specialists can help determine which Room Alert would be best for you. With
nearly 30 years of experience, and products in 181 countries, we will help make sure your data
center and critical facilities are protected against negative environment factors.

Contact us today at 401.628.1600, toll free 888.220.6700, by email at Sales@AVTECH.com, or
through our online chat. Don't wait until it's too late! Remember, Proactive Monitoring is always
better than Disaster Recovery.
