Kuro5hin.org: technology and culture, from the trenches

Proper Datacenter Airflow...

By AnalogBoy in Technology
Thu Jul 18, 2002 at 11:34:21 AM EST
Tags: Help! (Ask Kuro5hin) (all tags)

My company has just embarked on building a larger datacenter.  One of the problems we're about to face is airflow.  In the past, in our server room, airflow and cooling have been major issues.

Most of our racks have intake fans at the top (okay, I guess, since we have high-intensity AC at the top of the room) and shelves on which we have set deskside systems (Sun 250s, 10s, 30s, 450s, old ProLiant servers, etc.).   There are no perforations on the doors - the machines are effectively sitting in an isolated heat bath.

The only relief is the air coming in from the top of these racks, which is, of course, less than efficient, as this colder air is usually warm by the time it reaches (if it ever properly reaches) the bottom of the rack.  On my racks, I have removed the doors and directed one of our portable AC/fan units to blow squarely down the center of the row, about 2 feet from the face of the servers, giving each server, as far as I can tell, access to the stream of cold air.

Unfortunately, as this hot air is expelled through the back of one row of racks, the other racks, which are about 4 feet away, get a lot of the "processed" hot air from the first row.   Most of these portable AC units blow air directly at one set of servers - usually at the back of said servers.  (Hey, I didn't design it.)

What I'd like to know is what the community's best experiences have been in cooling datacenters on a budget and with limited space, addressing concerns such as:

  1. static buildup due to airflow
  2. condensation
  3. appropriate ambient and component temperatures
  4. server spacing for proper airflow & thermal transfer (blades and high-density servers, especially)
  5. is it better to cool the ambient room temperature, or the servers, with direct airflow?
  6. is airflow from the ceiling, or forced under the floor, better?
  7. given the two options, should the rack flow from the bottom up or top down?

This is as much for my education as anything else.




Best cooling..
o From top rack fans to floor 2%
o AC forced from floor into rack 73%
o AC in Front to back (duh) 23%

Votes: 42
Results | Other Polls

Proper Datacenter Airflow... | 69 comments (49 topical, 20 editorial, 1 hidden)
There are airconditioning specialists (4.53 / 15) (#3)
by djmann88 on Wed Jul 17, 2002 at 05:27:48 PM EST

There are industrial air-conditioning specialists (qualified mechanical engineers) who would design a system for between $3,000 and $15,000 (the high end being for around 10-20 kW of heat load).
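For readers sizing this themselves, a heat load quoted in kW converts directly to the BTU/hr figures most A/C gear is rated in. A quick sketch of the arithmetic (the 10-20 kW range is from the comment above; the conversion factor is standard):

```python
# Rough unit conversion: A/C equipment is usually rated in BTU/hr,
# while IT heat load is usually quoted in kW. 1 kW ~= 3412 BTU/hr.

BTU_PER_KW = 3412  # approximate conversion factor

def kw_to_btu_hr(kw):
    """Convert a heat load in kilowatts to BTU per hour."""
    return kw * BTU_PER_KW

# The 10-20 kW range quoted above:
for kw in (10, 20):
    print(f"{kw} kW ~= {kw_to_btu_hr(kw):,} BTU/hr")
```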

This is a well established and understood field, and as an electrical engineer or computer scientist, you can't really be expected to understand the design issues involved. (No offense intended, but after all, you didn't study mechanical engineering, did you?)

The main issue is piping heat out of the datacenter; ideally your ambient air temperature should be around 5C/41F, which is relatively easy to achieve. Air conditioning should remove most of the moisture from the air, and normally no static buildup should occur unless something is seriously amiss.

As heat rises, flow should be from the bottom of a rack upwards. Ambient cooling is far more effective, reliable, and cheaper, while fans provide no real benefit but only serve to circulate air (unless one is on a heat sink). It is easy to set up air currents in a room with fans, etc.

I can't really comment more without seeing the room, sorry.

Raised Floor (4.22 / 9) (#7)
by Bad Harmony on Wed Jul 17, 2002 at 05:45:46 PM EST

Every place that I have worked at has used a raised floor for rooms with high equipment density. Chilled air is used to pressurise the space under the floor tiles. The air flows up through a set of louvers at the bottom of the rack.

5440' or Fight!

Air Flow...done this before (4.72 / 18) (#10)
by libertine on Wed Jul 17, 2002 at 05:47:56 PM EST

After working for a data center company, and working for a company who had several sites at different DCs, we found out several things:

Datacenters with low ceilings are a bad thing.  Basically, no matter how much cooling you have in a crowded DC, there will always be a thermal layer, depending on how many boxes are on the floor.  There should be a 13 foot clearance or so between the tops of the racks and the ceiling.  This allows for large amounts of heat dissipation which cannot simply be managed by forced air cooling in a shared DC environment.  It can work in a privately owned environment, but that is because you don't make your pay based on how many racks you can squeeze in there (the data center companies do make their pay based on this).

Make sure that you will get what you need from a DC, and visit during the hottest day of the year at about 2pm - insist on going to the building where your equipment WILL be housed, and not just to their showroom site.  Take an IR temp gauge with you, if you have one (but ALWAYS ask for theirs after you arrive; if the NOC doesn't have one handy, that is a bad sign).  Ask for a step ladder, and use your hand to check the temp about 7 to 9 feet up.  This is where some of your boxes will be housed in a rack mount, so you should know how hot it is up there.  If their sales staff don't help you in this, then leave.

Cold air flow needs to come out beneath this thermal layer.  If it does not, you get a lot of warm pockets and other areas that don't cool properly.  That being said, raised tile cooling in a shared DC environment is a big fat waste of time.  Raised floors are useful if you can get under all the tiles and make sure dust bunnies and other crap aren't building up down there.  On a shared DC floor, people put all kinds of shit under the tiles (soda cans, old food, you wouldn't believe what else) and that blows everywhere with the forced air from underneath; nobody can ever get under all of them for cleaning, so forced air in that situation is of limited value.  Forced air from below is really only useful for EMCs and the like, since they are designed to optimize for a forced-air-from-the-floor environment.

Even if you do find a place that meets your cooling needs, you may need some floor fans that will push some of the air where you need it.  This is because air flow from cages near you will affect your setup, even if it was done right.  There will almost always be a hot machine somewhere in your setup.  Or, worse, if you get a new neighbor, and they are a major ISP- then their  system fans might be blowing into your cage, and it will take about a month to get your new A/C issue worked out (once you discover it).

Now, back to my weekend away from my mindless non-engineering job - I have orcs to go kill.

"Live for lust. Lust for life."

Ack! More to add...I misread article. (4.83 / 6) (#15)
by libertine on Wed Jul 17, 2002 at 07:01:26 PM EST

You are building your own DC.  My bad.

Ok, if you are working in your own private environment, I would say that force air from the floor is a good idea if you are using closed cases.  That is what they are designed for.

However, I tend to really dislike closed cases.  This is a personal preference, but it comes down to how difficult it can be to maneuver around inside a closed case when systems work needs to get done.  Another thing is that most of the systems you mention seem to use back venting for their own cooling needs.  Having open racks makes heat dissipation less of an issue, provided that you can get the right kind of air flow.  Again, this is probably a matter of preference and is probably going to be based on 1) what your A/C consultant says and 2) how often you need to perform maintenance on systems.

Ok, now I really am off to whack ogres.

"Live for lust. Lust for life."
[ Parent ]

Datacentre airflow (5.00 / 8) (#22)
by Maclir on Wed Jul 17, 2002 at 09:35:55 PM EST

Some suggestions:
  1. Most important - install a false floor.
  2. Have the A/C units pump the cool air to the underfloor areas. This means your A/C is not fighting the natural convection air flow.
  3. Make sure the cables running in the underfloor area do not block airflow to each rack.
  4. Remove the tile(s) under each rack.
  5. Make sure there are extraction fans at the top of each rack.
  6. Make sure there is sufficient space between the top of the racks and the ceiling, to allow the hot air to flow back to the A/C units.
  7. Get BTU ratings for all of your equipment. Add this up. Make sure the A/C has adequate cooling to handle all the equipment with capacity to spare.
  8. Go for multiple lower rated units rather than a single big mofo A/C. Redundancy.
  9. If you have installed a UPS and battery backup, ask yourself - if the mains power dies, what happens to the A/C?
  10. Fire protection - all A/C units should have an emergency trip on their supply breakers to shut them down when the fire detectors go off. You should also kill power to all computing equipment as well.
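Item 7 above is simple arithmetic, but it is worth sketching. The ratings below are invented placeholders, not figures from the thread; substitute your own vendors' nameplate numbers:

```python
# Sketch of item 7: sum the nameplate heat output of everything in the room,
# then size the A/C with spare capacity. All ratings here are made-up
# placeholders for illustration.

equipment_btu_hr = {
    "rack 1 (Sun gear)": 18000,
    "rack 2 (ProLiants)": 15000,
    "network equipment": 6000,
}

HEADROOM = 1.25  # 25% spare capacity for growth and hot days

total = sum(equipment_btu_hr.values())
required = total * HEADROOM

print(f"Total heat load: {total:,} BTU/hr")
print(f"A/C capacity to install: {required:,.0f} BTU/hr "
      f"(~{required / 12000:.1f} tons)")
```

Dividing by 12,000 at the end converts BTU/hr into the "tons of cooling" that A/C units are commonly sold by.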

Perfect (3.00 / 1) (#28)
by MicroBerto on Thu Jul 18, 2002 at 09:13:04 AM EST

This is all you need to know. I am in a billion dollar data center as I type this, and trust me, there is a ton of machinery in here, and it is very cold.

While you have a false floor and air conditioning coming from under it, you might as well also add a CO2 system for fire protection. It might save you millions and millions of dollars if it can put out a fire very quickly.

We also have power coming from 2 different power providers, and tons of backup anyway. If we go down, this business stops and we lose millions of dollars in no time at all.

Also, I don't know what is meant by "remove the tiles below the rack" - then the rack will fall into the true floor! :) But perhaps have tiles with many medium-sized holes so that you can pump that air up into it.

- GAIM: MicroBerto
Bertoline - My comic strip
[ Parent ]

Data Center A/C (4.66 / 3) (#33)
by Maclir on Thu Jul 18, 2002 at 10:21:49 AM EST

A clarification - the "remove the tiles below the rack" meant that there should be a large cutout corresponding to almost the size of the rack - so that there is nothing blocking the flow of cold air into the rack. Of course, the edges of the rack still have to sit on tiles.

I would use a water sprinkler system rather than CO2. "What?", you say, "are you on crack?" Water and electrical equipment is bad news! Here are several reasons:

  1. CO2 is under great pressure when it is in the bottles, and when released, is VERY COLD. Imagine the heat stress on warm equipment when a gas at about -40C hits it. Expect many failures, particularly of disks.
  2. Spare a thought for the people in the room - CO2 is not breathable (though not as bad as the now outlawed Halon). While water will get your people wet, they can still be in the room, maybe using handheld extinguishers on the seat of the fire, and so on.

To prevent the unfortunate mixing of water and electricity, the fire alarm system (which detects a fire and any flow of water in the sprinkler system) must have a shutdown control that automatically kills all power to the room. Lights, A/C, equipment. So also have emergency lighting.

Finally, an actual report on water on computers. In the late 80's and early 90's, I worked for IBM in Australia. Our branch office had a demonstration area on the first floor, containing an AS/400, several printers and terminals, PCs and so on. The A/C unit that provided supplemental cooling to that equipment was in the ceiling space directly above the AS/400. One afternoon a cooling water pipe broke, and gallons and gallons of water with green coolant additives flooded the area - including a rack-mounted AS/400. The equipment was powered on and operating when it was flooded. 24 hours later, after a thorough cleaning by the engineers, everything was powered on and came up faultlessly.

[ Parent ]

what! HALON is outlawed??? <N/T> (none / 0) (#37)
by techwolf on Thu Jul 18, 2002 at 12:44:10 PM EST

"The strongest reason for the people to retain the right to keep and bear arms is, as a last resort, to protect themselves against tyranny in government." - Thomas Jefferson
[ Parent ]
Halon (none / 0) (#38)
by questionlp on Thu Jul 18, 2002 at 01:03:38 PM EST

IIRC - existing halon installations are still "legal"... I think it's new installations of halon that are not allowed. I could be wrong.
-- http://closedsrc.org
[ Parent ]
By existing do you mean (none / 0) (#45)
by squinky on Thu Jul 18, 2002 at 02:47:38 PM EST

the Bradley fighting vehicle?

That's where I first heard of the dangers of Halon. Bunch of guys-- trapped in a metal box full of halon...

Apparently it's still in there, but I couldn't find a date for this doc

[ Parent ]
Halon (3.00 / 2) (#39)
by Maclir on Thu Jul 18, 2002 at 01:21:07 PM EST

Certainly, in Australia, the use of Halon gas as a fire fighting system was banned at least 10 or 12 years ago. While existing installations were permitted to remain, I understand such gas systems do not stay effective forever.

The main reason for the ban was the well documented adverse effect CFCs have on the ozone layer. However, Halon is VERY unfriendly to people - one large data centre had a big halon flooding system, with big override buttons at the system consoles and the exit doors. The senior operators / shift supervisors were to hit the halon override to stop the gas release (alarms would go off for 15 seconds before release), then once everyone was out of the room, hit the override again at the exit doors to start the gas release.

I witnessed another test of a big halon system at a large data centre in the mid-80's - we watched from outside, through glass windows, as the supplier of the fire system hit the big red button. There was a huge explosive noise - and when the clouds of gas cleared, the computer room (thankfully, without any equipment in it yet) was destroyed - all the false floor tiles were scattered around the room, and the remnants of the suspended false ceiling tiles and lights were in twisted piles on the floor. Guess the pressure must have been a bit too much.

[ Parent ]

Halon. (none / 0) (#59)
by FeersumAsura on Fri Jul 19, 2002 at 11:54:38 AM EST

The control room at a Barrow power station uses Halon in its safety system. Everything else is CO2.

I'm so pre-emptive I'd nuke America to save time.
[ Parent ]
Yep. (none / 0) (#43)
by AnalogBoy on Thu Jul 18, 2002 at 02:28:27 PM EST

Halon is illegal in most, if not all, of the USA.  It may even be federal.    Most of the replacement compounds are fluoroiodocarbons - molecules containing fluorine, iodine, and carbon (sometimes hydrogen).
Save the environment, plant a Bush back in Texas.
Religous Tolerance (And click a banner while you're there)
[ Parent ]
I think i was wrong.. (none / 0) (#44)
by AnalogBoy on Thu Jul 18, 2002 at 02:31:09 PM EST

I think i don't know what i'm talking about and should shut up. :)

Save the environment, plant a Bush back in Texas.
Religous Tolerance (And click a banner while you're there)
[ Parent ]
Um...its not halon or CO2 anymore (none / 0) (#41)
by libertine on Thu Jul 18, 2002 at 02:17:34 PM EST

They actually use a modified form of propane (heptafluoropropane) now for this type of stuff.  It is called FM-200.

It just displaces enough O2 quickly enough to douse flames, while still leaving enough air to breathe.  I have personal experiences with what I just said, and it does work.  I and my friends are still among the living, so I guess it didn't kill us either.

A shot of that stuff, at the speed it comes out, will knock your glasses clean across the room and you on your ass.  But the fire will be gone, or at least at a manageable level for firefighters to deal with it when they arrive.

"Live for lust. Lust for life."
[ Parent ]

Firefighting (none / 0) (#57)
by Znork on Fri Jul 19, 2002 at 05:46:34 AM EST

Actually, we're looking at a mist/fog extinguishing system at the moment. It has the advantage of releasing far less water, and you won't get people killed by it like CO2.

[ Parent ]
Try a preaction system (none / 0) (#64)
by sowellfan on Tue Jul 23, 2002 at 12:56:36 AM EST

In my experience (as a mech. engineer designing A/C systems & a couple fire protection systems), FM-200 or the equivalent would probably be the best for putting out a fire and not damaging equipment. I think there is another gaseous fire suppression agent out there right now, but I forget the name. In both cases, though, I believe that once you have a discharge, it's expensive to replace the canisters, or to get them refilled. I also believe that the current gaseous fire extinguishing systems won't kill people.

If you must go with water sprinklers, try a preaction system for your data center. It might also be called a double-action preaction system (haven't messed with this in a while, and I'm at home, where I don't have my catalogs). Essentially, in a typical sprinkler system, if one sprinkler goes, sprinkling begins (even if it's some disgruntled worker with a barbecue lighter). In a preaction system, there is a valve between the main line and the sprinklers. The pipe that goes to the sprinklers in the data center would be pressurized by gas. If the fusible link on a sprinkler lets go, the gas is released, and the preaction system control box detects this. After this, I think a second type of smoke/fire detector has to alarm before the valve opens up, sending water to the open sprinkler. Essentially it requires that two separate events happen before water starts to spray. In fact, there may even be options for systems where two fire/smoke alarms located in separate areas have to alarm before the valve will open. Typically, all sprinklers will have a fusible link, and only those sprinklers that get hot enough to melt that link will spray water (there are systems where the sprinklers are open, and when the valve upstream opens, they all spray...this is called a deluge system).

Secondly, if you go with underfloor air distribution, the cables will need to be plenum rated, I believe. This essentially means that, if there is a fire in the area under the floor (called a supply plenum), the cables won't release toxic gases as they start to melt and burn.

[ Parent ]

Safety? (none / 0) (#68)
by Znork on Wed Jul 31, 2002 at 09:24:47 AM EST

Is the air in a room being extinguished with FM-200 breathable? We've got CO2 extinguishing in one room, but, well, while CO2 won't kill you, the lack of oxygen will. If it's a real fire you'll probably be out the door before the system goes off, but if it's a system malfunction... which is why we have to turn the CO2 system off before working below the floor there. How long can you survive in an FM-200 based extinguishing?

For the sprinkler systems, double-action is indeed a good idea, and we have that. But, well, major amounts of water aren't a good idea with computer systems, and reducing it with fog based extinguishing would probably be better.

[ Parent ]

I believe that FM200 is not life-threatening... (none / 0) (#69)
by sowellfan on Thu Aug 01, 2002 at 12:57:41 AM EST

Ok, my experience is limited, but I believe that FM200 can discharge into an occupied space. At least, that is what the http://www.fm200.com/ website says. They also have the MSDS sheets there.

[ Parent ]
Warm Datacenter (4.00 / 1) (#40)
by xrayspx on Thu Jul 18, 2002 at 01:37:45 PM EST

Lately when I'm in our hosting facility (Exodus BOS2), which is way too often, I'm noticing that it is being run MUCH MUCH warmer than it was pre-C&W. This is not my imagination.

I will tend to do very long projects in our cage, and in the past it was so cold that I couldn't even type well after a couple of hours. Uncomfortably cold. Now, I'm in there with shorts and a t-shirt for as long as I like, no problem.

Since Exodus is the first (and only that I know of) company to SPEND a billion on a datacenter, I imagine you might have some knowledge as to the current warming going on in the DCs?

I still think it's "cold enough", at least in our sparsely populated area, but some cages, VA Linux as an example, are belching out some damn hot air, and it's very very warm in their vicinity. A couple of others have just huge storage arrays that are puking lots of heat out.

I assumed it was due to the fact that the datacenter is getting less populated and fast. But with all the web-ops leaving, I'm seeing a lot of non-web related stuff going in.

"I see one maggot, it all gets thrown away" -- My Wife
[ Parent ]
I made the Exodus from, uh....yeah. (5.00 / 3) (#46)
by libertine on Thu Jul 18, 2002 at 02:52:03 PM EST

I used to be employee one-hundred-something there, when they were based only in Santa Clara, their main customer was some guy with a pile of 7 PCs trying to start a company called Hotmail, and their only datacenter was twice the size of my front room.  heheheh.

You might want to read my earlier comment about shared DC space.  Nothing changes your mind about the value of raised flooring like having an inch of water under your tiles.  And some datacenters run POWER under the flooring, rather than above.  Actually, both should probably be run above, which can be done easily.  Think about it - 4 hours with your head and back twisted and shoved under some tiles for maintenance, or sitting on a ladder.  I'll pick the ladder.

Another thing, in dealing with them (this is speaking as a one time customer now)- their company is ENTIRELY sales based, NOT service based.  This means that they will tell you anything to get you into the door, and their commitment to your happiness ends once they have your cash.  This happens again when contract comes up for renewal.  So, keep very reliable notes on temp and such.  Memorize your SLA, keep notes on it posted on your cube.  And get to know the DC manager personally- take him/her and the electrician's manager out to lunch regularly (they can get things done that sales may be too, um, laz^H^H^H^H, er, working to drive to completion on a different time table).

When you renegotiate your contract, be sure that you put an ambient temp into the contract that is really low, and means that they WILL owe you comp time if they don't meet it.  I did this with the last company I worked for, and we had about 6 months of free service (this also meant installing a bunch of networked thermometers, big whoop).  Every summer meant another 3 months of free service, and they had to pay a different customer to move their output fans away from the side of our cage in order to meet their SLA with us.  When we renegotiated, they lowered our rate BELOW what we were paying for pro-rated service, just so that they wouldn't have a black eye anymore on all of the cooling issues.  That is almost a year of free service in a year and a half, and a discount.  I love it.
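The networked-thermometer trick above boils down to checking every reading against the contractual limit and keeping the breaches as evidence. A minimal sketch, with invented sensor names and an invented 72F limit - use whatever your SLA actually specifies:

```python
# Minimal sketch of SLA temperature evidence-gathering: compare each sensor
# reading to the contractual ambient limit and keep the breaches as a paper
# trail for renegotiation. Sensor names and the limit are hypothetical.

SLA_LIMIT_F = 72.0  # invented contractual ambient limit

def sla_breaches(readings, limit=SLA_LIMIT_F):
    """Return the (sensor, temp_f) pairs that exceed the limit."""
    return [(sensor, temp) for sensor, temp in readings if temp > limit]

readings = [("cage-front", 68.5), ("cage-rear", 75.2), ("rack-top", 79.0)]
for sensor, temp in sla_breaches(readings):
    print(f"SLA breach: {sensor} at {temp}F")
```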

Why all that free service, rather than move another customer's equipment around, or get better A/C into our area?  Easy.  Sales.  Sales did not want to piss off the customer who had been moved in next to us, blowing the hot air, because that customer had not paid yet (actually, they NEVER paid, which we also used in negotiation in terms of calling them on their timing for fulfillment).  So no move.  Sales did not want to piss off the very huge content provider on the same grid as our A/C, who paid much more than we did, so it took them MONTHS to get around to even planning an A/C expansion, which would have meant a temporary outage for the large service provider.  Of course, that provider was in a different section of the building- their area was always icy cold.

"Live for lust. Lust for life."
[ Parent ]

Entry Into Exodus (4.00 / 1) (#52)
by xrayspx on Thu Jul 18, 2002 at 05:58:25 PM EST

Heheh, I was complaining the other day on our Intranet Operations Log site (scoop based, yay me), that it took me 40 minutes to get into Exodus, then in my tiredness digressed into a whole 'Enter into Exodus' paragraph.

The Temperature based SLA is a good plan, thanks. They have just started charging for server reboots, which had previously been covered in our agreement, among other things.

We also wanted memory upgrades to our F5 BigIPs, leased from Exodus. They didn't wanna touch them, so I ordered the RAM, did the install, to find that you have to enter the amount of RAM into sysctl in bytes to get it recognized, thanks BSDi. F5 didn't want to deal with me because I wasn't a customer, Exodus didn't do it because I broke terms of the lease that they told me to break... AGH. And I didn't wanna do it because I didn't want to screw it up and have it be on my head.

At least I got to see how big a POS those BigIPs really are inside. A $35,000 commodity PC, Asus mobo, $4 NICs, with a single IDE hard drive (WD no less), and a massive perl script. Yay.

"I see one maggot, it all gets thrown away" -- My Wife
[ Parent ]
Heheheheh.... (4.50 / 2) (#54)
by libertine on Thu Jul 18, 2002 at 06:31:14 PM EST

Yep.  Exodus all the way.  Just make sure you keep a record of everything they ask you to do - make sure they send an email, with their name, etc. on it.  Review EVERY SLA and contract you make with them, and find the loopholes - they will surely use them against you if they can.  Also, they have web-based ticket access... and some real clowns handling the ticket updates, so you can get even more grist for your mill from that.  Lastly, their bandwidth monitoring can be abused to your advantage, if you know what you are doing.

If you think F5s are crap, try tearing apart a Pix sometime.  You can make a faster and better firewall from OpenBSD and your desktop.  Their starter box is a 133 with a governored ISA card for the inbound interface.  No shit, seriously.  The only advantage to their setup is that they have worked out most of the MIBs and the console boot seq for you, and that the setup is diskless.  Not much for what you pay, really, since you could hire a consultant to handle the snmp related coding and build two boxes of your own for the same price (with PIIs or PIIIs, no less).

"Live for lust. Lust for life."
[ Parent ]

Plus they're heavy (none / 0) (#55)
by xrayspx on Thu Jul 18, 2002 at 06:41:14 PM EST

ouch .

I don't have any REAL problem with the F5's, and I like my PIX's, when they're nice to me.

"I see one maggot, it all gets thrown away" -- My Wife
[ Parent ]
pixs and local directors (none / 0) (#60)
by el_guapo on Fri Jul 19, 2002 at 02:19:08 PM EST

same thing - an intel pc with flash ram for a HDD. (we've actually turned them from one to the other here at work). you're paying for the software and the experience of cisco's security folk i think. i actually have a pix at home and i love it, a very overpriced p5-200MMX indeed, but i like the security it offers (and work paid for it :-). shit, even nokia's are intel boxen, they just really disguise it from the outside (ipso == some bsd variant iirc), and nokia's seem WAY more overpriced than pixs to me anyways...
mas cerveza, por favor mirrors, manifestos, etc.
[ Parent ]
#1 (4.00 / 1) (#29)
by duxup on Thu Jul 18, 2002 at 09:29:28 AM EST

Great list.

I'd just like to throw in my support for #1.  No large datacenter can operate properly without a raised floor.  It is also important to make sure that you can access all (or at least most) of that floor from directly above it.  Raised floors make life so much easier.  I wish my house had raised floors.

[ Parent ]

Yep. (3.66 / 3) (#30)
by jabber on Thu Jul 18, 2002 at 09:46:00 AM EST

And make sure you also have humidity control. You don't want condensation, but you also do not want the place too dry. Moisture will help control static.

[TINK5C] |"Is K5 my kapusta intellectual teddy bear?"| "Yes"
[ Parent ]

Go with Liebert units if possible (none / 0) (#65)
by sowellfan on Tue Jul 23, 2002 at 01:28:24 AM EST

Go to http://www.liebert.com

Liebert units are pretty much the premier units that I know of for high density computer applications. They can be had with all the options one could want for humidity control in a space (as someone mentioned before, getting a room too cold - and therefore too dry - could cause static electricity problems). I want to say that you want 45-55% relative humidity, but if you've got the room at 50 degrees F, even 90% relative humidity might be dry enough in absolute terms to cause static electricity problems. They can also be had specifically arranged to discharge air straight down into the underfloor plenum.
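The point that 90% RH at 50F is still "dry" air can be sanity-checked with the Magnus dew-point approximation. This is a rough sketch, not an HVAC design calculation:

```python
# Rough check of the humidity point above: relative humidity depends on
# temperature, so 90% RH in 50F (10C) air corresponds to less absolute
# moisture than 50% RH in 72F air. The Magnus formula approximates dew point,
# a temperature-independent proxy for absolute moisture content.
import math

def dew_point_c(temp_c, rh_percent):
    """Approximate dew point (Celsius) via the Magnus formula."""
    a, b = 17.625, 243.04  # common Magnus coefficients
    gamma = math.log(rh_percent / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

print(f"50F/90% RH -> dew point {dew_point_c(10.0, 90.0):.1f}C")  # ~8.4C
print(f"72F/50% RH -> dew point {dew_point_c(22.2, 50.0):.1f}C")  # ~11.3C
```

The higher dew point in the second case shows the warmer room at 50% RH actually holds more moisture than the cold room at 90% RH.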

The underfloor plenum idea is a good one from my perspective (I design A/C systems for a living, though I've not worked on any major data centers). As far as where to run your cabling, I guess that's a matter of preference. As I stated in another post, if you run any wiring in the supply plenum, it needs to be plenum rated.

The posters who suggested redundant systems had good points. If you have a continuous underfloor system (i.e. units don't have independent air channels under the floor), you could just have an extra air handler sitting around, hooked up and ready to go. Better yet, if you go this route you probably want to alternate their use, so that one doesn't just sit and corrode (or have their internal fluids separate, etc.). Only problem is, that option is expensive. One other option that you can do is to have multiple condensing units on the outside. Say, if you have a 10-ton air handler (with two refrigerant circuits), you have two 5-ton condensing units on the outside. Therefore, if one condensing unit dies, you at least still have half capacity. Of course, if the air handler dies, you're in bad shape, but the main operative parts of an air handler are the fan (if motor or belt goes out, replace it with a new one) and the evaporator coils (have a workman come in and patch it, hopefully).

One thing that you *must not ignore* is condensate drainage. Liebert units can be had with an optional water detector cable, I believe. This cable essentially is laid on the floor around the unit (or the subfloor at the bottom of the supply plenum), and if water gets on it, some sort of alarm sounds and the unit shuts down. Ideally, you'll have a primary condensate drain to some appropriate place, and you can have a secondary drain just in case the primary gets clogged. There are multiple options for doing this (if you've got questions, just reply to this post). The water detector acts as a failsafe, but it is very nice to have. An alternative to the secondary condensate drain, and possibly even the water detector cables, is a water detector that clips onto the edge of the drain pan, above the level where water should normally be. If the condensate drain gets clogged and starts to overflow, this thing trips, shuts off the unit, and has the added feature of a little wall-mounted labeled LED that tells you there is a condensate overflow.

Well, hope this is of some assistance. BTW, if none of these options work, you can always try

[ Parent ]

Addon... (5.00 / 2) (#42)
by dmalloc on Thu Jul 18, 2002 at 02:20:31 PM EST

> 1. Most important - install a false floor.
I can only agree with that. Installing a false floor and false ceiling is what we chose to do; it does help to coordinate and direct the airflow better. Furthermore, as mentioned in posts below, make sure that you have a high ceiling, because the warmer air layers can drift further upwards that way.

> 5. Make sure there are extraction fans at the top of each rack.

I really do not know how tight your budget is, but if it is at all possible, buy racks which can be closed. A rack that is closed and has powerful extraction fans at the very top creates the same effect a well-drawing chimney has. If the lower plate of the closed rack is missing, cool air from the ventilation below is drawn into the rack and the hot air is pushed upwards to be blown out of the rack by the extraction fans.

> 6. Make sure there is sufficient space between the top of the racks and the ceiling, to allow the hot air to flow back to the A/C units.

Once more, I do not know your budget, but if you can afford it and you are handy with tools, you can attach fixed tubes that lead up to the ceiling where your AC's main ducts might be running. Same thing as above: a chimney effect.

Hope this helps a bit as well.

[ Parent ]

number 11 (5.00 / 2) (#48)
by el_guapo on Thu Jul 18, 2002 at 04:39:28 PM EST

run your cabling overhead - you WILL eventually clog the underfloor with cables if the thing grows like ours does.
mas cerveza, por favor mirrors, manifestos, etc.
[ Parent ]
we had a tight budget (4.96 / 27) (#34)
by j1mmy on Thu Jul 18, 2002 at 11:00:05 AM EST

We had a really tight budget at our firm. Our data center had always been on the warm side, cooled by a number of floor fans scattered around the room. It worked, but could have been better.

We ordered a dozen new machines last month, all high-end multi-way Xeon boxes. It was going to take a while for our vendor to assemble and ship them, so we put off the cooling problem for a while. Then we started losing money.

The machines had been paid for. We were stuck with them. But the IT budget dried up almost overnight.

So here we were with a dozen beastly boxen that, based on our early tests, raised the temperature of our datacenter almost 15 degrees Fahrenheit. Resolved to get things down to a lower level than where they started, I started trolling junkyards for anything I could find. I ended up constructing a massive cooling system for the cost of renting a large truck to haul stuff around.

A local restaurant had dumped its old freezers, most of which were still somewhat operational. These puppies had heat exchangers larger than the most obese tech on my staff (and that's pretty big). We got four of them in total, almost 100 square feet of cooling area.

Other junkyards yielded a large collection of fans and fan blades. We found more than enough to provide fannage to the heat exchangers.

All this stuff was a great find, but we still needed a way to get the air moving. Our datacenter had ducts to the building's cooling systems, but they were inadequate for our needs. Luckily, our datacenter was on the first floor, on the south edge of the building facing an undeveloped grassy lot. A couple hours of sledgehammering later, we had a more than adequate ventilation duct to the outside world. It was nice having a "window" in the server room for once, much less one large enough to allow two lanes of traffic.

My staff gathered all the tools they owned, and we went to work on the hole. As a hole, it was highly functional. It was definitely rough around the edges, however. Using all manner of scrap metal, we assembled a solid edging for it that would allow mounting of the heat exchangers. Those went up without too much trouble. I was originally worried about the remnants of the wall being able to support them, but one of my cleverer techs rigged up a set of trusses that supported both the wall (it was sagging at the upper end) and the heat exchangers.

All the fans went in without too much trouble. We had to be careful with the power cords. With condensation to one side and searing heat to the other, it could really be a recipe for disaster. We mounted some PVC piping in there to keep the cords in place. It's held up fine.

The cooling monstrosity that stuck out the side of our building was ready to go in less than a week. I purchased some bottles of fine champagne for the start-up party. The countdown started at ten. At five, I hit the switch for the fans. They started humming nicely. At zero, the switch for the exchangers. They also started nicely. Within minutes, cool air had flooded the room and it was only getting colder. That was only one problem. The other was the amount of heat we were pumping out the other side.

In all the construction frenzy, nobody had considered the notion of a temperature control. Within half an hour, the server room would have been freezing. We shut it down before that, however. Just before the fire department showed up.

The lot outside our heat exchangers had actually caught fire. It's been relatively dry here this summer, so it's really not too surprising. Once the firemen had coated the sea of flaming grasses and underbrush in foam, one of them came over to inspect our cooling system. He immediately labeled it as "dangerous," me as "irresponsible," and the entire project as "an exercise in stupidity."

The building owners (we lease the space) were notified of what we had done. It never occurred to me to ask permission to cause serious structural damage. I'm now in major trouble with upper management (they're in a different building across town; they had no idea this was going on). The city is considering a lawsuit against me personally for spearheading this project because it endangers lives or some bullshit. Even worse, an editorial in the paper the other day labeled me as an enemy of mother Earth for burning all that grass (and a few trees). Don't even get me started on the nearby preschool that was downwind of the fire.

In conclusion, find a professional to do it for you. Don't make the same mistakes I did.

BWAHAHAHAHAHA (4.50 / 4) (#35)
by tonyenkiducx on Thu Jul 18, 2002 at 11:17:04 AM EST

I don't care if that's true or not. Fucking hilarious :)

I see a planet where love is foremost, where war is none existant. A planet of peace, and a planet of understanding. I see a planet called utopia. And I see us invading that planet, because they'd never expect it
[ Parent ]
you sir are a TRUE DYED-IN-THE-WOOL (4.66 / 3) (#36)
by techwolf on Thu Jul 18, 2002 at 12:38:57 PM EST

SUPER GEEK! Only a geek of the first circle would have thought of doing this, and as another poster said, true or not, that shit was FUNNY! I mean goddamn, supercoolers is what you had there!

do you have the specs handy? I think I would like to build one

"The strongest reason for the people to retain the right to keep and bear arms is, as a last resort, to protect themselves against tyranny in government." - Thomas Jefferson
[ Parent ]

Great Idea!! (none / 0) (#47)
by landryjf on Thu Jul 18, 2002 at 03:55:56 PM EST

Thanks for the good laugh!!

"The idea was great. It was just not planned well enough." - Me

[ Parent ]

hehe (none / 0) (#49)
by tps12 on Thu Jul 18, 2002 at 04:41:43 PM EST

What did your landlord have to say? :)

[ Parent ]
omg!! (3.00 / 1) (#50)
by el_guapo on Thu Jul 18, 2002 at 04:55:34 PM EST

you have GOT to put up pics of this thing!! unless you already have and then i want links!!!
mas cerveza, por favor mirrors, manifestos, etc.
[ Parent ]
Brilliant! (4.33 / 3) (#51)
by EngnrGuy on Thu Jul 18, 2002 at 05:42:02 PM EST

You, sir, have a promising career ahead of you on Junkyard Wars.

[ Parent ]
~Blinks!~ (3.50 / 2) (#56)
by Llyander on Fri Jul 19, 2002 at 03:25:36 AM EST

I bow to the superior Bodger. : ) Oh man. Seriously, if you've got pics of that, I'd love to see!

[ Parent ]
hahah (1.00 / 2) (#63)
by Angelic Upstart on Mon Jul 22, 2002 at 08:44:40 AM EST

oh my god, if what you say is true, you are a fucking moron. First, you aren't a HEATING AND COOLING GUY! Second, you knocked a hole in the wall of your workplace. WITHOUT ASKING ANYONE. What made you get that bright idea? Jesus, if you worked for me and did all that, you'd be fired so fast it'd make your head spin.

[ Parent ]
I wouldnt blame him.. (none / 0) (#67)
by tonyenkiducx on Fri Jul 26, 2002 at 09:08:15 AM EST

It sounds like a lesson in collective stupidity: where no one person is stupid enough to do something, the group's collective stupidity comes together to form the worst plan possible.

I see a planet where love is foremost, where war is none existant. A planet of peace, and a planet of understanding. I see a planet called utopia. And I see us invading that planet, because they'd never expect it
[ Parent ]
Telcos don't like raised floors (3.00 / 1) (#53)
by Q2 on Thu Jul 18, 2002 at 06:09:25 PM EST

All the raised floor, AC, and cable management problems can be solved with two simple design ideas: 18-foot ceilings and ladder racks. Show me a telco CO with raised floors and 8-foot ceilings, and I'll show you a telco that can't maintain 9 nines of reliability. "Since everyone else has raised floors and locked cabinets, we need them too." You guys stick with your raised floors and nice-looking locked cabinets. I'll have a datacenter that works.

Telco vs. Datacenter (5.00 / 1) (#58)
by Bios_Hakr on Fri Jul 19, 2002 at 10:45:16 AM EST

I have worked in both a telco and a datacenter.  By far, the datacenter had more equipment.  Most telcos are set up for 1 or 2 major pieces of equipment per rack.  A datacenter may have 20 or more computers in a single rack.

A telco is also a much more dynamic environment.  New equipment requires new cabling, and most of the old stuff has to be removed.  Once the cat-5 and fiber are run for a datacenter, you will rarely touch the cabling no matter what the equipment is.

That is why telcos like cable ladders above the racks and datacenters like sub-floor cabling.  Once you have a sub-floor, it only seems natural to pump cold AC into it.  Besides, datacenters need to look professional and finished.  Professional looks==more customers.  A telco needs to look dynamic for easy equipment installs.  Dynamic==more customers.

[ Parent ]

A good plan (none / 0) (#61)
by Vader82 on Fri Jul 19, 2002 at 10:00:55 PM EST

Everyone is saying raised floors.  That's fine, if you can properly clean them.  A better bet would be to install raised floors and run sheet-metal ductwork inside said raised floors.  That way you have well-defined places for the air to flow, and places to run your cables (I assume that's part of the benefit of raised floors).

Lots of redundancy.  Instead of one big AC unit to push air into 4 different main ducts that feed cold air into your racks, get 4 smaller ones, but make sure that any 3 can cool the whole place well.
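The "any 3 of 4" rule above is classic N+1 redundancy, and the sizing arithmetic is trivial to sketch. The 60 kW room load here is an assumed example figure:

```python
# N+1 sizing sketch for the scheme above: install four units, but size each
# so that any three of them can carry the full room load.
# The 60 kW load is an illustrative assumption.

def per_unit_capacity_kw(total_load_kw, units_that_must_suffice):
    """Capacity each unit needs so the surviving units still cover the load."""
    return total_load_kw / units_that_must_suffice

load_kw = 60.0
capacity = per_unit_capacity_kw(load_kw, 3)  # 4 installed, any 3 must suffice
print(capacity)  # 20.0 kW each; with all four running you get 33% headroom
```

The nice side effect is that in normal operation every unit runs below capacity, so a single failure just removes the headroom rather than causing an outage.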

Have the ACs connected with essentially a bus line, so that if one goes down the vents it normally pushes to still get cooling.  Also, have plenty of ambient cooling for the room.  

While chilling the racks is good, chilling the rest of the room is also good.  What that will do is force the hot air to stay up by the ceiling, and up at the ceiling is where you will locate your hot air returns for the AC units that push air into the raised floor.

This is by no means a comprehensive guide, but it is a good starting point.  As one poster previously pointed out, there are people who make a living knowing exactly how to do this stuff.  You should probably get in touch with them for the actual design, as a good design will involve lots of stuff you aren't used to, like installing sheet-metal ductwork and working with HVAC units.  Good luck!
Need food? Like sharing? http://reciphp.vader82.net/

Airflow and Suns (none / 0) (#66)
by esjay on Tue Jul 23, 2002 at 09:00:11 AM EST

The equipment you've listed from Sun (I can't comment on the Proliants) is all designed to have airflow running from front to rear. I've seen many datacentres that persist in putting 250s/450s etc. in racks with glass fronts on them, and you could cook an egg on the top of some of the boxes inside.

If you have a look at the racks Sun sells to house their equipment (which I wouldn't suggest buying, due to the stupidly high cost for what you get), they all have front and rear doors that are basically big grilles. The best design I've seen for using these style racks in conjunction with underfloor aircon is to have rows of directional ducts running along the tile row in front of your racks. This means the air gets blown directly into the front of the racks, which is what the Suns really want.

At least Sun no longer cool their machines via vents on the sides of the machines. That was a mess.
Facts are meaningless, you can use facts to prove anything that's remotely true! Facts, schmacks. - Homer

Proper Datacenter Airflow... | 69 comments (49 topical, 20 editorial, 1 hidden)