Title: Datacenters: Current Resources & Future Directions
1. Datacenters: Current Resources & Future Directions
- Where do you house your IT equipment?
Brynnen Owen & Neil Thackeray (LIS), Chuck Wallbaum (SCS), Gabe Gibson (Physics), Randy Cetin & Melissa Woo (CITES)
2. Graduate School of Library and Information Sciences
3. We had four small spaces for servers scattered around GSLIS
- Two were shared with CITES networking (SSHHH!).
- Two were parts of people's offices.
- Connections between servers weren't easy.
4. Not enough power
- The power was maxed out.
- Several stand-alone UPS backup systems were used; auto-shutdown was not really available (a sketch of the idea follows this slide).
- Installing a new server required shutting down an old server.
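
In this context, "auto-shutdown" means having servers power themselves off cleanly when a UPS goes on battery, instead of crashing when it runs dry. Below is a minimal sketch of the idea, assuming a NUT (Network UPS Tools) setup where the command "upsc <upsname> ups.status" reports flags such as OB (on battery) and LB (low battery); the UPS name gslis-ups is made up for illustration, and in a real deployment NUT's own upsmon daemon does this job.

    import subprocess
    import time

    UPS = "gslis-ups"  # hypothetical UPS name as configured in NUT

    def ups_status(ups: str) -> str:
        # Query the NUT client for the status flags, e.g. "OL" or "OB LB".
        result = subprocess.run(["upsc", ups, "ups.status"],
                                capture_output=True, text=True)
        return result.stdout.strip()

    while True:
        flags = ups_status(UPS).split()
        if "OB" in flags and "LB" in flags:
            # On battery and battery low: halt cleanly before power is lost.
            subprocess.run(["shutdown", "-h", "now"])
            break
        time.sleep(30)  # poll every 30 seconds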
5. Not enough cooling
- We had MovinCool units providing cooling. The big one was 24,000 BTU/hr (see the conversion sketch after this slide).
- Systems were on the top floor, adding to the heat load.
- During the summer, heat outages were frequent due to inadequate capacity.
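
For a rough sense of scale, cooling capacity converts between BTU/hr, refrigeration tons, and kW of heat removal with fixed factors (12,000 BTU/hr per ton, about 3,412 BTU/hr per kW). A quick sketch applying them to the 24,000 BTU/hr unit above:

    BTU_PER_TON = 12_000   # BTU/hr in one ton of refrigeration
    BTU_PER_KW = 3_412     # BTU/hr equivalent of 1 kW of heat

    def btu_to_tons(btu_hr: float) -> float:
        return btu_hr / BTU_PER_TON

    def btu_to_kw(btu_hr: float) -> float:
        return btu_hr / BTU_PER_KW

    big_unit = 24_000  # the largest MovinCool unit, in BTU/hr
    print(f"{btu_to_tons(big_unit):.1f} tons, ~{btu_to_kw(big_unit):.1f} kW of heat removal")
    # -> 2.0 tons, ~7.0 kW: roughly one rack's worth of loaded servers.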
6. MORE MORE MORE MORE!
- We had a few faculty wanting to build computation clusters.
- We said fine, if they bought their own generators and air conditioners and housed things in their offices.
7. Funding Available!
- With some grant money and some matching funds, we had enough money to build a server room.
- Approximate cost was $200,000.
8. Overspec!!!!!!!
- We added up all of the power and cooling we were currently using, added in the necessary power and cooling for one of the clusters, and multiplied by 2 (see the sizing sketch after this slide).
- We should have multiplied by 4.
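
The sizing rule above is simple arithmetic; the part that mattered was the growth multiplier. A sketch of the calculation with made-up load figures (the slide gives the multipliers, not the underlying kW):

    def sized_capacity(current_kw: float, new_cluster_kw: float, growth_factor: float) -> float:
        """Design capacity = (existing load + known new load) * growth headroom."""
        return (current_kw + new_cluster_kw) * growth_factor

    current_kw, cluster_kw = 10.0, 5.0  # hypothetical figures for illustration
    built = sized_capacity(current_kw, cluster_kw, growth_factor=2)   # what was specced
    needed = sized_capacity(current_kw, cluster_kw, growth_factor=4)  # what hindsight suggested
    print(f"specced for {built:.0f} kW; should have specced for {needed:.0f} kW")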
9. Final server room spec
- 10 tons of AC (2x 5-ton units)
- 30 kW central UPS (see the sanity check after this slide)
- Alarm system
- No more Hot Day server vacations!
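
As a back-of-the-envelope check on this spec (my arithmetic, not the original slide), 10 tons of AC removes about 35 kW of heat, which covers the 30 kW of UPS-protected load with a little margin for lights, people, and UPS losses:

    KW_PER_TON = 3.517            # kW of heat removal per ton of refrigeration
    cooling_kw = 10 * KW_PER_TON  # 10 tons of AC from the spec
    ups_kw = 30                   # 30 kW central UPS from the spec
    print(f"cooling ~{cooling_kw:.0f} kW vs. {ups_kw} kW protected load "
          f"-> ~{cooling_kw - ups_kw:.0f} kW of headroom")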
10. We are about out.
- After adding two clusters and a landslide of research-funded new machines, we have about used up the headroom.
- There is no more power available on the transformer outside, so we're REALLY out.
- Currently, we have no policy for retiring servers.
11. School of Chemical Sciences
- High Performance Computing Center (HPCC)
12. High Performance Computing
- SCS had one small room with 8 tons of cooling and 140 processors
- NCSA was decommissioning its SGI Origin 2000 supercomputers
- SCS acquired four 128-processor machines
- Total value in 1999 was $6 million
- Used value in 2004 was $1 million
- Major uses include quantum dynamics, molecular dynamics, molecular modeling, and drug discovery
- Retirement of the Origin 2000s was the catalyst to build the SCS HPCC
13. SCS HPCC Construction
- 2,000 sq. ft. total
- 42 tons of Computer Room Air Conditioning (CRAC). Conventional DX cooling; chilled water was unavailable at the time.
- $402,000 construction cost
- 425 amps for CRAC
- 600 amps for computers (see the rough kW estimate after this slide)
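
Relating the electrical figures above to IT load requires the service voltage and power factor, which the slide does not give; the sketch below assumes 208 V three-phase and a 0.9 power factor purely for illustration. Breaker ampacity is not the same as continuous draw, so the two printed numbers are not directly comparable; the point is only how to get from the slide's units to kW.

    import math

    def three_phase_kw(amps: float, volts: float = 208, power_factor: float = 0.9) -> float:
        # P (kW) = sqrt(3) * V * I * PF / 1000 for a balanced three-phase load
        return math.sqrt(3) * volts * amps * power_factor / 1000

    print(f"600 A for computers ~= {three_phase_kw(600):.0f} kW under the assumed voltage/PF")
    print(f"42 tons of CRAC ~= {42 * 3.517:.0f} kW of heat removal")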
14. SCS HPCC Construction
- Three phases of construction
- Phase I: 1,000 sq. ft., 20-ton CRAC
  - $202,000 demo and construction cost
  - Completed August 2004
- Phase II: added 1,000 sq. ft. and a 22-ton CRAC
  - $200,000 demo and construction cost
  - Completed May 2006
- Phase III: 30 tons of chilled water cooling
  - An additional 800 amps and chilled water are now available
  - Timeframe and cost depend on the next new hire
15. HPCC Demolition
16. HPCC Floor Plan
17. HPCC Origin 2000
18. Physics / Materials Research Laboratory
19. Opportunity to work together
- MRL had datacenter space, badly in need of updates (obsolete A/C, sagging raised floor)
- Physics was out of power and out of space, but had funding for faculty start-up
- So...
20. Design Goals
- Efficient use of space
- Modular, "build as we grow" design for future flexibility
  - We didn't want to oversize cooling/power in order to save money
  - Need to accommodate what a datacenter looks like 5-10 years out without having to guess now
- Aesthetics matter
  - Departments wanted a showcase for prospective faculty, funding agencies, etc.
- Need high-speed networking between departments
  - 10Gig uplink between LLP and MRL to avoid bottlenecks
- Minimize time from hardware purchase to research results
21. Design Details
- 1,500 sq. ft.
- 2x 20-ton Liebert chilled water A/C units
- 200 A of existing power for A/C, new 600 A for computers
- 30 kW UPS (expandable to 80 kW), 60 kW unprotected (see the load check after this slide)
- 18" raised floor, dropped ceiling for HVAC
- Standard hot aisle / cold aisle layout
- Project cost: ~$400k
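
A quick cross-check of this slide's numbers (my arithmetic, not the original): two 20-ton units provide roughly 140 kW of heat removal against a 90 kW design IT load (30 kW protected plus 60 kW unprotected), leaving some margin for UPS and room losses; with only one unit running, capacity drops to about 70 kW.

    KW_PER_TON = 3.517
    cooling_kw = 2 * 20 * KW_PER_TON  # two 20-ton Liebert units
    it_kw = 30 + 60                   # UPS-protected + unprotected design load
    print(f"~{cooling_kw:.0f} kW cooling vs. {it_kw} kW design IT load")
    print(f"~{20 * KW_PER_TON:.0f} kW available if one A/C unit is offline")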
23. Datacenter Layout
24. Immediate Results
- If you build it, they will come
- 8 additional research clusters after completion, for 170 compute nodes total
- Datacenter space is in high demand
- Requests from other groups and departments
26. CITES
- DCL Datacenter Renovation
27. CITES Datacenters
- 2 datacenters: primary and backup
- 24x7x365 monitoring of services, systems, network, and environmentals
- Secure access via prox-card
- CCTV monitored and recorded
28. Primary Datacenter Specs
- 2 floors, each approx. 1,500 gsf
- Raised floor
- Primary and backup cooling (three 30-ton and one 8-ton CRAC units, one 30-ton AHU); chilled water is the primary supply (totals sketched after this slide)
- UPSes used for conditioned power (two 150 kVA)
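
Totaling the cooling hardware on this slide gives a sense of scale, though part of it is the backup path rather than usable simultaneous capacity; the 0.9 power factor applied to the UPSes is an assumption:

    KW_PER_TON = 3.517
    crac_tons = 3 * 30 + 8            # three 30-ton units plus one 8-ton unit
    ahu_tons = 30                     # one 30-ton air handling unit
    total_tons = crac_tons + ahu_tons
    ups_kw = 2 * 150 * 0.9            # two 150 kVA UPSes at an assumed 0.9 power factor
    print(f"{total_tons} tons ~= {total_tons * KW_PER_TON:.0f} kW of cooling installed "
          f"vs. up to ~{ups_kw:.0f} kW of UPS-conditioned load")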
29. Backup Datacenter Specs
- 350 sf
- No raised floor
- 5-ton R-22 CRAC unit
- 24 kVA UPS and backup generator
30. DCL Datacenter Renovation
- Added a basement floor, approx. 1,500 gsf
- Redundancy designed to mimic the 1st floor
- Power and cabling in ladder racks
- Two 30-ton CRAC units
- Higher weight capacity; closer to a future datacenter design than the 1st floor
34. Questions?
- A/C: Chilled water vs. refrigeration
- Power
- Security
- Fire Suppression
- Build/Design/Construction