Transcript and Presenter's Notes

Title: Is High Density Hosting Right For You?


1
Is High Density Hosting Right For You?
Paul Court, Technology Services Director
2
The TelecityRedbus group, incorporating Globix
  • Leading provider of data centre and managed
    hosting services in Europe
  • 19 facilities, over 550,000 sq ft fitted-out -
    including first dedicated high-density hosting
    facilities in Europe (2004)
  • Comprehensive range of products and services
    including extensive security services experience
  • Over 2000 customers

3
The Problem
  • Ever-increasing power footprints of datacentre
    equipment.
  • Ever-increasing heat footprints of datacentre
    equipment (power in, heat out).
  • Smaller equipment (more items per rack).
  • Datacentres that were designed years ago for 1
    to 4 kW/rack.

4
The Challenge
  • It's no secret that the influx of higher-density
    equipment (HDE) into large-space datacentres has
    become a challenge for some and a major headache
    for others.
  • Most large-space datacentres have a design
    criterion far below what is now required to
    accommodate HDE in any volume, and are sacrificing
    usable space for increased cooling and power
    availability to localised areas within the
    datacentre.
  • Many clients have purchased HDE on the promise of
    vast savings from consolidation of systems,
    management and space.

5
The Main Culprit
Aperture Research Institute believes that one of
the reasons for the rise in power density is the
use of blade servers. Some 90 per cent of
respondents indicated that their companies had
blade servers in their data centres.
6
Power Density vs Rack (chart; source: APC/MGE)
7
It's All Hot Air!
  • 1 kilowatt of power consumption by a typical 1U
    server requires 75.5 litres of air per second.
    Blades require 49.6 l/s per kW.
  • The air must be capable of absorbing a 10 degree
    C temperature rise across the server.
  • Most servers can operate with inlet air of 29-32
    degrees C.
  • ASHRAE standard: Thermal Guidelines for Data
    Processing Environments (TC 9.9).
  • Allowable temperature range: 15-32 degrees C.
  • Recommended range: 20-25 degrees C.
  • Humidity range: 20-80% RH.
  • Recommended range: 40-55% RH.

A 24 kW blade server rack will require 1,190
litres of air per second (24 × 49.6 ≈ 1,190; the
sketch below reproduces the arithmetic).
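
The litres-per-second figures follow from the standard relation: airflow = power / (air density × specific heat × temperature rise). A minimal Python sketch, assuming standard sea-level air properties (the slide's 75.5 l/s per kW figure implies slightly different constants, so treat the outputs as approximate):

```python
# Airflow needed to carry heat away from a server: a minimal sketch.
# RHO and CP are assumed standard-air values, not taken from the slides.
RHO = 1.2    # air density, kg/m^3
CP = 1005.0  # specific heat of air, J/(kg*K)

def airflow_l_per_s(power_w: float, delta_t_c: float) -> float:
    """Volumetric airflow (litres/second) needed to remove power_w
    watts of heat with a delta_t_c (deg C) rise in air temperature."""
    return power_w / (RHO * CP * delta_t_c) * 1000.0  # m^3/s -> l/s

print(airflow_l_per_s(1000, 10))  # ~83 l/s; the slide quotes 75.5 for a 1U server
print(airflow_l_per_s(1000, 17))  # ~49 l/s; close to the slide's blade figure
print(24 * 49.6)                  # 1190.4 l/s for a full 24 kW blade rack
```

The same arithmetic underpins the next slide: a feed of 100-200 l/s per rack can only carry away roughly 2-4 kW.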
8
It's All Hot Air!
  • Blades require high airflow: up to 1,200 l/s per
    rack.
  • Typical data centres provide 100-200 l/s per
    rack, which supports only 2-4 kW, roughly a tenth
    of what blades need to function correctly.
  • Equipment that doesn't get enough intake air will
    suck in its own exhaust and overheat.

9
Fact
  • Sooner or later, in an average active
    environment, IS managers are going to have
    problems powering and cooling racked equipment.

10
Gartner Highlights Key Predictions for IT
Organizations in 2007 and Beyond (Dec 06)
And it's not just data centre operators saying
it.
  • By 2008, nearly 50 per cent of data centers
    worldwide will lack the necessary power and
    cooling capacity to support higher-density
    equipment.
  • With higher densities of processors
    proliferating, problems in this area continue to
    grow. Although the power and cooling challenge of
    high-density computer equipment will persist in
    the short term, a convergence of innovative
    technologies will begin to mitigate the problem
    by 2010.

11
And it's not just data centre operators saying
it.
  • Data centres running out of space and power
  • Report claims high density devices to blame for
    capacity crisis
  • Robert Jaques, vnunet.com, 30 Apr 2007
  • Data centres are facing a "capacity crisis" in
    which almost half are running out of physical
    space, new research has claimed.
  • A survey of over 100 enterprise data managers
    reported that almost 90 per cent of data centres
    are three-quarters full or more.
  • The study from the Aperture Research Institute
    found that more than 43 per cent of respondents
    said that 90 per cent or more of their data
    centres are in use.
  • "Compounding these concerns is the fact that
    servers and racks are using more power than ever
    before. Nearly 38 per cent of respondents said
    that their average rack is using from 7 to 18
    kilowatts or more," the study warned.

12
Question
  • How do I get round the problem of hosting
    higher-density equipment?

13
Datacentre Evolution
First Generation Datacentre
  • Random air cooling.
  • Racks all face the same direction.
  • Heat ingested by equipment.
  • Hot spots.
  • 2 kW per rack.

Second Generation Datacentre
  • More structured cooling.
  • Hot/cold aisle implemented.
  • Still reliant on natural circulation.
  • 4 kW per rack.

14
Solution 1
  • Add more power to a standard rack.
  • Rack will overheat or have hot spots.

15
Solution 2
  • Put one piece of equipment per rack.
  • Waste of space, large footprint, higher
    cost.

16
Solution 3
  • Put everything in one rack and leave a big
    space around it.
  • Waste of space, very large footprint,
    higher cost, and will still have heat problems.

17
The Real Solution
  • High-density equipment needs purpose-built
    systems.
  • Systems that work on random air cooling are not
    capable of >2 kW/rack.
  • Even with hot aisle/cold aisle, 4 kW/rack is about
    the limit before heat builds up.
  • A full rack of blade chassis/blades will use
    >24 kW of power.
  • 24 kW of power in means 24 kW of heat out; that's
    about the same output as a home central-heating
    boiler, per rack.
  • The market wants higher densities and to host
    blades.
  • A conventional data centre layout with one vented
    tile per rack simply cannot cool racks over
    approximately 4 kW per rack over a sustained area.
  • TelecityRedbus use APC-based HD solutions, but in
    unique set-ups.
  • A quick overview of how we approached the
    problems follows.

18
The Real Solution
24 kW
Any system needs to be able to remove and
condition 1,190 litres per second of hot air from
the blade rack.
Your average household fridge has a volume of 110
litres: that is roughly ten fridge-volumes of air
per second!
19
Second Generation Datacentre
One standard 140 l/s perforated tile per rack
(enough for only about 2 kW at the figures above).
20
The Third Generation Datacentre
  • The solution:
  • Cap the hot aisle.
  • Close off the ends.
  • Seal the floor.
  • Fill holes with blanking plates.
  • Localise cooling.
  • Localise power management.
  • 20 kW per rack.

21
The TCRB Approach
  • Manage the heat:
  • Servers breathe in cool air from outside of the
    cube and discharge hot air into the sealed hot
    aisle.
  • Air-handling units vacuum the hot air out of the
    hot aisle.
  • Hot air is conditioned and pumped back out to the
    datacentre.
  • Cooling capacity is localised to the area of heat
    generation.

Air is the heat transfer medium. The more
effectively cool air is delivered to the server
and hot air is removed back to the CRAC unit, the
better the heat transfer.
22
The TCRB Approach
For each 10-rack cube, we use N+2 conditioning
units:
  • Hot-swappable coil cartridge: increases
    availability.
  • Nx fans: redundant DC fans for continuous
    operation.
  • Sealed cabinet: creates an isolated environment.
  • Air filters: remove airborne particles from the
    rack.
  • Leak detection: early-warning water sensors.
A rough sizing sketch follows.
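
As an illustration of what N+2 means for one cube, a hypothetical sizing sketch (the per-unit cooling capacity is an assumed figure; the slides do not give one):

```python
import math

# Hypothetical N+2 sizing for one 10-rack cube. UNIT_CAPACITY_KW is an
# assumption for illustration only; the slides give no per-unit figure.
RACKS = 10
KW_PER_RACK = 20.0       # third-generation design point from the slides
UNIT_CAPACITY_KW = 50.0  # assumed capacity of one conditioning unit

heat_kw = RACKS * KW_PER_RACK                       # 200 kW to remove
n_for_load = math.ceil(heat_kw / UNIT_CAPACITY_KW)  # N units carry the load
print(f"install {n_for_load + 2} units (N+2) to cool {heat_kw:.0f} kW")
```

N+2 means two more units than the load requires, so one unit can fail while another is out for maintenance without losing cooling.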
23
The TCRB Approach
These units are individually managed datacentres
within a managed datacentre.
24
The TCRB Approach
  • Our latest 20 kW/rack units are also a floor
    underground.
  • As well as extra security, this minimises solar
    gain and gives the system a more controllable
    environment to reside in.
  • These units are the highest density available in
    EMEA.
  • TCRB HD datacentres are PCI and ISO 27001
    accredited.

25
Who Needs It?
  • We have a real cross section of customers already
    using HD.
  • Financial Services
  • Medical Services
  • Gaming
  • Corporate
  • Anyone who needs to manage >4 kW/rack!

26
Cost / Benefit
  • Positives:
  • Very efficient kW/m² ratio, making the best
    use of available space.
  • Far easier management of systems, as they are
    contained in a smaller footprint.
  • Less network cabling.
  • Less duplication of switches and power
    distribution.
  • Less to go wrong.
  • MTBF is higher, as the equipment is cooled and
    powered correctly at all times.
  • Product life / expansion / upgrade options are
    better, as the units are not running at the limit
    of power and heat all the time.
  • Power densities are getting higher, not lower.

27
Cost / Benefit
  • Negatives:
  • Datacentre needs to have enough available power.
  • For our new datacentre, 371 m² (4,000 sq ft) uses
    720 kW (see the worked figure after this list).
  • (TCRB Oliver's Yard has >7 MVA available.)
  • Datacentre needs to have enough available cooling
    capacity.
  • (TCRB Oliver's Yard has >6 MW of cooling
    available.)
  • Datacentre needs to have floor loading capable of
    supporting the units.
  • Risk from large-bore water pipes traversing the
    datacentre.
  • Set-up cost higher than standard racks.
  • Maintenance costs higher than standard racks.
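
A quick check of the power-density figure quoted in the list above, in the same sketch style:

```python
# Worked figure from the slide: power density of the new datacentre floor.
area_m2 = 371.0   # ~4,000 sq ft
power_kw = 720.0
print(f"{power_kw / area_m2:.2f} kW/m^2")  # ~1.94 kW per square metre
```

At 20 kW per rack, that 720 kW budget is consumed by only 36 racks, which is why available power is the first thing to check.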

28
Conclusion
  • Is High Density Hosting Right For You?
  • Yes:
  • If your business requires >4 kW/rack.
  • If your datacentre has available capacity in
    power and cooling.
  • And if you're willing to make the investment.
  • No:
  • If you can live with <4 kW/rack.
  • If you can't fully utilise the capacity due to
    power or cooling issues.
  • If your datacentre can't take the floor loading.

29
Thanks
  • Thank You
  • Paul Court
  • Technology Services Director, TelecityRedbus
  • Paul.Court@TelecityRedbus.com
  • Visit us at stand W323 