Adaptive Airflow Management: Cool & Green
Transcript and Presenter's Notes

Title: Adaptive Airflow Management: Cool & Green


1
Adaptive Airflow Management: Cool & Green
Wally Phelps, Product Manager, AdaptivCool
wally.phelps@degreec.com
Partners in Thermal Management
2
Agenda
  • The Thermal Bottleneck
  • Room Scale Intelligent Cooling
  • Airflow and Humidity
  • Best Practices and Their Limits
  • Adaptive Airflow in Action
  • Case Studies
  • Summary

3
ASHRAE Power Trend
4
The Real Culprit
  • Chip Density - Exponential
  • Packaging Density - Exponential
  • IT demand - Exponential
  • Rack Density - Exponential
  • Junction Temperature - UNCHANGED
    (reliable silicon gate operation)
  • IT Thermal Bottleneck is Created

5
Legacy Data Centers
  • Designed for 1-3 kW racks
  • Perimeter CRACs
  • Raised floor supports Cooling, Piping and
    Cabling
  • Low ceiling heights
  • Poor airflow distribution
  • Mixing and low ΔT
  • Cannot support today's density???

6
Room Scale Intelligent Cooling
  • A Data Center Cooling Solution That
  • Uses tightly managed ACTIVE airflow to:
  • Bring cool air to racks - where, when, in the right
    amount - dynamically
  • Help return hot air to CRACs
  • Minimize mixing (cooler racks, energy-efficient
    cooling)
  • Work with or without containment panels (no fire
    code issues)
  • Gain typically 30% in cooling efficiency
  • Legacy Data Centers CAN and DO support
    today's density - we have customers supporting
    200 W/sq ft in sites designed for 100 W/sq ft

7
The Key System Elements
8
System Block Diagram
  • CFD analysis
  • Underfloor supply air movers
  • Overhead returns
  • Sensors
  • Cooling Resource Manager
  • Web based monitoring

[Block diagram: the sensor network, intelligent supply and intelligent
return air movers, and the CRAC units connect over an RS-485 network to
the Cooling Resource Manager, which exposes LonWorks, BACnet, SNMP,
e-mail and SMS interfaces and supports 24x7 remote monitoring over VPN.]
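As a rough illustration of how these elements fit together, here is a
minimal control-loop sketch in Python. Every name in it
(read_rack_inlet_temps, control_step, the gain and setpoints) is a
hypothetical placeholder, not AdaptivCool's actual API, which this deck
does not publish.

```python
# Hypothetical sketch of a room-scale cooling feedback loop:
# sensor network -> resource manager -> underfloor air movers.
import random
import time

TARGET_INLET_F = 72.0   # desired rack inlet temperature (illustrative)
ALARM_F = 80.0          # alarm threshold (illustrative)
GAIN = 0.05             # proportional gain: speed fraction per degree of error

def read_rack_inlet_temps() -> dict[str, float]:
    """Stand-in for the sensor network; returns ºF per rack."""
    return {rack: 70.0 + random.uniform(-2.0, 8.0)
            for rack in ("rack-A1", "rack-A2", "rack-B1")}

def control_step(speeds: dict[str, float]) -> None:
    """One manager pass: nudge each air mover toward the inlet target."""
    for rack, temp in read_rack_inlet_temps().items():
        error = temp - TARGET_INLET_F
        # Speed is a 0..1 fraction of the air mover's full flow.
        speeds[rack] = min(1.0, max(0.1, speeds.get(rack, 0.5) + GAIN * error))
        if temp > ALARM_F:
            print(f"ALARM {rack}: inlet {temp:.1f}F")  # would be an SNMP trap / SMS
        print(f"{rack}: inlet {temp:.1f}F -> air mover at {speeds[rack]:.0%}")

if __name__ == "__main__":
    speeds: dict[str, float] = {}
    for _ in range(3):          # a real manager would loop continuously
        control_step(speeds)
        time.sleep(1)
```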
9
EPA Report Aug. 2007
30% improvement in infrastructure energy
efficiency from improved airflow
management. Airflow is the single infrastructure
improvement that can be done without
disruption.
(Page 9 of the EPA Final Report)
10
SVLG/Accenture Report July 2008
11
But I Have Enough Tonnage!
  • Typical Data Center is overcooled by 2.6X (Uptime
    Institute study)
  • Is it delivered in the right place? (At the right
    time, and in the right amount?)
  • Can the heat return back to the CRAC?
  • Are the CRACs fighting? (And reducing capacity?)

12
Data Center Cooling Rudiments
  • Air IS the cooling medium from chip → cooling
    coils
  • Air is lazy - mixing is the result
  • Mixing creates hotspots and wastes cooling
    capacity
  • Typical response: turn down setpoints, overcool
  • +/- 1ºF of setpoint equates to +/- 4% efficiency
    (worked example below)
  • Humidification control costs
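To make the setpoint rule of thumb concrete, a small sketch using the
slide's ~4%-per-ºF figure. The exact sensitivity varies by site; this is
an illustration of the arithmetic, not a measurement.

```python
# Rule-of-thumb energy impact of CRAC setpoint moves, using the
# slide's ~4% efficiency per ºF figure (site-dependent in practice).
EFFICIENCY_PER_DEG_F = 0.04

def cooling_energy_change(setpoint_delta_f: float) -> float:
    """Fractional change in cooling energy for a setpoint move.
    Positive delta = raising the setpoint = saving energy."""
    return -EFFICIENCY_PER_DEG_F * setpoint_delta_f

# Dropping the setpoint 3ºF to mask a hotspot:
print(f"{cooling_energy_change(-3.0):+.0%} cooling energy")   # +12%
# Raising it 2ºF after fixing airflow:
print(f"{cooling_energy_change(+2.0):+.0%} cooling energy")   # -8%
```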

13
Common Airflow Problems
  • Mixing
  • Recirculation - best practices or distribution
  • Short circuiting - best practices
  • Leakage - best practices
  • Poor return path - distribution
  • Humidity management - distribution
  • Under floor obstructions - distribution
  • Venturi reversal - distribution
  • Vortex generation - distribution
  • Legacy racks - best practices

Most Data Centers have three or more of
these issues
14
1st Step - CFD
  • High Air Movement → Turbulence
  • Intuition does not work
  • Unintended airflow paths
  • Use CFD to visualize and problem-solve in 3D

15
Airflow Problem Examples
MIXING (cool and warm air mix before server
intakes)
VORTEX GENERATION UNDERFLOOR (CRACs slightly
offset or at right angles)
16
Airflow Problem Examples
VENTURI REVERSAL (racks too close to CRACs, low
or NEGATIVE flow)
UNDERFLOOR OBSTRUCTIONS (restrict airflow, cause
uneven distribution)
17
Data Center Humidity Control
  • The Cooling Process

[Cooling process diagram, courtesy of Liebert: return air passes over
the coil, where sensible cooling reduces air temperature and latent
cooling condenses water vapor.]
Electronics are 100% dependent on Sensible Cooling
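The standard airside relation behind sensible cooling is
Q(BTU/hr) = 1.08 × CFM × ΔT(ºF). A quick sketch; the rack numbers are
illustrative, though they happen to be consistent with the 160 CFM/kW
figure quoted later on slide 24.

```python
# Sensible heat carried by an airstream: Q = 1.08 * CFM * dT (BTU/hr),
# where 1.08 bundles standard air density and specific heat.
def sensible_btu_per_hr(cfm: float, delta_t_f: float) -> float:
    return 1.08 * cfm * delta_t_f

# Illustrative: 1,600 CFM through a rack with a 20F air-side rise
q = sensible_btu_per_hr(1600, 20)    # 34,560 BTU/hr
print(f"{q:,.0f} BTU/hr = {q / 3412:.1f} kW = {q / 12000:.1f} tons")
```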
18
Data Center Humidity Control
  • Humidity Control - 2 Phase Changes

[Water phase diagrams, courtesy of Liebert: temperature (ºF) versus
BTUs for 1 lb of water, showing the liquid and vapor regions; crossing
the boundary costs roughly 970 BTU/lb.]
Cooling coils dehumidify the return air (phase change vapor > liquid;
condensation heat goes to the refrigerant). IR heaters humidify
(phase change liquid > vapor; re-humidification vaporizes water).
Phase changes are energy intensive! Humidity
control can be >30% of cooling energy!
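Using the slide's ~970 BTU/lb latent heat, a sketch of why a
dehumidify-then-re-humidify cycle is so costly. The condensation rate
below is an illustrative assumption, not a figure from the deck.

```python
# Energy cost of a dehumidify -> re-humidify cycle, using the
# ~970 BTU/lb latent heat of vaporization quoted on the slide.
LATENT_BTU_PER_LB = 970

def humidity_cycle_btu_per_hr(lb_water_per_hr: float) -> float:
    """Condensing water at the coil and re-vaporizing it at the
    humidifier each pay the latent heat once."""
    return 2 * LATENT_BTU_PER_LB * lb_water_per_hr

# Illustrative: CRACs condensing 20 lb of water per hour that the
# humidifiers must put back into the room air
q = humidity_cycle_btu_per_hr(20)           # 38,800 BTU/hr
print(f"{q:,.0f} BTU/hr = {q / 12000:.1f} tons of wasted capacity")
```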
19
Data Center Humidity Control
  • Comparison: 20-ton CRAC

[Bar chart: latent vs. sensible cooling in kBtu/hr at 42% RH and
50% RH setpoints, across a 40-55% RH control band.]
Airflow balance and separation affect the
dehumidification rate.
20 tons, 75ºF return, 47ºF ECWT: 67.8 kBtu/hr
delta = $4-6K/yr chilled water cost (exclusive
of re-humidification)
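A rough check of the $4-6K/yr figure. The plant efficiency and
electricity tariff below are my assumptions, not the deck's; under
them the result lands at the low end of the quoted range.

```python
# Rough annual cost of a 67.8 kBtu/hr extra latent load on a
# chilled-water plant. Efficiency and tariff are assumptions.
DELTA_KBTU_PER_HR = 67.8
tons = DELTA_KBTU_PER_HR / 12            # 1 ton = 12 kBtu/hr
KW_PER_TON = 0.8                         # assumed chiller-plant efficiency
USD_PER_KWH = 0.10                       # assumed electricity tariff

annual_kwh = tons * KW_PER_TON * 8760    # running year-round
print(f"{tons:.1f} tons -> {annual_kwh:,.0f} kWh/yr "
      f"= ${annual_kwh * USD_PER_KWH:,.0f}/yr")   # about $4,000/yr
```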
20
Airflow Balance Example
[Side-by-side diagram: Hot Aisle / Cold Aisle layout vs. "school room"
layout, with CRACs alternately dehumidifying and humidifying in the
unbalanced case.]
  • Hot Aisle / Cold Aisle:
  • Separated supply and return
  • Cooler (or more) IT equipment
  • Balanced CRAC loading
  • Lower humidity control costs
  • Improved energy efficiency
  • School room:
  • Mixing and hotspots
  • Unbalanced CRAC loads
  • Poor humidity control
  • Wasted energy and capacity

21
Best Practices
  • Hot Aisle/Cold Aisle
  • CRACs perpendicular to rows
  • Lower density racks at ends of rows
  • Long rows with no space between racks
  • No perf tiles in Hot Aisles
  • Minimize cable cutouts (especially in Hot
    Aisles)
  • Load servers from bottom to top, no spaces
  • Use blanking panels
  • Use lower humidity
  • Prevent infiltration of unconditioned air

22
Best Practices Only (Example)
Mixed 88 W/sq ft load, 310 kW. 130T cooling, 34%
cooling CFM margin. 20" raised floor with no
restrictions (not typical). Best practices
employed. This model is better than 90% of Data
Centers built before 2004.
Hotspots still present - this prevents energy
savings from raising setpoints or turning off
CRACs.
23
Best Practices Only (Example)
Example from the previous slide with CRAC failures.
9% CFM margin still exists. Serious
overheating; servers shutting down.
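One way to read the CFM margin figures on these two slides, as a
sketch: defining margin as supply CFM over IT demand CFM is my
assumption, the IT airflow is derived from the 310 kW load and the
160 CFM/kW figure on slide 24, and the failed CRAC's airflow is chosen
to reproduce the slide's 9%.

```python
# How a CRAC failure eats CFM margin (sketch; definitions assumed).
IT_KW = 310
CFM_PER_KW = 160                 # 1U-server figure from slide 24
it_cfm = IT_KW * CFM_PER_KW      # ~49,600 CFM demanded by the racks

def margin(supply_cfm: float) -> float:
    return supply_cfm / it_cfm - 1.0

supply = 1.34 * it_cfm           # the slide's 34% margin
print(f"healthy: {margin(supply):.0%}")                  # 34%
print(f"one CRAC down: {margin(supply - 12_400):.0%}")   # ~9%
```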
24
Best Practices Cant Solve
  • Underfloor obstructions
  • Poor return path
  • Venturi effect
  • Vortex generation
  • Severe legacy placement issues
  • Difficult site envelope issues
  • It's a Fundamental Airflow DISTRIBUTION Problem!

1U servers: 160 CFM/kW    Blades: 120 CFM/kW
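These per-kW airflow figures translate directly into per-rack demand.
A small sketch: the rack sizes are illustrative, and the
perforated-tile comparison is a common rule of thumb, not a number
from the deck.

```python
# Airflow a rack demands, from the slide's per-kW figures.
CFM_PER_KW = {"1U servers": 160, "blades": 120}

def rack_cfm(kind: str, rack_kw: float) -> float:
    return CFM_PER_KW[kind] * rack_kw

# A legacy 3 kW rack vs. a dense 10 kW rack of 1U servers:
print(rack_cfm("1U servers", 3))    # 480 CFM - one perf tile can roughly cope
print(rack_cfm("1U servers", 10))   # 1,600 CFM - far beyond a typical tile
```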
25
Solving Airflow Distribution
  • Design Requirements for Data Center Applications
  • Overcome fundamental DISTRIBUTION issues
  • Robust, reliable, user-friendly
  • Non-intrusive, non-disruptive, no downtime to
    install
  • Dynamically adjust to the changing Data Center
  • Modular, scalable, reconfigurable

26
Room Scale Intelligent Cooling (RSIC) Example
Previous example with installation of Room Scale
Intelligent Cooling. Location and quantity of
underfloor air movers based on CFD and rack density.
Hotspots eliminated. Margin available to move
setpoints. Possibility to shut down CRACs.
27
Room Scale Intelligent Cooling (RSIC) Example
Room Scale Intelligent Cooling example with CRAC
failures. 9% CFM margin. Hot spots now in check.
28
Comparison
[Side-by-side CFD views: Traditional vs. Room Scale Intelligent Cooling]
29
Cooling Resource Manager
  • Easy-to-use Dashboard
  • Real-time environmentals
  • Zone of Influence
  • Cooling management
  • CRAC shedding (energy saving)
  • CRAC failure management
  • Web-enabled
  • 3 tiers of redundancy
  • Trending, alarms, history (see the sketch below)
  • Interfaces to BACnet, LonWorks, SNMP and others
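A minimal sketch of the kind of threshold-alarm and trending logic such
a dashboard implies. This is not AdaptivCool's implementation; the class
and thresholds are hypothetical, and the SNMP/BACnet hookup is left as a
comment.

```python
# Hypothetical alarm + trending skeleton for an environmental monitor.
from collections import deque
from statistics import mean

HISTORY = 96                     # e.g. 24h of 15-minute samples

class SensorTrend:
    def __init__(self, name: str, alarm_f: float):
        self.name = name
        self.alarm_f = alarm_f
        self.samples: deque[float] = deque(maxlen=HISTORY)

    def record(self, temp_f: float) -> None:
        """Store a sample and raise an alarm if it breaches the threshold."""
        self.samples.append(temp_f)
        if temp_f > self.alarm_f:
            # A real system would emit an SNMP trap or BACnet event here.
            print(f"ALARM {self.name}: {temp_f:.1f}F > {self.alarm_f:.1f}F")

    def trend(self) -> float:
        """Rolling average over the retained history."""
        return mean(self.samples)

s = SensorTrend("cold-aisle-3", alarm_f=80.0)
for reading in (72.4, 73.1, 81.2, 74.0):
    s.record(reading)
print(f"{s.name} average: {s.trend():.1f}F")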

30
Results - Brokerage Firm
  • Client Requirements
  • Reduce server temperatures
  • SNMP alarms
  • Planning for expansion
  • Reduced energy costs
  • Allow CRAC maintenance
  • Solution
  • Thermal analysis
  • AdaptivCool RSIC
  • Benefits
  • Average rack top temp reduced 8ºF
  • 24% lower energy use
  • CRAC maintenance possible
  • Integrated SNMP alarms and trending

"The site is miles ahead of where we were before
AdaptivCool started the project."
31
Results - Electronics Firm
  • Client Requirements
  • Eliminate server overheating
  • Allow for expansion
  • Reduced energy costs
  • Allow CRAC maintenance
  • Solution
  • Thermal analysis
  • AdaptivCool RSIC
  • Benefits
  • Average rack top temp reduced 4ºF
  • Server Hotspots eliminated
  • 24% lower energy use
  • CRACs balanced, maintenance possible

"Our main data center with the AdaptivCool
solution installed has shown significant
improvements in cooling. Everything AdaptivCool
promised us came true."
32
Results - Co-Location Site
Before: 69ºF CRAC setpoints
  • Client Requirements
  • Eliminate hotspots
  • Recover lost capacity
  • Support blades in co-location
  • Real-time monitoring
  • Solution
  • Thermal analysis
  • AdaptivCool RSIC
  • Benefits
  • Hotspots eliminated
  • CRAC setpoints raised 1-3ºF
  • Trending and alarms
  • Improved redundancy
  • 22% lower energy use

After: 70-72ºF CRAC setpoints
"With AdaptivCool managing our cooling, we now
support (and charge for) 200 W/sq ft customers in
a space designed for only 100 W/sq ft."
33
Results - MFG Firm
  • Client Requirements
  • Eliminate hotspots
  • Manage summer heat
  • Reduce energy usage
  • Solution
  • Thermal analysis
  • AdaptivCool RSIC
  • Benefits
  • Hotspots eliminated
  • CRAC setpoints raised 1-2ºF
  • Thermal margin improved
  • 18% lower energy use

"We calculated we are saving 18% on cooling costs
and will have less to worry about in the warmer
months."
34
SaaS Model
  • RSIC offered as an SLA
  • Remote Monitoring
  • Monthly Summary Reports - Thermal, Energy, Alarms,
    etc.
  • Quarterly updates - Capacity, Trends, Critical
    Items
  • CFD Updates
  • On-Call Experts

35
Services
  • New Site and Expansion Consulting
  • Optimal design considerations
  • Close Coupled, In-Row, Down Flow, Up Flow
  • Phased implementation roadmaps
  • Minimized capex
  • Minimized opex

36
Do IT Yourself Solution
  • 30-Minute HotSpot Solution
  • Off the shelf, self-install
  • Same 1200 CFM per underfloor unit - supports 10 kW racks
  • 4 Models
  • HT-500 model upgradeable to RSIC

37
Thermal Peace of Mind
  • Solve hot spots at the room, row and rack level
  • Better manage the cool air you have - where,
    when, and in the right amount
  • Cooling energy savings of 20-30%
  • 40% or more IT equipment in the same facility
  • Monitoring and on-call expertise
  • Extend the useful life of existing Data Centers

www.AdaptivCool.com
38
  • Thank You

www.AdaptivCool.com