Designing a cluster for geophysical fluid dynamics applications

Transcript and Presenter's Notes



1
Designing a cluster for geophysical fluid dynamics applications
  • Göran Broström
  • Dep. of Oceanography, Earth Science Centre,
    Göteborg University.

2
Our cluster (me and Johan Nilsson, Dep. of
Meteorology, Stockholm University)
  • Grant from the Knut and Alice Wallenberg
    Foundation (1.4 MSEK)
  • 48-CPU cluster
  • Intel P4 2.26 GHz
  • 500 MB 800 MHz RDRAM
  • SCI cards
  • Delivered by South Pole
  • Run by NSC (thanks Niclas, Peter)

3
What we study
4
Geophysical fluid dynamics
  • Oceanography
  • Meteorology
  • Climate dynamics

5
Thin fluid layers: large aspect ratio
6
Highly turbulent: Gulf Stream, Re ≈ 10¹²
7
Large variety of scales
Parameterizations are important in geophysical
fluid dynamics
8
Timescales
  • Atmospheric low pressures: 10 days
  • Seasonal/annual cycles: 0.1-1 years
  • Ocean eddies: 0.1-1 years
  • El Niño: 2-5 years
  • North Atlantic Oscillation: 5-50 years
  • Turnover time of the atmosphere: 10 years
  • Anthropogenically forced climate change: 100 years
  • Turnover time of the ocean: 4,000 years
  • Glacial-interglacial timescales: 10,000-200,000
    years (step-count sketch below)

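A note on what this spread of timescales means computationally: the sketch below converts each phenomenon into a rough number of model time steps, assuming a hypothetical 1-hour time step (an illustrative value only, not one taken from these slides).

    # Rough step counts for the timescales listed above,
    # assuming a hypothetical 1-hour model time step (illustrative only).
    HOURS_PER_YEAR = 24 * 365

    phenomena_years = {
        "Atmospheric low pressure": 10 / 365,   # about 10 days
        "Ocean eddies": 1.0,
        "El Nino": 5.0,
        "North Atlantic Oscillation": 50.0,
        "Ocean turnover": 4000.0,
        "Glacial-interglacial cycle": 200000.0,
    }

    for name, years in phenomena_years.items():
        steps = years * HOURS_PER_YEAR
        print(f"{name:28s} ~{steps:,.0f} time steps")

The longest timescales imply billions of time steps per simulation, which is the basic reason cluster computing is needed at all.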
9
Some examples of atmospheric and oceanic low
pressures.
10
Timescales
  • Atmospheric low pressures: 10 days
  • Seasonal/annual cycles: 0.1-1 years
  • Ocean eddies: 0.1-1 years
  • El Niño: 2-5 years
  • North Atlantic Oscillation: 5-50 years
  • Turnover time of the atmosphere: 10 years
  • Anthropogenically forced climate change: 100 years
  • Turnover time of the ocean: 4,000 years
  • Glacial-interglacial timescales: 10,000-200,000
    years

11
Normal state
12
Initial ENSO state
13
The ENSO state
14
The ENSO state
15
Timescales
  • Atmospheric low pressures: 10 days
  • Seasonal/annual cycles: 0.1-1 years
  • Ocean eddies: 0.1-1 years
  • El Niño: 2-5 years
  • North Atlantic Oscillation: 5-50 years
  • Turnover time of the atmosphere: 10 years
  • Anthropogenically forced climate change: 100 years
  • Turnover time of the ocean: 4,000 years
  • Glacial-interglacial timescales: 10,000-200,000
    years

16
Positive NAO phase
Negative NAO phase
17
(No Transcript)
18
Positive NAO phase
Negative NAO phase
19
(No Transcript)
20
Timescales
  • Atmospheric low pressures: 10 days
  • Seasonal/annual cycles: 0.1-1 years
  • Ocean eddies: 0.1-1 years
  • El Niño: 2-5 years
  • North Atlantic Oscillation: 5-50 years
  • Turnover time of the atmosphere: 10 years
  • Anthropogenically forced climate change: 100 years
  • Turnover time of the ocean: 4,000 years
  • Glacial-interglacial timescales: 10,000-200,000
    years

21
Temperature in the North Atlantic
22
Timescales
  • Atmospheric low pressures: 10 days
  • Seasonal/annual cycles: 0.1-1 years
  • Ocean eddies: 0.1-1 years
  • El Niño: 2-5 years
  • North Atlantic Oscillation: 5-50 years
  • Turnover time of the atmosphere: 10 years
  • Anthropogenically forced climate change: 100 years
  • Turnover time of the ocean: 4,000 years
  • Glacial-interglacial timescales: 10,000-200,000
    years

23
Ice coverage, sea level
24
What model will we use?
25
MIT General circulation model
26
MIT General circulation model
  • General fluid dynamics solver
  • Atmospheric and ocean physics
  • Sophisticated mixing schemes
  • Biogeochemical modules
  • Efficient solvers
  • Sophisticated coordinate system
  • Automatic adjoint schemes
  • Data assimilation routines
  • Finite difference scheme (see the sketch below)
  • F77 code
  • Portable

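To make the "finite difference scheme" bullet concrete for readers outside the field, here is a minimal sketch of an explicit finite-difference step for 1-D diffusion. It is not MITgcm code, and all values are made up for illustration.

    # Minimal finite-difference illustration (not MITgcm code):
    # explicit time stepping of 1-D diffusion, dT/dt = kappa * d2T/dx2,
    # on a periodic domain. All values are illustrative.
    import numpy as np

    nx, dx, dt, kappa = 100, 1.0e3, 3600.0, 1.0e2
    T = np.zeros(nx)
    T[nx // 2] = 1.0                 # initial temperature anomaly

    for _ in range(1000):
        # centred second difference; np.roll gives periodic boundaries
        lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
        T = T + dt * kappa * lap     # forward-Euler update

    print(T.max())

MITgcm applies the same idea in three dimensions with far more sophisticated stencils, time stepping, and parallel domain decomposition.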
27
MIT General circulation model
Spherical coordinates
Cubed sphere
28
MIT General circulation model
  • General fluid dynamics solver
  • Atmospheric and ocean physics
  • Sophisticated mixing schemes
  • Biogeochemical modules
  • Efficient solvers
  • Sophisticated coordinate system
  • Automatic adjoint schemes
  • Data assimilation routines
  • Finite difference scheme
  • F77 code
  • Portable

29
MIT General circulation model
30
MIT General circulation model
31
MIT General circulation model
32
MIT General circulation model
33
MIT General circulation model
34
MIT General circulation model
35
MIT General circulation model
36
MIT General circulation model
37
Some computational aspects
38
Some tests on INGVAR
  • (32-node AMD 900 MHz cluster)

39
Experiments with 60×60×20 grid points
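One way to read the scaling results for this grid: as the 60×60×20 domain is split over more nodes, the halo (ghost-cell) points exchanged each step grow relative to the interior points each node computes. A rough sketch, assuming a square 2-D horizontal decomposition with a one-point halo (the actual decomposition used is not stated in the slides):

    # Halo-to-interior ratio for a 60x60x20 grid split over N nodes,
    # assuming a square 2-D horizontal decomposition and a one-point halo
    # (assumptions for illustration; not taken from the slides).
    import math

    nx, ny, nz = 60, 60, 20
    for nodes in (1, 4, 9, 16, 36):
        side = int(math.sqrt(nodes))        # square process grid
        sx, sy = nx // side, ny // side     # horizontal subdomain per node
        interior = sx * sy * nz
        halo = 2 * (sx + sy) * nz           # one-point halo around the subdomain
        print(f"{nodes:2d} nodes: {sx}x{sy}x{nz} per node, "
              f"halo/interior = {halo / interior:.2f}")

The ratio grows quickly as the per-node subdomain shrinks, which is why small grids stop scaling beyond a modest number of nodes.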
40
Experiments with 60×60×20 grid points
41
Experiments with 60×60×20 grid points
42
Experiments with 120×120×20 grid points
43
MM5 Regional atmospheric model
44
MM5 Regional atmospheric model
45
MM5 Regional atmospheric model
46
Choosing CPUs, motherboard, memory, and interconnect
47
SPECfp (swim)
48
Run time on different nodes
49
Choosing the interconnect
  • (requires a cluster to test)
  • Based on earlier experience, we use SCI from
    Dolphinics (Scali); a rough communication-cost
    sketch follows below

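For context on why the interconnect matters: the estimate below sketches the per-step halo-exchange cost for a single node. The message size, latency, and bandwidth figures are assumptions chosen only to show the structure of the cost; none of them are measurements from these slides.

    # Per-step halo-exchange cost for one node (all numbers are assumptions).
    sx, sy, nz = 30, 30, 20          # per-node subdomain, e.g. 60x60x20 on 4 nodes
    bytes_per_point = 8              # double precision
    n_fields = 5                     # e.g. u, v, w, T, S exchanged each step

    halo_points = 2 * (sx + sy) * nz
    msg_bytes = halo_points * bytes_per_point * n_fields

    latency_s = 5e-6                 # assumed per-message latency
    bandwidth_Bps = 200e6            # assumed sustained bandwidth
    n_messages = 4                   # one exchange per neighbour, 2-D decomposition

    time_s = n_messages * latency_s + msg_bytes / bandwidth_Bps
    print(f"{msg_bytes / 1e3:.0f} kB per step, "
          f"~{time_s * 1e6:.0f} microseconds in communication")

The smaller the per-node subdomain, the larger this communication time becomes relative to the compute time, which is why the interconnect choice shows up so strongly in the scaling plots.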
50
Our choice
  • Named Otto
  • SCI cards
  • P4 2.26 GHz (single CPU per node)
  • 800 MHz RDRAM (500 MB)
  • Intel motherboards (the only ones available)
  • 48 nodes
  • NSC (nicely in the shadow of Monolith)

51
Otto (P4 2.26 GHz)
52
Scaling
Ingvar (AMD 900 MHz)
Otto (P4 2.26 GHz)
53
Why do we get these kinds of results?
54
Time spent on different subroutines
60×60×20
120×120×20
55
Relative time Otto/Ingvar
56
Some tests on other machines
  • INGVAR: 32 nodes, AMD 900 MHz, SCI
  • Idefix: 16 nodes, dual PIII 1000 MHz, SCI
  • SGI 3800: 96 processors, 500 MHz
  • Otto: 48 nodes, P4 2.26 GHz, SCI
  • MIT LCS (?): 32 nodes, P4 2.26 GHz, Myrinet

57
Comparing different systems (120×120×20 grid points)
58
Comparing different systems (120×120×20 grid points)
59
Comparing different systems (60×60×20 grid points)
60
SCI or Myrinet?
120×120×20 grid points
61
SCI or Myrinet?
(60×60×20 grid points)
120×120×20 grid points
(oops, I used the ifc compiler for these tests)
62
SCI or Myrinet?
(60×60×20 grid points)
120×120×20 grid points
(1066 MHz RDRAM?)
(oops, I used the ifc compiler for these tests)
63
SCI or Myrinet? (time spent in pressure calculation)
(60×60×20 grid points)
120×120×20 grid points
(1066 MHz RDRAM?)
(oops, I used the ifc compiler for these tests)
64
Conclusions
  • Linux clusters are useful in computational
    geophysical fluid dynamics!
  • SCI cards are necessary for parallel runs on more
    than 10 nodes.
  • For efficient parallelization, use more than
    50×50×20 grid points per node!
  • Few users - great for development.
  • Memory limitations: with 48 processors at 500 MB
    each, about 1200×1200×30 grid points is the
    maximum (eddy-resolving North Atlantic, Baltic
    Sea; arithmetic check below).
  • For applications similar to ours, go for SCI
    cards and a CPU with a fast memory bus and fast
    memory!

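A rough consistency check on the memory bullet above: whether a 1200×1200×30 grid fits in 48 nodes with 500 MB each depends on how many 3-D arrays the model keeps per grid point; the field count below is an assumed illustrative number, not one given in the slides.

    # Memory check for the 1200x1200x30 limit (field count is an assumption).
    nx, ny, nz = 1200, 1200, 30
    nodes, mem_per_node_bytes = 48, 500e6
    bytes_per_point = 8              # double precision
    n_fields = 60                    # assumed number of 3-D arrays per grid point

    needed = nx * ny * nz * bytes_per_point * n_fields
    available = nodes * mem_per_node_bytes
    print(f"needed {needed / 1e9:.1f} GB of {available / 1e9:.1f} GB available")

With a few tens of 3-D arrays this grid already uses most of the roughly 24 GB of total memory, which is broadly in line with the stated maximum.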
65
Experiment with low resolution (eddies are
parameterized)
66
(No Transcript)
67
(No Transcript)
68
Experiment with low resolution (eddies are
parameterized)
69
Thanks for your attention