Parallel computing - PowerPoint PPT Presentation

About This Presentation
Title:

Parallel computing

Description:

PARALLEL COMPUTING Reference https://computing.llnl.gov/tutorials/parallel_comp/ Parallel Programming in C with MPI and OpenMP – PowerPoint PPT presentation


Transcript and Presenter's Notes

Title: Parallel computing


1
Parallel computing

Reference: https://computing.llnl.gov/tutorials/parallel_comp/
Parallel Programming in C with MPI and OpenMP
2
TextBook
  • Introduction to Parallel Computing
  • George Karypis, Vipin Kumar
  • 2003, Pearson

3
References
  • Parallel Programming in C with MPI and OpenMP,
    McGraw-Hill, 2004
  • Patterns for Parallel Programming,
    Addison-Wesley, 2004
  • Parallel Programming with MPI, Peter S. Pacheco,
    Morgan Kaufmann Publishers, 1997
  • OpenMP Specification, www.openmp.org
  • www.top500.org
  • Parallel Computing
  • International Journal of Parallel Programming

4
von Neumann Architecture
Comprised of four main components: Memory, Control Unit,
Arithmetic Logic Unit, and Input/Output.
5
Serial computing
Traditionally, software has been written for serial
computation:
  • To be run on a single computer having a single Central
    Processing Unit (CPU)
  • A problem is broken into a discrete series of instructions
  • Instructions are executed one after another
  • Only one instruction may execute at any moment in time
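For concreteness, a serial version of a simple array computation in C looks like the sketch below; the array names and size are illustrative, and every element is handled by a single instruction stream, one after another.

    #include <stdio.h>

    #define N 100

    int main(void)
    {
        double a[N], b[N], c[N];

        /* initialize the inputs */
        for (int i = 0; i < N; i++) {
            b[i] = i;
            c[i] = 2.0 * i;
        }

        /* serial computation: one CPU, one element at a time */
        for (int i = 0; i < N; i++)
            a[i] = b[i] + c[i];

        printf("a[99] = %f\n", a[99]);
        return 0;
    }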
6
Parallel computing
In the simplest sense, parallel computing is the simultaneous
use of multiple compute resources to solve a computational
problem:
  • To be run using multiple CPUs
  • A problem is broken into discrete parts that can be solved
    concurrently
  • Each part is further broken down to a series of
    instructions
  • Instructions from each part execute simultaneously on
    different CPUs
7
Parallel computing
  • The compute resources can include
  • A single computer with multiple processors
  • An arbitrary number of computers connected by a
    network
  • A combination of both.

8
Parallel computing
  • The computational problem usually demonstrates
    characteristics such as the ability to
  • Be broken apart into discrete pieces of work that
    can be solved simultaneously
  • Execute multiple program instructions at any
    moment in time
  • Be solved in less time with multiple compute
    resources than with a single compute resource.

9
The Universe is Parallel
  • Parallel computing is an evolution of serial
    computing that attempts to emulate what has
    always been the state of affairs in the natural
    world
  • many complex, interrelated events happening at
    the same time, yet within a sequence.
  • For example
  • Galaxy formation
  • Planetary movement
  • Weather and ocean patterns
  • Tectonic plate drift
  • Rush hour traffic
  • Automobile assembly line
  • Building a space shuttle
  • Ordering a hamburger at the drive through.

10
Uses for Parallel Computing
  • Historically, parallel computing has been
    considered to be "the high end of computing", and
    has been used to model difficult scientific and
    engineering problems found in the real world.
  • Some examples
  • Atmosphere, Earth, Environment
  • Physics - applied, nuclear, particle, condensed
    matter, high pressure, fusion, photonics
  • Bioscience, Biotechnology, Genetics
  • Chemistry, Molecular Sciences
  • Geology, Seismology
  • Mechanical Engineering - from prosthetics to
    spacecraft
  • Electrical Engineering, Circuit Design,
    Microelectronics
  • Computer Science, Mathematics

11
Uses for Parallel Computing
  • Today, commercial applications provide an equal
    or greater driving force in the development of
    faster computers. These applications require the
    processing of large amounts of data in
    sophisticated ways.
  • For example
  • Databases, data mining
  • Oil exploration
  • Web search engines, web based business services
  • Medical imaging and diagnosis
  • Pharmaceutical design
  • Management of national and multi-national
    corporations
  • Financial and economic modeling
  • Advanced graphics and virtual reality,
    particularly in the entertainment industry
  • Networked video and multi-media technologies
  • Collaborative work environments

12
Why Use Parallel Computing?
  • Main Reasons
  • Save time and/or money
  • Solve larger problems
  • Provide concurrency
  • Use of non-local resources
  • Limits to serial computing

13
Why Use Parallel Computing?
  • Save time and/or money
  • In theory, throwing more resources at a task
    will shorten its time to completion, with
    potential cost savings.
  • Parallel clusters can be built from cheap,
    commodity components.

14
Why Use Parallel Computing?
  • Solve larger problems
  • Many problems are so large and/or complex that it
    is impractical or impossible to solve them on a
    single computer, especially given limited
    computer memory.
  • For example
  • "Grand Challenge" (en.wikipedia.org/wiki/Grand_Cha
    llenge) problems requiring PetaFLOPS and
    PetaBytes of computing resources.
  • Web search engines/databases processing millions
    of transactions per second

15
Why Use Parallel Computing?
  • Provide concurrency
  • A single compute resource can only do one thing
    at a time.
  • Multiple computing resources can be doing many
    things simultaneously.
  • For example, the Access Grid (www.accessgrid.org)
    provides a global collaboration network where
    people from around the world can meet and conduct
    work "virtually".

16
Why Use Parallel Computing?
  • Use of non-local resources
  • Using compute resources on a wide area network,
    or even the Internet when local compute resources
    are scarce.
  • For example
  • SETI@home (setiathome.berkeley.edu) uses over
    330,000 computers for a compute power of over 528
    TeraFLOPS (as of August 4, 2008)
  • Folding@home (folding.stanford.edu) uses over
    340,000 computers for a compute power of 4.2
    PetaFLOPS (as of November 4, 2008)

One petaflops is equal to 1,000 teraflops, or
1,000,000,000,000,000 FLOPS.
17
Why Use Parallel Computing?
  • Limits to serial computing
  • Both physical and practical reasons pose
    significant constraints to simply building ever
    faster serial computers
  • Transmission speeds
  • Limits to miniaturization
  • Economic limitations

18
Why Use Parallel Computing?
  • Current computer architectures are increasingly
    relying upon hardware level parallelism to
    improve performance
  • Multiple execution units
  • Pipelined instructions
  • Multi-core

19
Who and What ?
  • Top500.org provides statistics on parallel
    computing users
  • the charts below are just a sample.
  • Some things to note: sectors may overlap
  • for example, research may be classified research.
    Respondents have to choose between the two.
  • "Not Specified" is by far the largest application
    - probably means multiple applications.

20
The Future
  • During the past 20 years, the trends indicated by
    ever faster networks, distributed systems, and
    multi-processor computer architectures (even at
    the desktop level) clearly show that parallelism
    is the future of computing.

21
Modern Parallel Computers
  • Caltech's Cosmic Cube (Seitz and Fox)
  • Commercial copy-cats
  • nCUBE Corporation
  • Intel's Supercomputer Systems Division
  • Lots more
  • Thinking Machines Corporation

22
Modern Parallel Computers
  • Cray Jaguar (224,162 cores, 1.75 PFlops)
  • IBM Roadrunner (122,400 cores, 1.04 PFlops)
  • Cray Kraken XT5 (98,928 cores, 831 TFlops)
  • IBM JUGENE (294,912 cores, 825 TFlops)
  • NUDT Tianhe-1 (71,680 cores, 563 TFlops)
  • (TOP500 list, 2009/11, top 5)
  • IBM 1350 (2048 cores, 23 TFlops)

23
Seeking Concurrency
  • Data dependence graphs
  • Data parallelism
  • Functional parallelism
  • Pipelining

24
Data Dependence Graph
  • Directed graph
  • Vertices = tasks
  • Edges = dependences

25
Data Parallelism
  • Independent tasks apply same operation to
    different elements of a data set
  • Okay to perform operations concurrently

for i ← 0 to 99 do a[i] ← b[i] + c[i] endfor
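A minimal OpenMP rendering of the loop above in C: the pragma splits the 100 iterations across the threads of a shared-memory team, each thread applying the same operation to a different chunk of elements (array contents are illustrative; compile with an OpenMP flag such as -fopenmp).

    #include <omp.h>
    #include <stdio.h>

    #define N 100

    int main(void)
    {
        double a[N], b[N], c[N];

        for (int i = 0; i < N; i++) { b[i] = i; c[i] = 2.0 * i; }

        /* each thread handles a different range of i;
           the same operation is applied to different elements */
        #pragma omp parallel for
        for (int i = 0; i < N; i++)
            a[i] = b[i] + c[i];

        printf("a[99] = %f\n", a[99]);
        return 0;
    }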
26
Functional Parallelism
  • Independent tasks apply different operations to
    different data elements
  • First and second statements
  • Third and fourth statements
  1. a ← 2
  2. b ← 3
  3. m ← (a + b) / 2
  4. s ← (a² + b²) / 2
  5. v ← s - m²
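One way to express this functional parallelism in C is with OpenMP sections: statements 3 and 4 are independent once a and b are set, so they may run concurrently. This is a sketch; the sections construct is only one of several ways to express it.

    #include <omp.h>
    #include <stdio.h>

    int main(void)
    {
        double a, b, m, s, v;

        a = 2;              /* statement 1 */
        b = 3;              /* statement 2 */

        /* statements 3 and 4 are independent, so they may run in parallel */
        #pragma omp parallel sections
        {
            #pragma omp section
            m = (a + b) / 2;            /* statement 3 */

            #pragma omp section
            s = (a * a + b * b) / 2;    /* statement 4 */
        }

        v = s - m * m;      /* statement 5 depends on both results */
        printf("v = %f\n", v);
        return 0;
    }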

27
Pipelining
  • Divide a process into stages
  • Produce several items simultaneously

28
Partial Sums Pipeline
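The pipeline figure itself is not in the transcript. The sketch below conveys the idea with MPI point-to-point calls: each process is one stage, receives the partial sum from its predecessor, adds its own value, and forwards the new partial sum to its successor. The per-process value is an illustrative assumption.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, size;
        double my_value, sum = 0.0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        my_value = rank + 1;          /* illustrative per-stage value */

        /* stage i: receive the partial sum produced by stage i-1 */
        if (rank > 0)
            MPI_Recv(&sum, 1, MPI_DOUBLE, rank - 1, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        sum += my_value;              /* this stage's contribution */

        /* forward the new partial sum to stage i+1 */
        if (rank < size - 1)
            MPI_Send(&sum, 1, MPI_DOUBLE, rank + 1, 0, MPI_COMM_WORLD);
        else
            printf("total after %d stages = %f\n", size, sum);

        MPI_Finalize();
        return 0;
    }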
29
Data Clustering
  • Data mining
  • looking for meaningful patterns in large data
    sets
  • Data clustering
  • organizing a data set into clusters of similar
    items
  • Data clustering can speed retrieval of related
    items

30
Document Vectors
(Figure: document vectors plotted against the terms "Moon" and
"Rocket" for The Geology of Moon Rocks, The Story of Apollo 11,
A Biography of Jules Verne, and Alice in Wonderland)
31
Document Clustering
32
Clustering Algorithm
  • Compute document vectors
  • Choose initial cluster centers
  • Repeat
  • Compute performance function
  • Adjust centers
  • Until function value converges or max iterations
    have elapsed
  • Output cluster centers
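A serial C sketch of this clustering loop, assuming the document vectors are already computed and using the total squared distance to the nearest centre as the performance function. The sizes, data, initialisation, and convergence test are illustrative assumptions, not the presentation's code.

    #include <math.h>
    #include <stdio.h>
    #include <string.h>

    #define N_DOCS   8      /* illustrative sizes */
    #define N_DIMS   2
    #define K        2
    #define MAX_ITER 100

    static double sq_dist(const double *x, const double *y)
    {
        double d = 0.0;
        for (int j = 0; j < N_DIMS; j++)
            d += (x[j] - y[j]) * (x[j] - y[j]);
        return d;
    }

    int main(void)
    {
        double vec[N_DOCS][N_DIMS] = {
            {0.9, 0.1}, {0.8, 0.2}, {0.7, 0.1}, {0.9, 0.3},
            {0.1, 0.8}, {0.2, 0.9}, {0.1, 0.7}, {0.3, 0.9}
        };
        double center[K][N_DIMS];
        double prev = 1e30;

        /* choose initial cluster centres (here: the first K vectors) */
        memcpy(center, vec, sizeof center);

        for (int iter = 0; iter < MAX_ITER; iter++) {
            double perf = 0.0;
            double sum[K][N_DIMS] = {{0}};
            int count[K] = {0};

            /* compute performance function: distance of each vector
               to its closest centre (data-parallel over documents) */
            for (int i = 0; i < N_DOCS; i++) {
                int best = 0;
                double bestd = sq_dist(vec[i], center[0]);
                for (int k = 1; k < K; k++) {
                    double d = sq_dist(vec[i], center[k]);
                    if (d < bestd) { bestd = d; best = k; }
                }
                perf += bestd;
                count[best]++;
                for (int j = 0; j < N_DIMS; j++)
                    sum[best][j] += vec[i][j];
            }

            /* adjust centres toward the mean of their members */
            for (int k = 0; k < K; k++)
                if (count[k] > 0)
                    for (int j = 0; j < N_DIMS; j++)
                        center[k][j] = sum[k][j] / count[k];

            if (fabs(prev - perf) < 1e-9)   /* function value converged */
                break;
            prev = perf;
        }

        /* output cluster centres */
        for (int k = 0; k < K; k++)
            printf("centre %d: (%f, %f)\n", k, center[k][0], center[k][1]);

        return 0;
    }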

33
Data Parallelism Opportunities
  • Operation being applied to a data set
  • Examples
  • Generating document vectors
  • Finding closest center to each vector
  • Picking initial values of cluster centers

34
Functional Parallelism Opportunities
  • Draw data dependence diagram
  • Look for sets of nodes such that there are no
    paths from one node to another

35
Data Dependence Diagram
Build document vectors
Choose cluster centers
Compute function value
Adjust cluster centers
Output cluster centers
36
A graphical representation of Amdahl's law.
The speedup of a program using multiple
processors in parallel computing is limited by
the sequential fraction of the program. For
example, if 95% of the program can be
parallelized, the theoretical maximum speedup
using parallel computing would be 20x, as shown in
the diagram, no matter how many processors are
used.
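Stated as a formula, in LaTeX notation, with parallel fraction P and N processors, the slide's example works out as follows:

    S(N) = \frac{1}{(1 - P) + P/N},
    \qquad
    \lim_{N \to \infty} S(N) = \frac{1}{1 - P} = \frac{1}{1 - 0.95} = 20

With P = 0.95 the serial 5% alone costs 1/20 of the original run time, so no number of processors can push the speedup past 20.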
37
Programming Parallel Computers
  • Extend compilers
  • translate sequential programs into parallel
    programs
  • Extend languages
  • add parallel operations
  • Add parallel language layer on top of sequential
    language
  • Define totally new parallel language and compiler
    system

38
Strategy 1 Extend Compilers
  • Parallelizing compiler
  • Detect parallelism in sequential program
  • Produce parallel executable program
  • Focus on making Fortran programs parallel

39
Extend Compilers (cont.)
  • Advantages
  • Can leverage millions of lines of existing serial
    programs
  • Saves time and labor
  • Requires no retraining of programmers
  • Sequential programming easier than parallel
    programming

40
Extend Compilers (cont.)
  • Disadvantages
  • Parallelism may be irretrievably lost when
    programs are written in sequential languages
  • Performance of parallelizing compilers on a broad
    range of applications is still up in the air

41
Extend Language
  • Add functions to a sequential language
  • Create and terminate processes
  • Synchronize processes
  • Allow processes to communicate

42
Extend Language (cont.)
  • Advantages
  • Easiest, quickest, and least expensive
  • Allows existing compiler technology to be
    leveraged
  • New libraries can be ready soon after new
    parallel computers are available

43
Extend Language (cont.)
  • Disadvantages
  • Lack of compiler support to catch errors
  • Easy to write programs that are difficult to debug

44
Add a Parallel Programming Layer
  • Lower layer
  • Core of computation
  • Process manipulates its portion of data to
    produce its portion of result
  • Upper layer
  • Creation and synchronization of processes
  • Partitioning of data among processes
  • A few research prototypes have been built based
    on these principles

45
Create a Parallel Language
  • Develop a parallel language from scratch
  • occam is an example
  • Erlang
  • Add parallel constructs to an existing language
  • Fortran 90
  • High Performance Fortran
  • C

46
New Parallel Languages (cont.)
  • Advantages
  • Allows programmer to communicate parallelism to
    compiler
  • Improves probability that executable will achieve
    high performance
  • Disadvantages
  • Requires development of new compilers
  • New languages may not become standards
  • Programmer resistance

47
Current Status
  • Low-level approach is most popular
  • Augment existing language with low-level parallel
    constructs (by function call)
  • MPI and OpenMP are examples
  • Advantages of low-level approach
  • Efficiency
  • Portability
  • Disadvantage: more difficult to program and debug

48
OpenMP
  • An API for parallel programming on shared-memory systems

https://computing.llnl.gov/tutorials/parallel_comp/
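A minimal OpenMP program in C, the standard introductory example rather than anything from the tutorial itself: one team of threads shares the process's memory, and every thread executes the parallel region.

    #include <omp.h>
    #include <stdio.h>

    int main(void)
    {
        /* fork a team of threads; each runs the block below */
        #pragma omp parallel
        {
            int id = omp_get_thread_num();
            int nthreads = omp_get_num_threads();
            printf("Hello from thread %d of %d\n", id, nthreads);
        }   /* implicit barrier and join here */
        return 0;
    }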
49
MPI
MPI is a standard for message passing on distributed-memory
systems; implementations include LAM and MPICH.
https://computing.llnl.gov/tutorials/parallel_comp/
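The MPI counterpart, again the standard introductory example: each process has its own private address space and learns its rank from the library (build with mpicc, launch with mpirun).

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        int rank, size;

        MPI_Init(&argc, &argv);                  /* start the MPI runtime */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);    /* this process's id     */
        MPI_Comm_size(MPI_COMM_WORLD, &size);    /* number of processes   */

        printf("Hello from process %d of %d\n", rank, size);

        MPI_Finalize();                          /* shut the runtime down */
        return 0;
    }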
50
(No Transcript)
51
Organization and Contents of this Course
  • Fundamentals This part of the class covers
  • basic parallel platforms
  • principles of algorithm design
  • group communication primitives, and
  • analytical modeling techniques.

52
Organization and Contents of this Course
  • Parallel Programming This part of the class
    deals with programming using
  • message passing libraries and
  • threads.
  • Parallel Algorithms This part of the class
    covers basic algorithms for
  • matrix computations,
  • graphs, sorting,
  • discrete optimization, and
  • dynamic programming.

53
Course Outline
  • Introduction to Parallel Computing
  • Parallel Programming Platforms
  • Principles of Parallel Algorithm Design
  • Basic Communication Operations
  • Analytical Modeling of Parallel Programs
  • Programming Using the Message-Passing Paradigm
    (MPI)
  • ????(or ????)

54
Course Outline (cont.)
  • Programming Shared Address Space Platforms
    (OpenMP)
  • Dense Matrix Algorithms
  • Sorting Algorithms
  • Graph Algorithms
  • Hadoop ???????
  • Map-Reduce ??????
  • Map-Reduce ?????????
  • ???? (or????)

55
??
  • ????(??,??,??,??) 40
  • ????(or ????) 30
  • ????(or????) 30

56
P2P system
  • Peer-to-peer system
  • Client acts as a server
  • Share data
  • BitTorrent (BT)

57
http://www.fidis.net/typo3temp/tx_rlmpofficelib_0c97e8a6cd.png
58
Grid Computing
  • Distributed computing
  • Ethernet/ Internet
  • Volunteer computing networks
  • Software-as-a-service (SaaS)
  • Software that is owned, delivered and managed
    remotely by one or more providers.

59
http://www.csa.com/discoveryguides/grid/images/gridcomp.gif
60
Cloud Computing
  • Distributed computing
  • Web services

61
Cloud Computing
http://lonewolflibrarian.files.wordpress.com/2009/02/cloud-computing-kitchen-sink.jpg
62
Web 2.0
  • Web 2.0 sites are built around user participation and
    user-generated content, and expose their services to other
    applications through application programming interfaces
    (Application Programming Interface, API).

63
Software as a Service (SaaS)
  • SaaS delivers applications as a service over the network.
  • Software is hosted and operated by service providers and
    accessed by customers on demand (on-demand), rather than
    purchased, installed, and maintained locally.

64
Platform as a Service (PaaS)
  • PaaS extends the SaaS model from applications to the
    development platform itself.
  • It provides an environment for building, deploying, and
    managing the whole life cycle of Web applications over the
    Internet, without buying and managing the underlying IT
    infrastructure; such offerings are sometimes called
    "Cloudware".

65
Infrastructure as a Service (IaaS)
  • Originally Hardware as a Service (HaaS): computing
    infrastructure is delivered as a service, typically through
    platform virtualization.
  • Resources such as servers, network equipment, memory (RAM),
    disk, and CPU are provided and billed as services rather
    than purchased outright.

66
Summary
  • Cloud computing combines service models such as HaaS, PaaS,
    SaaS, and Web 2.0 with technologies such as MapReduce, Ajax,
    and virtualization, delivering computing to users over the
    Internet.

67-87
(Chinese-language news commentary on whether the Internet/cloud
boom is another bubble; the slide text did not survive the
transcript encoding. Recoverable references include: Web Only
(2009/03), ZDNet (2010/02/08), Yahoo News (2011/02/11), and Udn
News (2011/02/21); Gartner analysts Phillip Sargeant and Stephen
Prentice on IaaS (Infrastructure-as-a-Service) and PaaS
(Platform-as-a-Service); Google's Eric Schmidt interviewed in
Bilanz; a Wall Street Journal report on Twitter's valuation;
Facebook and Facebook Connect, MySpace, YouTube, Zynga
(FarmVille), Groupon, Visa, Ford; Netscape's 1995 IPO and the
dot-com era; and Broadsight's Alan Patrick.)