Title: The DETER Testbed
1. The DETER Testbed
- Anthony D. Joseph
- Shankar Sastry
- University of California, Berkeley
2. DETER Testbed Motivation
- Inadequate deployment of security technologies
- Despite 10 years of investment in network security research
- Lack of experimental infrastructure
- Testing and validation occurs mostly at small scales
- Lack of objective test data, traffic, and metrics
- cyber DEfense TEchnology Experimental Research Testbed
- Open to all researchers (government, industrial, academic)
3. DETER Testbed Goals
- Design and construct a testbed for network security experiments
- Attack scenarios/simulators, topology generators, background traffic, monitoring/visualization tools
- Do research on experimental methodology for network security
- Scientifically rigorous frameworks/methodologies
- Do research on network security
- Attack detection and countermeasure tools
4. DETER Testbed Capabilities
- Real systems, Real code, Real attacks!
- 400 PCs with 5 Gigabit Ethernet links each
- Supports all x86 OSes: Windows, Linux, UNIX
- Modeling large-scale wide-area networks
- Nodes can be used as clients, routers, and servers
- Examining the effects of rare events
- Evaluating commercial hardware/prototypes
- Vendor-neutral environment
- Intrusion detection/protection appliances
- Interactions between different vendors' products
- Performance testing under normal and attack conditions
5. Example Experiments
- Slammer: a bandwidth-limited scanning worm
- ICSI and PSU modeled its propagation through the Internet (WORM04 paper)
- Virtual node model of the response of subnets (a generic propagation sketch follows this list)
- 1/64th-scale Internet
- Other experiments
- Collaborative defenses
- Large-scale enterprise network simulation
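Random-scanning worms such as Slammer are commonly modeled as a logistic epidemic over the IPv4 address space. The following is a minimal sketch of that generic model only; the scan rate and vulnerable-population numbers are illustrative assumptions, not figures from the ICSI/PSU experiment.

    # Minimal logistic model of a random-scanning, bandwidth-limited worm.
    # Parameters are illustrative, not taken from the Slammer experiments.

    ADDRESS_SPACE = 2 ** 32      # IPv4 addresses scanned uniformly at random
    VULNERABLE = 75_000          # assumed vulnerable host population
    SCAN_RATE = 4_000            # scans per second per infected host (BW-limited)
    DT = 1.0                     # time step in seconds

    def simulate(seconds: int, initially_infected: int = 1) -> list[float]:
        """Return the number of infected hosts at each time step."""
        infected = float(initially_infected)
        history = [infected]
        for _ in range(seconds):
            # Each scan hits a still-vulnerable host with probability
            # (VULNERABLE - infected) / ADDRESS_SPACE.
            new_infections = (SCAN_RATE * infected * DT *
                              (VULNERABLE - infected) / ADDRESS_SPACE)
            infected = min(VULNERABLE, infected + new_infections)
            history.append(infected)
        return history

    if __name__ == "__main__":
        curve = simulate(seconds=600)
        for t in range(0, 601, 60):
            print(f"t={t:4d}s  infected={curve[t]:8.0f}")

On a testbed, the same growth curve can be reproduced at reduced scale (e.g., a 1/64th-scale address space and vulnerable population) while keeping the per-host scan rate fixed.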
6. [Testbed architecture diagram: users reach the ISI and UCB clusters over the Internet; the clusters are linked by IPsec tunnels over CENIC. Each cluster's control network connects a boss server, user server (user files), download server, node serial line server, power controllers, and experiment PCs through trunked Cisco/Nortel and Foundry/Nortel switches.]
7. DETER Project Timeline
- Funding
- DETER: NSF and DHS HSARPA (Sept 03 - Feb 07)
- DECCOR: NSF CRI program (Jul 05 - Jun 07)
- DIPLOMAT: DHS HSARPA (Sept 06 - )
- DIRECT: AFOSR DURIP program (Apr 06 - Mar 07)
- Experience to date: over 40 projects
- DDoS Attack-Defense, Worm Behavior Characterization, Network Routing Attack-Defense
- Security course support at UCB, commercial devices
- DHS cybersecurity 2006 exercise
- Working with Cornell to federate with their testbed
- Interesting latency challenges
- Also Utah and Vanderbilt testbeds
DETER Community Workshop: August 6 - 7, 2007 (before USENIX Tech Conf), Boston, MA
8. DETER Testbed Software
- Extended Utah Emulab control plane software
- Experiment creation GUI and security features
- Experimental node OS support
- RedHat Linux 7.3, FreeBSD 4.9, or Windows XP
- Users can load arbitrary code; in fact, users have root access to all allocated nodes!
- No direct IP path into the experimental network
- Encrypted tunnels across the Internet (SSL/SSH/IPsec); a rough access sketch follows this list
- Secure process replaces the OS after each experiment
- Optional disk scrub after experiments
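As a rough illustration of what tunneled access to an isolated experimental network can look like (the hostnames, port, and username below are hypothetical placeholders, not DETER's actual configuration), a user might forward a local port through a login gateway with SSH:

    # Hypothetical example of reaching an isolated experimental network over an
    # encrypted SSH tunnel; hostnames, ports, and user names are placeholders.
    import subprocess

    GATEWAY = "users.example-testbed.net"    # assumed user-facing login server
    LOCAL_PORT = 8443                        # local end of the tunnel
    TARGET = "boss.example-testbed.net:443"  # assumed internal web interface

    def open_tunnel(username: str) -> subprocess.Popen:
        """Forward LOCAL_PORT to TARGET through the gateway with `ssh -L`."""
        return subprocess.Popen([
            "ssh", "-N",                      # no remote command, tunnel only
            "-L", f"{LOCAL_PORT}:{TARGET}",
            f"{username}@{GATEWAY}",
        ])

    if __name__ == "__main__":
        tunnel = open_tunnel("alice")
        print(f"Browse https://localhost:{LOCAL_PORT}/ while the tunnel is up.")
        tunnel.wait()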
9. Upcoming Software Capabilities
- Reusable library of realistic, rigorous, reproducible, impartial tests (Archived Experiments)
- For assessing attack impact/defense effectiveness
- Test data, test configurations, analysis software, and experiment automation tools
- Usage examples and methodologies (WorkBench)
- Test selection recommendations
- Test cases, results, and benchmarks
10. DETER Clusters
[Photos of the UCB cluster]
Open to the community: request an account at http://www.deterlab.net/
11. Related Effort with OSD/NII
- GIG context: vast networks, people and technical systems, and embedded systems
- Insufficient analytical methods for large, complex systems limit sensor, data, and network capabilities for the Tactical Edge and Warfighter Assurance
- NSF System of Networked Embedded Devices workshop (10/05)
- The few successful distributed systems spent 50-75% of their development budget on debugging, testing, and validation
- Solving the Analytic Gap: Advanced Mathematics for Scale and Complexity (with Kirstie Bellman, Aerospace Corp)
- Map DoD operational deficits to potentially important mathematical R&D problems
- Identify new approaches for evaluating the scalability of methods
- Provide a better basis for decision makers to believe analysis claims
- Three driving problems: testbed validation, detecting anomalous traffic flows, DoD-COTS interactions
12. Approach for Closing the Gap (2006)
- A study team led by UC Berkeley and The Aerospace Corporation, with a number of participants, focused on two major issues
- DoD scale problems
- Where is DoD hardest hit by lacking sufficient analytic methods for very large systems? What problems are not being addressed? How does DoD currently work around problems of scale?
- Advanced mathematics for problems in scale and complexity
- Where should we focus research in order to develop better methods to:
- Design, control, and analyze large, complex systems
- Reason about the limitations of mathematical and computational approaches
- Evaluate and compare computational methods and certify tools
13. 2006 Accomplishments
- Discussed analytic gap study goals and content with a large number of groups; also leveraged existing conferences and workshops
- Set up eager receivers for study results at DARPA, NSF, and the National Academies, and received interest from NSA and others
- Completed an initial survey of available large-scale system methods and an initial set of analytic research needs (report in preparation)
- Progress on an approach to analyzing the appropriateness and scalability of analytic methods (report in preparation)
- Progress on GIG analytic gaps
14. General Comments on 2006 Findings
- Need methods to assess the relative strengths of specific modeling, computational, and mathematical methods
- Basic work needed on how to characterize complex systems (e.g., better descriptive and formal languages)
- Basic work needed on defining scalability and discovering which properties of networks can be guaranteed as one scales up or down
- Some methods may not work when there are insufficient population effects (e.g., statistics) or when there are finite numbers of elements (e.g., ensemble methods)
- Essential to the evaluation of testbeds
- Develop methods to design networks to be analyzable and scalable and to protect desirable properties
15. Some 2006 Key Analytic Issues
- Multi-criteria optimization
- The impact of single events and cascades in networks
- Can one determine the likely speed and spread of different types of cascades? (a simple cascade sketch follows this list)
- Methods to accumulate the risks of rare events
- Also related: statistics of extreme values
- Measuring and controlling emergence
- Methods to handle the accumulation of weak effects
- Methods to support different large-scale system strategies, e.g., aggregation, abstraction, partitioning (particularly better methods to map among the levels of multi-resolution systems)
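One standard way to study cascade speed and spread is Monte Carlo simulation of an independent-cascade process on a graph. The sketch below is a generic illustration under assumed parameters (a random G(n, p) graph and a fixed transmission probability), not a model of any specific GIG network.

    # A minimal independent-cascade simulation on a random graph, estimating the
    # speed (rounds) and spread (fraction reached) of a cascade. The graph model
    # and probabilities are illustrative assumptions.
    import random

    def random_graph(n: int, p: float) -> dict[int, list[int]]:
        """Erdos-Renyi G(n, p) as an adjacency list."""
        adj = {i: [] for i in range(n)}
        for i in range(n):
            for j in range(i + 1, n):
                if random.random() < p:
                    adj[i].append(j)
                    adj[j].append(i)
        return adj

    def cascade(adj: dict[int, list[int]], seed: int, transmit_p: float) -> tuple[int, float]:
        """Run one independent cascade; return (rounds, fraction of nodes reached)."""
        active, frontier, rounds = {seed}, {seed}, 0
        while frontier:
            next_frontier = set()
            for node in frontier:
                for nbr in adj[node]:
                    # Each newly active node gets one chance to activate each neighbor.
                    if nbr not in active and random.random() < transmit_p:
                        next_frontier.add(nbr)
            active |= next_frontier
            frontier = next_frontier
            rounds += 1
        return rounds, len(active) / len(adj)

    if __name__ == "__main__":
        random.seed(1)
        g = random_graph(n=500, p=0.02)
        trials = [cascade(g, seed=0, transmit_p=0.15) for _ in range(200)]
        avg_rounds = sum(r for r, _ in trials) / len(trials)
        avg_spread = sum(s for _, s in trials) / len(trials)
        print(f"avg rounds: {avg_rounds:.1f}, avg spread: {avg_spread:.2%}")

Repeating the trials while varying the transmission probability or topology gives empirical distributions of speed and spread for different cascade types.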
16. More 2006 Key Analytic Issues
- Characterizing solution spaces for quick responses later
- Methods to decide whether a sub-graph is characteristic of a much larger graph (related questions for projections and testbeds; a simple degree-distribution check is sketched below)
- Developing mathematics for evaluating partial or intermediate results
- Related NSF DDDAS goal: the creation of new mathematical algorithms with stable and robust convergence properties under perturbations induced by dynamic data inputs
- Related questions on whether adding time or more computation results in a better solution
- Combining results from heterogeneous methods
- Related issue of integrating across heterogeneous nets
- More detail in coming reports
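A crude first check of whether a sampled sub-graph is "characteristic" of a larger graph is to compare their degree distributions. The sketch below does this with a two-sample Kolmogorov-Smirnov distance; the graph model, sample size, and choice of statistic are illustrative assumptions.

    # Compare the degree distribution of a uniformly sampled induced sub-graph
    # against the full graph, using the max gap between their empirical CDFs.
    import random

    def random_graph(n: int, p: float) -> dict[int, set[int]]:
        adj = {i: set() for i in range(n)}
        for i in range(n):
            for j in range(i + 1, n):
                if random.random() < p:
                    adj[i].add(j)
                    adj[j].add(i)
        return adj

    def induced_subgraph(adj: dict[int, set[int]], nodes) -> dict[int, set[int]]:
        keep = set(nodes)
        return {v: adj[v] & keep for v in keep}

    def ks_statistic(a: list[int], b: list[int]) -> float:
        """Max distance between the two empirical CDFs."""
        def cdf(data, x):
            return sum(d <= x for d in data) / len(data)
        return max(abs(cdf(a, x) - cdf(b, x)) for x in sorted(set(a) | set(b)))

    if __name__ == "__main__":
        random.seed(0)
        g = random_graph(n=1000, p=0.01)
        sample = induced_subgraph(g, random.sample(list(g), 100))
        full_deg = [len(nbrs) for nbrs in g.values()]
        sub_deg = [len(nbrs) for nbrs in sample.values()]
        print(f"KS distance between degree distributions: {ks_statistic(full_deg, sub_deg):.3f}")

With a uniform 10% node sample of a sparse random graph, the induced degrees shrink roughly in proportion to the sample, so the distance comes out large; that scaling distortion is exactly the kind of problem the bullet above raises for projections and testbeds.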
17. Follow-up Studies on Specific GIG Needs
- Beyond inspiration to trusted results: a disquieting lack of confidence in models and analysis
- Apply rigorous mathematical analysis and modeling practice to any specific study
- Understand what the impact of assumptions is, and actually test those assumptions and impacts
- Give GIG decision-makers immediate help and actionable results
- Screen new mathematical developments for application
- Choose specific topics that also address some of the major analytic gap themes
- Inspire the math community with a set of driving problems
- To use the best of the current art and to shape research
- Cleaned up and filtered as necessary
18. GIG General Analytic Gaps
- In September 06 we identified the following GIG analytic gaps
- We will consider these general problems for each of the specific GIG study topics described next
- (1) Analyze the robustness and the coverage of scenarios used for design and analysis of the GIG
- (2) Evaluate the implications of the assumptions about military vs. commercial traffic and networks
- The proportion of embedded devices vs. human users
- The randomness of transactions
- The differences between usage during peacetime and military operations
- (3) Design the GIG to eliminate or lessen the impact of anomalous flows and cascades
- (4) Test the scalability of both GIG testbeds and models
19. Specific Topics for Mission Assurance and the Enterprise
- Three focus areas
- Dealing with the Tactical Edge
- Spectrum management for thousands of frequencies
- Preventing network storms and providing stable cores
- Key challenges in focus areas
- The dynamic environment
- Scaling up/down from testbeds to reality
- Single points of failure and cascading failures
20. Dynamic Challenges
- What is the impact of dynamic environments on algorithmic complexity?
- Tactical Edge: MANET formation protocols
- Spectrum management: de-confliction for 3,000 frequencies (a toy assignment sketch follows this list)
- Netstorms/stable cores: changing links causing routing flaps
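Frequency de-confliction can be viewed as a graph-coloring problem over an interference graph. The sketch below is a toy greedy heuristic under assumed parameters (random interference graph, highest-degree-first ordering); it is not a fielded spectrum-management algorithm, and real de-confliction must also handle the dynamics the bullets above describe.

    # A toy greedy frequency-assignment (graph-coloring) sketch. Transmitter
    # counts, the interference model, and the heuristic are illustrative assumptions.
    import random

    def interference_graph(n: int, p: float) -> dict[int, set[int]]:
        """Random interference graph: an edge means two transmitters conflict."""
        adj = {i: set() for i in range(n)}
        for i in range(n):
            for j in range(i + 1, n):
                if random.random() < p:
                    adj[i].add(j)
                    adj[j].add(i)
        return adj

    def assign_frequencies(adj: dict[int, set[int]]) -> dict[int, int]:
        """Greedy coloring: highest-degree transmitters first, lowest free channel."""
        order = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
        channel: dict[int, int] = {}
        for v in order:
            used = {channel[u] for u in adj[v] if u in channel}
            c = 0
            while c in used:
                c += 1
            channel[v] = c
        return channel

    if __name__ == "__main__":
        random.seed(0)
        g = interference_graph(n=3000, p=0.002)
        plan = assign_frequencies(g)
        print(f"transmitters: {len(plan)}, channels used: {max(plan.values()) + 1}")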