CSCE 548 Code Review

1
CSCE 548 Code Review
2
Reading
  • This lecture
    • McGraw, Chapter 4
  • Recommended
    • Best Practices for Peer Code Review,
      http://www.smartbear.com/docs/BestPracticesForPeerCodeReview.pdf
    • Worst Practices in Developing Secure Software,
      http://www.infosecwriters.com/text_resources/pdf/Worst_Practices_in_App_Sec.pdf
  • Next lecture
    • Architectural Risk Analysis (Chapter 5)

3
Application of Touchpoints
  [Figure: McGraw's seven security touchpoints applied across the development
  artifacts (requirements and use cases, architecture and design, test plans,
  code, tests and test results, feedback from the field): 1. Code Review
  (Tools), 2. Risk Analysis, 3. Penetration Testing, 4. Risk-Based Security
  Tests, 5. Abuse Cases, 6. Security Requirements, 7. Security Operations,
  plus External Review.]
4
Code Review (Tool)
  • Artifact: code
  • Implementation bugs
  • Static Analysis tools
  • White Hat activity

5
Software Bugs
  • Ordinary programming bugs
    • The compiler catches the error, the developer corrects it, and
      development continues
  • Security-relevant bugs
    • May lie dormant for years
    • Potentially far more costly than an ordinary programming error
      (see the sketch below)
  • Who should be responsible for a security bug?
    • The software developer?
    • A security expert?
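A concrete, hypothetical illustration of such a security-relevant bug: the C fragment below compiles cleanly and behaves correctly in functional tests with short inputs, so it can lie dormant until an attacker supplies an oversized string. The function name copy_username is invented for this sketch.

    #include <string.h>

    /* Passes every functional test with short names, so neither the
     * compiler nor the test suite complains. */
    void copy_username(char *dst, const char *src)
    {
        char local[32];
        strcpy(local, src);   /* security-relevant bug: no bounds check;
                                 attacker-controlled input longer than 31
                                 bytes overflows the stack buffer */
        strcpy(dst, local);
    }

    /* The bounded variant a reviewer or static analyzer would suggest
     * (assumes dst_len >= 1). */
    void copy_username_checked(char *dst, size_t dst_len, const char *src)
    {
        strncpy(dst, src, dst_len - 1);
        dst[dst_len - 1] = '\0';   /* strncpy may leave dst unterminated */
    }

Either the developer or a security expert has to recognize that the first version is exploitable; the compiler never will.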

6
Manual vs. Automated Code Review
  • Manual code review
    • Tedious, error prone, and exhausting
    • Needs an expert with the mindset of an attacker!
  • Static analysis tools
    • Identify many common coding problems
    • Faster than manual review
    • Need a developer with a basic understanding of security problems
      and of how to fix the ones that are detected

7
Best Practices
  • Peer code review recommendations from SmartBear Software
  • Based on a Cisco code review study
    • Lessons learned from over 6,000 programmers and 100 companies
  • Lightweight code review

8
Best Practices Recommendations 1.
  • Review fewer than 200-400 lines of code at a time
    • Optimizes the number of detected vulnerabilities (70-90%)
  • Aim for an inspection rate of less than 300-500 lines of code per hour
    • Faster is not better!
    • Based on the number of detected vulnerabilities
  • Do not spend more than 60-90 minutes on a review at a time
    • Efficiency drops after about an hour of intense work
  • Make developers annotate their code before the review (a sketch
    follows below)
    • Encourages developers to double-check their work
    • Reduces the number of vulnerabilities in the code
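A minimal sketch of author annotation (my own illustration, not taken from the SmartBear paper): before the review, the author attaches short notes, written here as ordinary C comments, telling reviewers what changed and where to focus. The function parse_port and its history are invented for the example.

    #include <stdlib.h>
    #include <errno.h>

    /* REVIEW NOTE (author): replaces the old atoi() call that silently
     * accepted garbage. Please focus on the error handling for
     * non-numeric and out-of-range input. */

    /* Returns a port number in [1, 65535], or -1 on invalid input. */
    int parse_port(const char *s)
    {
        char *end = NULL;
        errno = 0;
        long v = strtol(s, &end, 10);
        if (errno != 0 || end == s || *end != '\0' || v < 1 || v > 65535)
            return -1;            /* reject rather than guess */
        return (int)v;
    }

Writing the notes forces the author to re-read the change, which is exactly the double-checking effect the recommendation is after.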

9
Best Practices Recommendations 2.
  • Establish quantifiable goals for code review
    • External metrics, e.g., a reduced number of support calls
    • Internal metrics, e.g., defect rate (a worked example follows below)
  • Maintain checklists
    • Prevent omission of important security components
  • Verify that defects are actually fixed
    • Needs good collaborative review software
  • Managers must support code review
    • Supports team building and acceptance of the process
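A small worked example of those internal metrics (the numbers are hypothetical; the formulas are the standard ones: defects per thousand lines of code reviewed, and lines of code inspected per hour):

    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical figures for one review session */
        double loc_reviewed   = 350.0;  /* within the 200-400 LOC guideline */
        double review_minutes = 75.0;   /* within the 60-90 minute guideline */
        double defects_found  = 9.0;

        double inspection_rate = loc_reviewed / (review_minutes / 60.0);  /* LOC/hour */
        double defect_density  = defects_found / (loc_reviewed / 1000.0); /* defects/KLOC */

        printf("inspection rate: %.0f LOC/hour\n", inspection_rate);   /* 280 */
        printf("defect density : %.1f defects/KLOC\n", defect_density); /* ~25.7 */
        return 0;
    }

Tracking these numbers across reviews gives the quantifiable goals the recommendation asks for, without tying the metrics to individual developers.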

10
Best Practices Recommendations 3.
  • Beware of the "Big Brother" effect
    • How the metrics are used; the role of the manager
  • The "Ego effect"
    • Use code review to encourage good coding habits in developers
    • Review at least 20-33% of the code
  • Lightweight style of review
    • Tool assisted
    • Just as effective as formal, heavyweight review, but requires only
      about 1/5 of the time

11
Worst Practices
  • Assuming that only important software needs to be secure
  • Emphasizing making the deadlines instead of writing good code
  • Having IT make all risk management decisions
  • Not considering security during the entire SDLC
  • Assuming the software won't be attacked
  • Not doing any security testing
  • Not planning for failure
  • Counting on security through obscurity
  • Disallowing bad input instead of only allowing good input (a whitelist
    sketch follows below)
  • Shipping software that is not secure by default
  • Implementing cryptographic algorithms from scratch
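To make the "only allow good input" point concrete, here is a minimal whitelist-validation sketch (my own illustration; the username rules are invented): rather than enumerating and rejecting bad characters, accept only an explicitly allowed set and refuse everything else.

    #include <ctype.h>
    #include <string.h>

    /* Whitelist validation: accept 1-31 characters drawn only from
     * [A-Za-z0-9_]. Anything else is rejected, so newly discovered
     * "bad" inputs never need to be anticipated. */
    int valid_username(const char *s)
    {
        size_t len = strlen(s);
        if (len == 0 || len > 31)
            return 0;
        for (size_t i = 0; i < len; i++) {
            unsigned char c = (unsigned char)s[i];
            if (!isalnum(c) && c != '_')
                return 0;
        }
        return 1;
    }

The blacklist version of this check would have to anticipate every dangerous character an attacker might try; the whitelist version does not.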

12
Source Code vs. Binary Code Check
  • What to check: the source code or the binary code?
  • Source code
    • Shows the logic, control flow, and data flow
    • Shows the explicit code lines
    • Fixes can be carried out directly on the source code
  • Compiled (binary) code
    • May need reverse engineering (disassembly, decompilation)
    • Finding a few vulnerabilities is easy; finding all of them is
      difficult
    • Fixes may have to be incorporated as binary modules or external
      filters

13
How Static Analysis Works
  • Looks for a fixed set of patterns or rules in the code
    • Syntactic matches
    • Lexical analysis (a toy scanner follows below)
    • Flow analysis (control flow, call chains, data flow)
  • False negatives: real vulnerabilities the tool fails to report
  • False positives: reported findings that are not real vulnerabilities
  • Sound tool: given a set of assumptions, the static analysis tool
    produces no false negatives
  • Commercial tools are unsound
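A toy sketch of the lexical-analysis end of that spectrum (my own illustration, far simpler than real tools): scan a source file line by line for the names of commonly misused functions and report each hit. Because it matches text only, it yields false positives (e.g., "strcpy" inside a comment, or "gets" inside "fgets") as well as false negatives (bugs involving none of the listed functions), which is why real tools add parsing and flow analysis.

    #include <stdio.h>
    #include <string.h>

    /* A tiny rule table: function name -> reason it is flagged. */
    static const struct { const char *name; const char *warning; } rules[] = {
        { "strcpy",  "unbounded copy; use a bounded alternative" },
        { "gets",    "never safe; use fgets" },       /* also matches "fgets":
                                                         a built-in false positive */
        { "sprintf", "unbounded format; consider snprintf" },
    };

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s file.c\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "r");
        if (!f) {
            perror("fopen");
            return 1;
        }
        char line[1024];
        int lineno = 0;
        while (fgets(line, sizeof line, f)) {
            lineno++;
            for (size_t i = 0; i < sizeof rules / sizeof rules[0]; i++) {
                if (strstr(line, rules[i].name))      /* purely textual match */
                    printf("%s:%d: %s -- %s\n",
                           argv[1], lineno, rules[i].name, rules[i].warning);
            }
        }
        fclose(f);
        return 0;
    }

Run against the buffer-overflow sketch from slide 5, it would flag both strcpy calls.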

14
Static Analysis
  • Identify vulnerable constructs
  • Similar to a compiler: preprocesses the source file and evaluates it
    against known vulnerabilities
  • Scope of analysis (an example follows below)
    • Local: one function at a time
    • Module-level: one class (or compilation unit) at a time;
      incorporates relationships between functions within the module
    • Global: the entire program; all relationships between functions
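A small illustration of why the scope of analysis matters (hypothetical functions, my own example): examined locally, copy_into looks defensible, since it bounds the copy into its temporary buffer; only an analysis that relates callers to callees can see that bad_caller passes a destination that is too small.

    #include <string.h>

    /* Callers are expected to pass a destination of at least 64 bytes. */
    void copy_into(char *dst, const char *src)
    {
        char tmp[64];
        strncpy(tmp, src, sizeof tmp - 1);
        tmp[sizeof tmp - 1] = '\0';
        strcpy(dst, tmp);        /* safe only if the caller kept the contract */
    }

    void ok_caller(const char *src)
    {
        char buf[64];
        copy_into(buf, src);     /* fine: local analysis of copy_into suffices */
    }

    void bad_caller(const char *src)
    {
        char buf[16];
        copy_into(buf, src);     /* overflow: visible only to module- or
                                    global-level (inter-procedural) analysis */
    }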

15
Rule Coverage
  • Taxonomy of coding errors
  • Language-specific rules (e.g., C/C++, Java, etc.)
  • Rules for particular functions or APIs
  • Academia vs. industry

16
Commercial Tools
  • Easy to use, but expert knowledge is still needed
  • Can process large code bases (millions of lines) efficiently
  • Require a competent reviewer to interpret the results
  • Encapsulate knowledge of known vulnerabilities and efficient flow
    analysis
  • Encourage efficient and secure coding

17
Tool Characteristics
  • Be designed for security
  • Support multiple tiers
  • Be extensible
  • Be useful for both security analysts and developers
  • Support the existing development process
  • Make sense to multiple stakeholders

18
Next Class
  • Architectural Risk Analysis (Chapter 5)