T-76.4115 Iteration Demo
1
T-76.4115 Iteration Demo
  • OSLC 2.0
  • I1 Iteration, 12.11.2006

2
Agenda
  • Project status (10 min)
  • achieving the goals of the iteration
  • project metrics
  • Used work practices (5 min)
  • Work results (30 min)
  • presenting the iterations results
  • demo
  • For general information on the project → see the
    Project Planning Phase Progress Review slides

3
Introduction to I1
  • 26th Oct – 12th Dec → 6 weeks
  • Three sub-iterations
  • Sub-iteration 1: 26.10 – 11.11
  • Sub-iteration 2: 13.11 – 25.11
  • Sub-iteration 3: 26.11 – 8.12
  • The detailed iteration plan can be found in the
    sub-iteration plans

Note: the sub-iterations are additional work on top
of the course requirements. The main idea is to
visualize the work results and get constant
feedback from the customer!
4
Status of the iteration's goals
  • Goal 1: implement and carry out the
    sub-iteration development work
  • OK
  • Goal 2: deliver the promised features to the
    customer
  • OK
  • Goal 3: architecture design complete and
    approved by the customer
  • OK
  • Trust building with the customer and mentor
  • OK
  • Visualize intermediate development results to
    the customer and get early feedback
  • OK (3 sub-iterations with specific deliveries to
    the customer)
  • Finish all deliveries with high quality
  • OK
  • Achieve a good iteration result
  • OK (the development result is acknowledged by
    both the customer and the mentor)

5
Status of the iteration's deliverables
  • Project plan
  • OK, except chapter 6.4 (iteration plan for the
    next phase)
  • Requirements document
  • OK, all important requirements documented in
    detail
  • Use cases 1, 2, 3, 5, 6, 7, 8 implemented and
    tested
  • OK
  • Functional requirements F1, F2, F5.1, F6.1, F6.2,
    F7, F8.1, F8.2, F9, F10, F11, F12, F14, F15, F16,
    F17, F19, F20 are implemented (18 out of 27
    requirements in total)
  • OK
  • General technical specification
  • OK (approved by the customer)
  • Test charter
  • OK
  • Test results and log
  • OK
  • Progress report
  • OK
  • OSLC 2.0 source and build for Implementation 1
  • OK

6
Realization of the tasks
  • Show status and effort distribution per task or
    task group
  • (not at too detailed a level)
  • discuss major discrepancies between the plan and
    the realization
  • unfinished/removed tasks?
  • any unplanned tasks?

7
Resource usage
Original plan (at the beginning of the iteration)
vs. realization and updated plan
(realized hours and updates)
  • We decided to implement most of the features
    this year (2/3 of all features implemented).
  • Yuan: lots of time was spent updating the
    documentation according to the mentor's comments.
  • Veli-Jussi: GUI design took more time than
    expected, as implementation with Java Swing is
    more complicated than other approaches.
  • Jussi was traveling and not available for more
    than 2 weeks.
  • Mika: the Java source file parsing is
    complicated; the original plan was not realistic
    for the amount of implementation work.

8
Defects and other quality metrics
  • Description of blocker and critical bugs found
    and open
  • No open blocker bugs
  • No open critical bugs
  • One open minor bug: the References checkbox
    becomes out-of-sync
  • Other QA metrics
  • Test coverage 100 % (test cases cover all the
    proposed requirements that can be tested)
  • Development progress 64 % (22 test cases designed
    for the whole system; 14 test cases carried out
    by the end of Iteration 1)
  • Current quality based on test cases 100 % (no
    failed test cases before delivery)
  • Defect resolution 97 % (33 bugs found, out of
    which 32 were resolved by 10.12.2006)
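The percentages above follow directly from the raw counts on the slide; a minimal sketch of the arithmetic (the class and method names are illustrative, not project code):

```java
// Illustrative recomputation of the slide's QA percentages.
// QualityMetrics and percent() are assumed names for this sketch.
public class QualityMetrics {

    // Percentage of `part` out of `whole`, as reported on the slide.
    static double percent(int part, int whole) {
        return 100.0 * part / whole;
    }

    public static void main(String[] args) {
        // Development progress: 14 of 22 designed test cases carried out.
        System.out.printf("Development progress: %.0f %%%n", percent(14, 22)); // 64 %
        // Defect resolution: 32 of 33 reported bugs resolved.
        System.out.printf("Defect resolution: %.0f %%%n", percent(32, 33));    // 97 %
    }
}
```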

9
Quality dashboard
  • Evaluate the quality of the parts of the system
  • QA practices
  • Pair code review
  • Continuous refactoring
  • Documentation review
  • Heuristic review
  • Testing
  • JUnit testing: 6 critical modules unit tested
  • Test case based testing
  • Exploratory testing
  • Performance testing
  • Confidence in quality is based on testing and
    test results
  • Defect management: Bugzilla; bug reporting
    systemized

10
Quality goals
  • Evaluation of the status of the project's quality
    goals

11
Software size in Lines of Code (LOC)
  • No source code reuse from last year's project
  • lots of new functionality was implemented
  • Good commenting habits in general
  • Unit testing done for integration

12
Changes to the project
  • End-user business review → to get early customer
    feedback and increase confidence in the final
    delivery
  • Note: this is additional work on top of the
    course requirements
  • Lauri took over the integration development →
    this is something we did not think through
    thoroughly enough
  • A few requirements were moved to next year's
    development
  • CVS integration → next year
  • special matching feature → next year
  • A few new requirements
  • Calculation of the match fingerprint → next year
  • Use keyword matching to narrow down the
    license-matching possibilities → next year
  • Consideration for the upcoming GPL 3 license

13
Changes to the architecture from initial draft
Architecture has mostly been stable; a few
changes have been made
  • Events to the GUI
  • generated when processing of the package starts
    or stops
  • also when a single file has been processed
  • used to implement the progress bar
  • Iterator for file packages
  • speeds up opening compressed packages (no need to
    unpack to a temporary directory first)
  • currently slows down opening single files
  • an improvement has been proposed
  • Matching
  • exact and partial matching are now implemented in
    the same algorithm
  • if the match-% is 100 % → exact match
  • caching of internal data structures to improve
    performance
  • Small additions
  • some extra methods and variables in other module
    interfaces
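The start/stop/file-processed events described above could be modeled along these lines; a minimal sketch assuming a plain listener interface (all names here are hypothetical, not the project's actual API):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical listener interface for the package-processing events
// the slide describes; the GUI's progress bar would subscribe to it.
interface PackageProcessingListener {
    void processingStarted(int totalFiles);
    void fileProcessed(String fileName, int filesDone);
    void processingStopped();
}

class PackageProcessor {
    private final List<PackageProcessingListener> listeners = new ArrayList<>();

    void addListener(PackageProcessingListener l) {
        listeners.add(l);
    }

    void process(List<String> files) {
        for (PackageProcessingListener l : listeners) {
            l.processingStarted(files.size());
        }
        int done = 0;
        for (String file : files) {
            // ... license detection on `file` would happen here ...
            done++;
            for (PackageProcessingListener l : listeners) {
                l.fileProcessed(file, done); // drives the progress bar
            }
        }
        for (PackageProcessingListener l : listeners) {
            l.processingStopped();
        }
    }
}
```

In a Swing GUI the listener would forward these calls to the progress bar via SwingUtilities.invokeLater, since the processing runs off the event dispatch thread.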
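Streaming a compressed package's entries, rather than unpacking to a temporary directory first, can be sketched with the JDK's ZipInputStream (an illustration of the idea, not the project's actual iterator):

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

// Iterates the entries of a zip package directly from a stream,
// so nothing is written to a temporary directory.
class ZipEntryIterator {
    static List<String> listEntries(InputStream in) throws IOException {
        List<String> names = new ArrayList<>();
        try (ZipInputStream zin = new ZipInputStream(in)) {
            ZipEntry entry;
            while ((entry = zin.getNextEntry()) != null) {
                if (!entry.isDirectory()) {
                    names.add(entry.getName()); // entry data could be read here
                }
                zin.closeEntry();
            }
        }
        return names;
    }
}
```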
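The unified exact/partial matching can be illustrated with a simple token-overlap percentage; the real matching algorithm is certainly more sophisticated, and all names below are made up for the sketch:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Toy match-percentage computation: one algorithm serves both cases,
// and a 100 % score is simply reported as an exact match.
class LicenseMatcher {
    static int matchPercent(String fileText, String licenseText) {
        Set<String> fileTokens = tokens(fileText);
        Set<String> licenseTokens = tokens(licenseText);
        if (licenseTokens.isEmpty()) {
            return 0;
        }
        long hits = licenseTokens.stream().filter(fileTokens::contains).count();
        return (int) Math.round(100.0 * hits / licenseTokens.size());
    }

    static boolean isExactMatch(String fileText, String licenseText) {
        return matchPercent(fileText, licenseText) == 100; // 100 % -> exact
    }

    private static Set<String> tokens(String s) {
        return new HashSet<>(Arrays.asList(s.toLowerCase().split("\\W+")));
    }
}
```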

14
Risks
15
Used work practices
  • Work practices carried out
  • Bugzilla for implementation task management and
    defect management
  • TikiWiki: group documentation, discussion forum,
    communication, task distribution, group calendar
  • Build process → ant
  • Version control → CVS
  • Code review
  • Skype meetings save time and money
  • Weekly time tracking
  • Weekly internal meeting for task planning, status
    reporting and risk management
  • meeting agendas and minutes
  • Agile process
  • two-week iterations
  • a tested build delivered to the customer every
    2 weeks
  • Unit testing → provides the first level of
    quality confidence
  • Get early feedback from the customer
  • Our overall process framework is very good and
    every member follows the process → we will
    continue with this process next year

16
Results of the iteration
  • The major deliverables of the iteration
  • project plan (Implementation 1 Section)
  • requirements (with detailed use cases)
  • General system architecture
  • QA plan
  • Test session charter
  • Test Cases, test result, and test log
  • QA report
  • Progress report
  • Working application and source code
  • Software Demonstration
  • Demonstration brief
  • distributing a demo script to the audience helps
    them follow the demo
  • GUI Demo
  • Command line demo

17
DEMO SCRIPT FOR OSLC GUI
  • Use the Azureus package as an example
  • Open the package, demo progress indicator.
  • While it unpacks, tell something about the UI
    design decisions
  • CLI for power users
  • GUI for everyone (browser paradigm, tree for
    navigation, keep it uncluttered, overview for
    global results, tree/preview panel for details)
  • Show file contents, click ChangeLog.txt, point
    out "not a source file"
  • Toggle overview, toggle word wrap
  • Demo scalability, drag the splitter around
  • Explain the overview panel
  • Explain the tree view
  • Icons with red indicators → conflicts
  • Surf to EntityHandler
  • Expand node, show conflicts
  • Double click EntityHandler
  • Point out conflicts
  • Click details
  • Double click AEMonitor, point out the
    "hyperlink"-behavior

18
DEMO SCRIPT FOR OSLC Command Line
  • Handles the same file formats as the GUI version
  • Easy to use
  • oslc2cli <path to files>
  • Configurable output
  • Found licenses
  • License conflicts
  • File dependencies (references)
  • Summary of results
  • Help screen!

19
Thank you very much!
  • Questions or Comments?