1
MSE Presentation 3
  • By
  • Padmaja Havaldar- Graduate Student
  • Under the guidance of
  • Dr. Daniel Andresen Major Advisor
  • Dr. Scott Deloach-Committee Member
  • Dr. William Hankley- Committee Member

2
Introduction
  • Overview
  • Revised Artifacts
  • Testing Evaluation
  • Project Evaluation
  • Problems Encountered
  • Lessons Learned
  • User Manual
  • Conclusion
  • Demonstration

3
Overview
  • Objective
  • To develop a web-based Statistical Analysis Tool
    (SAT) based on alumni statistics information. The
    four types of tests used for analysis are
    regression analysis, correlation analysis,
    hypothesis testing, and the chi-square test.
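As a sketch of the computation behind two of the four analyses, the Pearson correlation coefficient and least-squares linear regression can be written in plain Java. The class and method names here are illustrative assumptions, not the tool's actual EJB API:

```java
// Illustrative sketch (not the tool's actual API) of Pearson correlation
// and simple least-squares regression over two equal-length samples.
public class AlumniStats {

    // Pearson correlation coefficient r of samples x and y.
    public static double correlation(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i]; sy += y[i];
            sxx += x[i] * x[i]; syy += y[i] * y[i];
            sxy += x[i] * y[i];
        }
        double cov = n * sxy - sx * sy;   // n^2 * covariance
        double vx = n * sxx - sx * sx;    // n^2 * variance of x
        double vy = n * syy - sy * sy;    // n^2 * variance of y
        return cov / Math.sqrt(vx * vy);
    }

    // Least-squares fit y ~ slope * x + intercept; returns {slope, intercept}.
    public static double[] regression(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i]; sy += y[i];
            sxx += x[i] * x[i]; sxy += x[i] * y[i];
        }
        double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double intercept = (sy - slope * sx) / n;
        return new double[] { slope, intercept };
    }
}
```

Both formulas divide by quantities that vanish for fewer than two distinct data points, which is one reason test cases such as "less than 3 members" matter.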

4
Overview
5
Revised Artifacts
  • Object Model

6
Revised Artifacts
  • Object Model

7
Revised Artifacts
  • Formal Requirement Specification
  • The USE model was refined with the changes
    suggested during presentation 2

8
Components
  • J2EE Application Server
  • Enterprise Java Beans
  • Java Servlets
  • XML
  • HTML
  • Java Applets

9
Component Design
  • Servlet

10
Component Design
  • Entity Bean

11
Component Design
  • Session Beans

12
Testing Evaluation
  • Registration form
  • All the inputs to the fields in the form were
    tested.
  • Functionality of tests
  • Each statistical test was verified for
    functionality using test cases and by comparing
    the tool's output against the same computation in
    Excel.
  • Some of the test cases are listed below
  • Regression test
  • Less than 3 members
  • No MS members
  • No PhD members

13
Testing Evaluation
  • Chi-Square
  • No Citizens
  • No International students
  • No person with a job within 3 months of graduation
  • No person without a job within 3 months of
    graduation
  • Hypothesis test
  • No MS alumni
  • No PhD alumni
  • Correlation
  • No members
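Several of the degenerate cases above (e.g. no citizens, no international students) make one margin of the chi-square contingency table zero, which is exactly where a naive implementation divides by zero. A minimal plain-Java sketch of the Pearson chi-square statistic, with illustrative names:

```java
// Illustrative sketch: Pearson chi-square statistic for an r x c table
// of observed counts, with expected counts from the row/column margins.
public class ChiSquare {
    public static double statistic(long[][] observed) {
        int rows = observed.length, cols = observed[0].length;
        long[] rowSum = new long[rows];
        long[] colSum = new long[cols];
        long total = 0;
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++) {
                rowSum[i] += observed[i][j];
                colSum[j] += observed[i][j];
                total += observed[i][j];
            }
        double chi2 = 0;
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++) {
                // A zero row or column margin (e.g. "No Citizens") makes
                // expected == 0 here and the statistic undefined.
                double expected = (double) rowSum[i] * colSum[j] / total;
                double d = observed[i][j] - expected;
                chi2 += d * d / expected;
            }
        return chi2;
    }
}
```

For a perfectly independent table the statistic is 0; the further the observed counts drift from the margin-derived expectations, the larger it grows.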

14
Testing Evaluation
  • Testing using JMeter
  • A stress/performance test was conducted with
    JMeter, varying the number of simultaneous users
    accessing the site.
  • The JMeter results were plotted as graphs.
  • Throughput depends on many factors, such as
    network bandwidth, network congestion, and the
    amount of data transferred.
  • The deviation is the spread of the response
    times; it should be as small as possible for best
    results.
  • The average is the mean time required to access
    the questions page.

15
Testing Evaluation
  • The values seem high because the data is passed
    to the bean and many calculations are performed
    on it
  • The servlet uses the results to display graphs
    as applets, along with some tabular
    representations

16
Testing Evaluation
  • After careful consideration, it seemed close to
    impossible to have more than 30 simultaneous
    users with no lag between them, so the tests
    were run with 15, 30 and 45 users
  • The response times are higher than those of
    plain-text web sites; performance is best with
    few simultaneous users and deteriorates as the
    number of users grows.

              10 users/sec (optimal)   30 users/sec (average)   45 users/sec (worst)
  Deviation   248 ms                   559 ms                   1542 ms
  Throughput  755/min                  981/min                  824/min
  Average     709 ms                   1619 ms                  2998 ms
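The three metrics in the table can be recomputed from raw per-request response times. A minimal plain-Java sketch, assuming "Deviation" is a population standard deviation (the names below are illustrative):

```java
// Illustrative sketch of the three load-test metrics reported above.
public class LoadMetrics {

    // Mean response time in milliseconds ("Average" column).
    public static double average(double[] timesMs) {
        double sum = 0;
        for (double t : timesMs) sum += t;
        return sum / timesMs.length;
    }

    // Population standard deviation of response times ("Deviation" column).
    public static double deviation(double[] timesMs) {
        double mean = average(timesMs);
        double ss = 0;
        for (double t : timesMs) ss += (t - mean) * (t - mean);
        return Math.sqrt(ss / timesMs.length);
    }

    // Requests per minute for n requests served in elapsedMs ("Throughput").
    public static double throughputPerMinute(int n, double elapsedMs) {
        return n / (elapsedMs / 60000.0);
    }
}
```

For example, 755 requests completed in one minute of wall-clock time is a throughput of 755/min, matching the optimal column above.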
17
Testing Evaluation
  • Testing using Microsoft Application Center Test
  • Test type: Dynamic
  • Test duration: 00:00:05:00
  • Test iterations: 227
  • Total number of requests: 4,093
  • Total number of connections: 4,093
  • Average requests per second: 13.64
  • Average time to last byte (msecs): 72.39
  • Number of unique requests made in test: 12
  • Number of unique response codes: 2
  • Error counts
  • HTTP: 0
  • DNS: 0
  • Socket: 0
  • Average bandwidth (bytes/sec): 134,048.33
  • Number of bytes sent (bytes): 1,434,357
  • Number of bytes received (bytes): 38,780,141
  • Average rate of sent bytes (bytes/sec): 4,781.19

18
Testing Evaluation
  • Scalability
  • Database
  • The Oracle database is highly scalable. The
    number of users stored does not affect database
    performance, since the database holds only a
    single table of users, which is queried to
    retrieve them.
  • Application
  • Tests with 200 simultaneous users also produced
    reasonable results:
  • Average time for each user to access the
    questions page: 5 seconds
  • Deviation: 2 seconds
  • Portability
  • Since J2EE is built on the Java platform, the
    application can be deployed across many
    enterprise platforms.
  • Robustness
  • With client-side scripting and error checking in
    the middle tier, the application is largely
    robust against invalid data.
  • The application went through many iterations of
    unit testing, culminating in a robust
    application.
  • The worst-case tests with JMeter also produced
    reasonable results, demonstrating that the
    application is highly robust.
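The client-side and middle-tier error checking mentioned above can be sketched as server-side validators. The field names and value ranges below are illustrative assumptions, not the actual registration form's:

```java
// Illustrative sketch of middle-tier input validation for a registration
// form; field names and accepted ranges are assumptions, not the real form's.
public class RegistrationValidator {

    // Accept a GPA only if it parses as a number in [0.0, 4.0].
    public static boolean isValidGpa(String raw) {
        try {
            double gpa = Double.parseDouble(raw.trim());
            return gpa >= 0.0 && gpa <= 4.0;
        } catch (NumberFormatException e) {
            return false;  // reject non-numeric input instead of propagating it
        }
    }

    // Accept a graduation year only within a plausible range.
    public static boolean isValidGradYear(String raw) {
        try {
            int year = Integer.parseInt(raw.trim());
            return year >= 1950 && year <= 2005;
        } catch (NumberFormatException e) {
            return false;
        }
    }
}
```

Duplicating the client-side checks in the middle tier is what keeps invalid data out even when a browser bypasses the scripting.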

19
Formal Technical Inspection
  • Inspection of the SRS was conducted by
  • Laksmikanth Ghanti and
  • Divyajyana Nanjaiah
  • The inspection results indicated that the SRS
    was 99% satisfactory. The minor issues were
    corrected by adding a section on anticipated
    future changes in version 2.0 of the SRS and by
    making provision for additional error messages
  • Results

20
User Manual
  • An installation guide and a detailed walkthrough
    of the project are provided in the user manual
  • User manual

21
Project Evaluation
  • Project Duration

              Start Time   Expected Finish   Actual Finish
  Phase I     03/15/03     06/30/03
  Phase II    07/01/03     07/28/03          10/27/03
  Phase III   10/28/03     11/27/03          12/09/03
22
Project Evaluation
23
Project Evaluation
24
Project Evaluation
25
Project Evaluation
  • Lines of Code
  • Estimate in first phase: 4,636
  • Actual lines of code:
  • Entity Java Beans: 1,869
  • Servlet: 1,040
  • XML: 120
  • Total: 3,029 lines of code

26
Problems Encountered
  • Learning curve
  • J2EE and Deploy tool
  • Does not update files automatically
  • Not best suited for unit testing or development
    practices
  • EJB packaging errors
  • Alumni data

27
Lessons Learned
  • Methodology
  • Usefulness of methodologies
  • Reviews
  • The feedback during reviews was very helpful
  • Technology
  • J2EE architecture and deploy tool

28
Conclusion
  • SAT was implemented using the J2EE architecture
  • JMeter and Microsoft ACT were used to stress
    test the application, and its performance was
    found to be satisfactory
  • The SAT is extensible

29
Demonstration